Spark Dev will not compile

I had a problem with Spark Dev compiling once, then doing absolutely nothing when I hit the compile check symbol. It did not provide error messages or compile. It was mute. After digging around and experimenting, I thought I had solved the problem by ensuring that the program file was in its own like-named directory with no other files.

It would have been nice to get error messages, but that requirement was in the docs, so…

After organizing the files/directories I was able to compile (last night). Now, this morning, even with the program file in its own folder, I am back to the same behavior. It compiled once, then will not compile again. It just does nothing when I hit the compile “check” symbol.

Frustrating both that it's happening and that I have no idea why.

Have you checked if your internet connection is up and running?

Since the compile takes place in the cloud, this is crucial for it to work.
If your internet connection was gone and you tried to compile there and then, you might have to shut down and restart Spark Dev.

It has also proved useful in the past for us to have a full screenshot of your Spark Dev, to get visual clues about possible issues just by looking at it.

The internet is up and running. It has to be for me to post :smile: I can also select the Spark device, which is not connected to this computer via USB, and see it pulsing blue, so that encourages me to conclude the internet is up.

Attached is the screenshot. It has a small program that I’m just adding a Spark.function to. It is not done, so it probably has coding errors. Hitting the compile button does not change anything you see in the screenshot. No messages appear. Nothing. Let me know if you see something.

Nope, can't see any reason why it wouldn't work :flushed:

But that's not that much of a proof :wink:

Since I'm in Europe and you wrote of “this morning”, I assumed you posted a while after you experienced the problems - completely ignoring that you might live in a different time zone :wink:

On the other hand, just to make sure - you have already shut down and restarted your Spark Dev?
If you still have got the same issue, you might want to try compiling via CLI, just to confirm that you do have connection to the "compile in cloud" bit of the Spark infrastructure.

Got a message from Spark saying they are releasing an updated Spark Dev today, and to use an older version for now. Spark has confirmed the problem.

Monday, 5/18 at noon. Sorry to report, the update, if it was posted, did not resolve my problem. I get one partial compile (compile in progress but no success message), then after that no activity at all in response to my clicking the compile button. Here is the code. Very short, very simple. I know it has errors; I’m not looking to have the errors reported in a reply. The Dev should flag those, not be mute…

Anyone else want to try compiling this code with TODAY’s version of Particle Dev? If you put this code through Particle Web IDE it will spit a bunch of obvious errors. The point is just that the Dev has some bad behavior in the face of coding errors.

int led0 = D0;  // You'll need to wire an LED to this one to see it blink.
int led7 = D7; // This one is the built-in tiny one to the right of the USB jack
int ain0 = A0; // the light sensor
int ain1 = A1; // The temperature sensor
int A0val = 0;
int A1val = 0;
int tempF = 0;
int lightLevel = 0;
char state = "LOW";


// This routine runs only once upon reset
void setup() {
  // Initialize D0 + D7 pin as output
  // It's important you do this here, inside the setup() function rather than outside it or in the loop function
  Spark.variable("Light", &A0val, INT);
  Spark.variable("Temp", &A1val, INT);
  Spark.function("lightLed0", state);
  pinMode(led0, OUTPUT);
  pinMode(led7, OUTPUT);
  pinMode(ain0, INPUT);
  pinMode(ain1, INPUT);

}

// This routine gets called repeatedly, like once every 5-15 milliseconds.
// Spark firmware interleaves background CPU activity associated with WiFi + Cloud activity with your code.
// Make sure none of your code delays or blocks for too long (like more than 5 seconds), or weird things can happen.
void loop() {
  A0val = analogRead(ain0);
  A1val = analogRead(ain1);
  lightLevel = lightLed0(state)
  digitalWrite(led7, LOW);
  delay(500);               // Wait 500 ms in off mode
  digitalWrite(led7, HIGH);
  delay(500);               // Wait 500 ms in on mode
}

int lightLed0(String, state)
{
  if state == "HIGH" {
  digitalWrite(led7, HIGH);
} else {
  digitalWrite(led7, LOW);
}

  return status_code;
}

Sorry to report that my Spark Dev 0.0.21 on Win 8.1 Pro does build and report errors as expected.

OK, sorry, I should have given the OS and version. I am trying to compile in Particle Dev 0.0.23, running on Mac OS X Yosemite version 10.10.3.

So, if anyone can get the current version of the Particle Dev compiler to produce errors rather than “do nothing” on Mac OS X with the code I posted above, it would be good to know. Either the Dev is not working, or I have some machine-specific issue. I’ve done all the obvious things: completely removing older versions of Dev, not using stored files but creating new files by cutting and pasting the code, and, of course, verifying the web connection by having a browser open and interleaving compiles of the exact same code in the Particle Web IDE.

The suggestion that the new release would cure this problem came from @suda, so perhaps he’ll have a go at compiling that code and report on the outcome :smile:

Hi @Bendrix, when I tried to compile the code you pasted in Particle Dev 0.0.23 on the same OS as yours, I got 12 errors: https://www.dropbox.com/s/27ld45jyay89x0x/Screenshot%202015-05-19%2011.11.21.png?dl=0

You could try logging out and in again. If it doesn’t help could you share a screenshot of your Particle Dev?

Hey @suda,

Now it gets even weirder. I logged out and logged back in. Loaded the code from above into Spark Dev. Hit compile and got the compiling-in-the-cloud message, then… Success!

What?

I opened Spark Build and copied the code right out of Dev and pasted it. Hit compile and got… Lots of errors.

So now I have a Spark Dev that takes code that won’t compile on Build, and tells me it compiled successfully and created a firmware file. Screenshot attached.

Your SparkDevTest2 file has no extension. Spark Dev will only include .c, .cpp, .h and .ino files. The binary which was the result is probably doing nothing or contains the default Tinker app.

There should be an error that no files were included though. Created a ticket: https://github.com/spark/spark-dev/issues/85
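The filter described above is easy to mimic, so you can check a project before compiling. Here is a plain C++17 sketch (my illustration with made-up names, not Spark Dev's actual code) of the rule: only .c, .cpp, .h and .ino files get sent to the cloud, so an extension-less file like SparkDevTest2 is silently skipped.

```cpp
#include <filesystem>
#include <set>
#include <string>

// Mimic of the described Spark Dev file filter: true if this
// file would be included in a cloud compile. (Illustrative only.)
bool devWouldInclude(const std::filesystem::path& file) {
    static const std::set<std::string> included = {".c", ".cpp", ".h", ".ino"};
    // extension() returns "" for a file with no extension,
    // which is exactly why SparkDevTest2 falls through the filter.
    return included.count(file.extension().string()) > 0;
}
```

For example, `devWouldInclude("blink.ino")` is true, while `devWouldInclude("SparkDevTest2")` is false.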

Added the .c extension and got errors. Arrghh! Was that it all along?

Here is the doc for naming in Particle Build

Create: You can create a new application by clicking the "Create New App" button. Give it a sweet name and press enter! Your app is now saved to your account and ready for editing.

And here is the doc for naming in Particle Dev

"Before compiling your project, make sure your project files are in a dedicated directory. Notes: If other files not related to your project are present in the project directory, you may experience errors when trying to compile."

"All the files have to be on the same level (no subdirectories) like this"

"Compile button: To compile your current project, click on the Compile in the cloud button. If your code doesn't contain errors, you'll see a new file named PLATFORM_firmware_X.bin in your project's directory (where PLATFORM is name of currently selected platform and X is a timestamp)."

Particle Dev docs do not mention that you have to provide an extension to save the file. In Build, you don't; it appears to slap on the .ino extension for you. When users go from one IDE to another and basic behavior changes, it's going to generate confusion.

If Dev will save the file with no extension, and the docs are not going to make it clear that you must include one in the name (an uncommon requirement), then the error you generate should not just say that no files were included.

It should say, "Please add a file extension to this filename. It will not compile without one." And it should say that even if the docs are complete, because that's what the problem is.

:wink:

If you were going to try to make the Spark IDE as easy to use as the Arduino IDE, where many of your future customers may be coming from (and may be using simultaneously), please consider having the save button:

  • Fix the name so there are no spaces, or anything else that trips up
    the compiler

  • Add the extension

  • Create the Folder of the same name

Currently, the File, Save As dialog does not allow you to create a folder on the fly, so assuming you know what you must do, you have to open Finder and create the folder before you can save, or use the create-folder dialog.
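The save-button behavior suggested above could be as simple as this (a hypothetical sketch; the function names and rules are mine, not Particle's): sanitize the project name, append the extension, and derive the like-named folder the file should live in.

```cpp
#include <string>

// Replace spaces (which confuse the compiler) with underscores.
// A real implementation would handle more characters than this.
std::string sanitizeName(std::string name) {
    for (char& c : name) {
        if (c == ' ') c = '_';
    }
    return name;
}

// Full relative path the save dialog could create: Folder/Folder.ino
std::string savePath(const std::string& rawName) {
    std::string clean = sanitizeName(rawName);
    return clean + "/" + clean + ".ino";
}
```

So saving a project named "FPS Blink" would produce `FPS_Blink/FPS_Blink.ino`, satisfying all three bullet points at once.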

And now I am having a new problem. A program that will compile in Build without errors, and flash to the core, and run as expected, will not compile in Dev. I get 13 errors. Here is the code:

int led0 = D0;  // You'll need to wire an LED to this one to see it blink.
int led7 = D7; // This one is the built-in tiny one to the right of the USB jack
int ain0 = A0; // the light sensor
int ain1 = A1; // The temperature sensor
int A0val = 0;
int A1val = 0;
int tempF = 0;
int statusCode = 0;
String arguments;



// This routine runs only once upon reset
void setup() {
   Spark.variable("Light", &A0val, INT);
   Spark.variable("Temp", &A1val, INT);
   Spark.function("lightLed0", lightLed0);
   pinMode(led0, OUTPUT);
   pinMode(led7, OUTPUT);
   pinMode(ain0, INPUT);
   pinMode(ain1, INPUT);

}

// This routine gets called repeatedly, like once every 5-15 milliseconds.
// Spark firmware interleaves background CPU activity associated with WiFi + Cloud activity with your code.
// Make sure none of your code delays or blocks for too long (like more than 5 seconds), or weird things can happen.

void loop() {
   A0val = analogRead(ain0);
   A1val = analogRead(ain1);


   digitalWrite(led7, LOW);
   delay(500);               // Wait 500 ms in off mode
   digitalWrite(led7, HIGH);
   delay(500);               // Wait 500 ms in on mode
}


 int lightLed0(String args) {
     if (args == "0") digitalWrite(led0, LOW);
     if (args == "1") digitalWrite(led0, HIGH);
     int statusCode = 1;
     return statusCode;
 }

I cannot screenshot the compile errors, as the drop-down disappears when I try to use the GrabIt program. I also can’t find a show-error-log option, and if you click Help, Documentation, you get:
404 Not Found

Code: NoSuchKey
Message: The specified key does not exist.
Key: dev/index.html
RequestId: 0E2549B66426372F
HostId: p41CO7fYxFwNrlJVTKOlpxUZGj7Xs5/jwASrFMzPVr/fDhR3eWP97LhDI3Xl5JvV

I took a photo of the first few errors

AND THEN…

On a lark, I changed the extension to .ino rather than .c. Now the program compiles. Does that make sense to @suda? Should that be how Particle Dev works?

If so, I have a few questions.

  • Are there certain libraries, headers, imports, or whatever that are specific to the extension? So, depending on the extension you use, your program may or may not compile?

  • Do Particle, the Cloud IDE, and the various pin names like D0 require or suggest that a particular extension be used in order to compile the program?

  • Same as above, is there a preferred, recommended or required extension to use when creating firmware for the Core or Photon?

There is soooooo much undocumented stuff going on with Dev…

If you’ve called your file *.C or *.CPP, you’d need to add a line of #include "application.h".
If you’ve called it *.INO the preprocessor does this for you - this is also the case in the INO file the Web IDE opens for you by default.
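The forward-declaration half of this can be shown without any Particle headers at all. Below is a plain standard-C++ sketch (my illustration; only the function name is borrowed from the code above) of why the .cpp build fails when a function is called before it is declared, and what the .ino preprocessor quietly adds for you.

```cpp
#include <string>

// In a .c/.cpp file you must write this prototype yourself;
// in a .ino file the preprocessor generates it automatically.
int lightLed0(const std::string& args);

// Stand-in for setup(), which calls the function before its definition.
int callFromSetup() {
    // Without the prototype above, this call would be a compile error in C++.
    return lightLed0("1");
}

int lightLed0(const std::string& args) {
    return args == "1" ? 1 : 0;
}
```

Delete the prototype line and this file no longer compiles as C++, which is the same failure mode as renaming a working .ino sketch to .cpp.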


.ino files are written in Wiring, Arduino’s C++ dialect, and go through a precompiler which makes them Arduino-compatible. To access constants/types like A0 or String in plain .c/.cpp files, you have to #include "application.h" and have your functions declared before they are used. It was announced some time ago, but you’re right, it should be in our documentation.

Edit: Wow, microseconds after @ScruffR :smile:


Hi guys,

Thanks to you both for clearing this up. It's the sort of thing that once you know it, you know it. I’ll stick with .ino since that does some of the overhead for you.

If you fix the doc, the next newbie to come along won’t bother you with questions :blush:


Hi, until now I have been using the Web IDE and I was quite happy with it.
But now I want to use an Arduino library (FPS_GT511C3 modified by @peekay123) for the fingerprint scanner, which is not (yet) in the Web IDE.

My first objective is to try out the communication between the Fingerprint scanner and a Photon.
After that works, I want to try the other sketches…

Therefore, I installed Particle Dev today, trying to follow the (not so clear) instructions as closely as possible:
I have created a program directory called “FPS_Blink” and put all my project files in it:
FPS_Blink.ino, FPS_GT511C3.cpp and FPS_GT511C3.h

Then I opened the “FPS_Blink.ino” sketch and hit the “Compile in the cloud” item in the Particle menu.
After working for about 10 seconds, I get an error:


Uncaught TypeError: Cannot read property ‘replace’ of undefined
/Applications/Particle Dev.app/Contents/Resources/app.asar/node_modules/spark-dev/lib/views/compile-errors-view.js:51
Show Stack Trace
The error was thrown from the spark-dev package. Atom is out of date: 1.0.19 installed; 1.4.0 latest. Upgrading to the latest version may fix this issue.


(Sorry, I can’t post any screenshots yet…)

So, based on the last line, which includes a hyperlink, I download the latest version of ATOM for Mac.
After downloading it, I start that application, and it looks like this is just another generic version of Particle Dev…
No help!

I am looking into the darkness…
Can anyone shine a light on this please?

(I’m not a programmer but a hardware engineer with some programming background…)

OK, I found how to upload images… :wink:
Here is a Particle Dev screenshot showing the message when I try to compile:

Try to open the folder again, but this time open the FPS_Blink folder directly from the menu. Don’t use the tree navigation; open the actual folder.
