File stuck in the compiler?

I feel like I’ve already used up my quota of newbie questions, but here’s another one. I’m using Spark Dev (Atom). I started by trying to compile a simple program (first.cpp) and got some errors (see my other post). I then closed that file, opened a new one (digitalRead.cpp), added some code, saved it, and tried to compile; I got some errors there too. Then I closed that file and re-opened the original (first.cpp), but when I try to compile it, I get errors from the second file (digitalRead.cpp). I even closed and re-launched the editor, opened first.cpp and tried to compile again, and it once again reported errors from the other file (digitalRead.cpp). So it looks like digitalRead.cpp is somehow stuck in the compiler? How can I fix this?

It’d be helpful if you could post those error logs so we know what we’re dealing with. Saying ‘I got some errors’ doesn’t really tell us a lot.
Just a wild guess, but do you have separate directories for your files? I believe you can have only one project per directory, if I’m not mistaken (@dave, @kennethlimcp, could you confirm this?)

Thanks for the reply.

There are two reasons why I have concluded that one of my files was “stuck” in the compiler.

  1. The errors returned when compiling file1 referred to elements that only existed in file2 (e.g. file2 referred to “D0” and “D1” but file1 didn’t, yet the compiler reported those identifiers as undeclared even though file1 was the only file open in the editor).
  2. The cause of all the errors (in both file1 and file2) was a missing #include “application.h”. However, when I tried adding that last night and recompiled, it had no effect, so I conclude that the compiler wasn’t looking at my updated file (a sketch of the fixed file is below).
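
Just so it’s concrete, this is the shape of the fix — not my actual file; the pin choices and baud rate are arbitrary examples:

#include "application.h"  // declares D0/D1, Serial, pinMode, etc. for .cpp builds

void setup()
{
    pinMode(D0, INPUT);   // D0 and D1 are only declared once application.h is included
    pinMode(D1, OUTPUT);
    Serial.begin(9600);   // baud rate is just an example
}

void loop()
{
    digitalWrite(D1, digitalRead(D0));  // mirror the input pin onto the output pin
}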

The files and errors aren’t really very important here, but if you’re interested, the contents of both files can be found in this post: http://community.spark.io/t/solved-serial-is-undeclared/8537

Has anyone else seen this behavior of the cloud compiler in Spark Dev? If not, then for now I’ll chalk it up to beginner’s luck and move on…

This just repro’d for me.

I wrote a skeleton program:

#include "application.h"

void setup()
{
    // intentionally empty
}

void loop()
{
    // intentionally empty
}

…and selected “compile in cloud”, and got this:

[screenshot: the cloud-compile error output]

As you can see, the errors are from an “ino” file that has nothing to do with the file that is open in the editor. Surely I can’t be the only one hitting this. Is there a way to work around it?

(btw, feel free to move this to troubleshooting if that’s more appropriate.)

It sounds like it’s including all the selected files, or the current project, in the build. What are your thoughts on this, @suda?

Thanks,
David

Exactly. @gorsat, can you move your code into a separate directory and open it in Spark Dev?

I tried that already. It didn’t help. They’re in different folders, e.g.

c:\projects\Spark\digitalRead\digitalRead.cpp
and
c:\projects\Spark\second\second.cpp

But I just discovered that it depends on which folder Spark Dev has open. Opening Spark Dev directly in the project folder (e.g. c:\projects\Spark\second) works just fine, but opening it one level up (c:\projects\Spark), where both project folders sit side by side, causes the error.

This seems like a bug. I’d like to be able to navigate through my folders in Spark Dev, open a file, and compile it without worrying about it pulling in some random file from a parallel folder. Am I looking at this the wrong way?
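
To make the layout concrete (folder names are just my examples from above), this is the arrangement that works for me:

c:\projects\Spark\digitalRead\digitalRead.cpp   <- open Spark Dev at ...\digitalRead to build this
c:\projects\Spark\second\second.cpp             <- open Spark Dev at ...\second to build this

Opening Spark Dev at c:\projects\Spark (the parent of both) appears to sweep everything underneath it into the same cloud build.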

I had similar issues with the Spark Dev compiler. However, in the end I was able to work around them. Some things that helped me:

  • Restart Spark Dev
  • Hit Save to make sure all the data is saved BEFORE compiling
  • Delete the locally saved compiled firmware file
  • Remove all files that have nothing to do with the project from its folder

Your problem with the subfolders does indeed seem like a bug, but I haven’t yet had the “luck” of running into that one myself…