Compiling folder using spark-cli cloud compile

Hi,

I’ve just started using spark-cli to flash my cores (and compile my projects) instead of using the web IDE. I’d like to compile a folder though, as I want to include files, but it seems that requires me to use a makefile.
Is it possible to compile a folder with an ino and one or more .h/.cpp files (and potentially some random other files) without a makefile (as the web IDE allows)?
Or how should I get it to compile? I do know a tiny bit about makefiles etc., but not enough to get this to work efficiently.

Thanks!

Alexander

Hi @alexanderweiss,

Yup! If you’re using the CLI, you can compile a whole folder with spark cloud compile folderName or spark cloud flash core-id folderName. You can include any number of .h/.cpp files and one .ino file, and it should compile and download the binary, or flash it to your core.
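For anyone following along, a minimal sketch of the folder layout this expects (the name lights and the file names are just examples):

```shell
# Example project layout for a folder compile: one .ino plus any
# supporting .h/.cpp files, all in a single directory.
mkdir -p lights
touch lights/lights.ino lights/vars.h lights/helpers.cpp
ls lights
# Then, from the parent directory, one of:
#   spark cloud compile lights
#   spark cloud flash <core-id> lights
```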

Thanks!
David

Hmmm. Am I doing something wrong then @Dave ?
Whenever I try a folder I get the following error (whereas it does work when I point it at the .ino, as long as I don’t include additional files, that is).

{"ok":false,"errors":[{"ok":false,"output":"Compiler timed out or encountered an error","stdout":"Building core-common-lib\nmake[1]: Nothing to be done for `all'.\n\nBuilding core-communication-lib\nmake[1]: Nothing to be done for `all'.\n\n","errors":["build didn't produce binary Error: Command failed: make: *** No rule to make target `../**id**/.o', needed by `**id**.elf'.  Stop.\n","make: *** No rule to make target `.o', needed by `**id**.elf'.  Stop.\n"]}]}

(I just removed the id; it’s probably not an issue, but just to be safe)

Hi @alexanderweiss,

Hmm… id should have been a filename. What directory are you running the command in, and what files do you have in that directory?

Thanks,
David

Hi @Dave,

It’s just a directory in my home folder with:
Cakefile, lights.ino, vars.h (I tried without the Cakefile, but the error remains the same, apart from the folder name obviously).

Would you like the folder name in (one of) the errors? It’s got nothing to do with my local setup, right?

Thanks for the help.

Dave, I just tried a folder compile with no build.mk in the folder and got an error indicating a missing file, even though the file is there. And the reference to DHT22 files makes no sense.

Errors
In file included from ../inc/spark_wiring.h:30:0,
                 from ../inc/application.h:31,
                 from DHT22.h:1,
                 from DHT22.cpp:1:
../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning
 "Defaulting to Release Build" [-Wcpp]
File.cpp:15:16: fatal error: SD.h: No such file or directory
compilation terminated.
make: *** [File.o] Error 1

Compile failed -  compile failed

Hmm, just tested this again on my end; build.mk is not necessary. Are you guys using the latest CLI? npm update -g spark-cli

The error you provided is saying it got some file it doesn’t know what to do with, so I’m guessing maybe remove the Cakefile from that folder? Also, the builds are case-sensitive, so make sure your filenames and includes match their case.

I think I got it yesterday, but I’ll try updating (once my internet works again). I did already try without the Cakefile, resulting in the same error.

Hi @peekay123,

Can you confirm you’re running the latest version of the CLI, and that your include line is #include "SD.h" and not #include <SD.h>?

Edit: looked back at your private message and see you actually have File.cpp in there… :slight_smile: Nevermind about that part.

Thanks!
David

Dave - updated to the latest before the compile. Did an update again with the same results. I’ll check the include syntax and try again.

UPDATE: I fixed the include references and the related errors seem to have disappeared, but now I get an error saying spark_disable_wlan.h is not a valid file. So I guess the cloud compiler does not support this include yet.


Indeed, all this cool stuff hasn’t been merged into compile-server2 yet, along with other things like fixes for Spark.publish(), but we will know once the :spark: team has done so :smile:

Ah… I found the problem. It also doesn’t know what to do with OS X’s .DS_Store file, and doesn’t ignore it either. So that’s why I get the error even when I remove the Cakefile.
If I remove both the Cakefile and .DS_Store, it does work (until .DS_Store reappears).
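Until the CLI ignores it, a quick workaround (a sketch, not a spark-cli feature) is to delete the Finder metadata from the project tree right before compiling:

```shell
# Delete macOS Finder metadata files from the current project tree so
# the cloud compiler never sees them. Run from the project folder.
find . -name '.DS_Store' -type f -delete
```

Finder recreates .DS_Store whenever it opens the folder, so this needs re-running before each compile.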

@Dave maybe .DS_Store needs to be in spark-cli’s ignore list, the way a .gitignore works?

Ah, yup! oh silly files. :slight_smile:

I can ignore targeted types of files (a blacklist), or I can only include select sets of files (a whitelist). I’m tempted to only include .h/.cpp/.ino files, but then I feel like I’m going to miss some file type people want. Thoughts?

Thanks,
David

A blacklist sounds better, since this doesn’t happen that often and we need those .c files as well.

Hmm, let’s try that for now, since it’s easier to recover from an extra file being included than from a wanted file being excluded. I think I’ll also make it noisily list all the files it’s sending out, so that people can more easily spot stray files being sent.

I have my code under RCS, so there is a sub-directory for that. Then there is a Makefile, for reasons which will become clear below, but also for pre-processing or auto-generation of code: e.g. if I am generating a const C array from an underlying CSV table, the CSV file and the awk program that generates the C are also in the directory. Then there is a doc file saying what must be connected to which pins and how to use the app. I also need to build for different Arduino targets, not only Spark.io, so there are files such as main.cpp.old, xyz.h.french and xyz.h.german, plus dummy make-target files and files holding the spark compile output.

Better, in other words, to be able to exclude all files except those named on the command line. The default action in the Makefile could then be 'spark cloud compile foo.c bar.cpp wiz.h'; that’s what I would prefer. Currently the default action is as follows:

# NB: recipe lines must be indented with a tab
default: foo.flashed

foo.flashed: foo.c bar.cpp wiz.h
	rm -rf build   # spark flash needs a dir with just the code
	mkdir build
	cd build && ln -s ../foo.c ../bar.cpp ../wiz.h .   # cd only applies to this line
	spark cloud flash $(core_id) build | tee foo.out   # save the pages of compile errors
	! grep -qi error foo.out   # fail the recipe if the saved output mentions an error
	touch foo.flashed          # make doesn't continue after a failed command

Save me from that!

And spark cloud flash ... should exit non-zero status upon failure so Makefiles and other *nix utilities can tell!
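In the meantime, the only way to get a status is to scrape the CLI's text output, as the Makefile above does. A minimal stand-alone sketch of that (spark_output.txt is a hypothetical saved transcript; the sample line is taken from the error message earlier in this thread):

```shell
# Derive a pass/fail status from saved CLI output until the tool
# itself returns a non-zero exit code on failure.
printf 'Compiler timed out or encountered an error\n' > spark_output.txt
if grep -qi error spark_output.txt; then
  echo "build failed"
else
  echo "build ok"
fi
```

This is fragile (a sketch that matches on the word "error"), which is exactly why a real exit status from the CLI would be better.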


Hmm… so if you provide more than one filename or directory name, it assumes it’s a list of files?

something like:

spark cloud compile foo.cpp bar.cpp bar.h baz.cpp
spark cloud flash coreid foo.cpp bar.cpp bar.h baz.cpp
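A nice side effect of that syntax: since the shell expands wildcards before spark-cli ever runs, globs would work for free (assuming the multi-file syntax proposed above lands):

```shell
# The shell, not the CLI, expands the globs, so this would become
# a plain file list:
#   spark cloud compile *.ino *.cpp *.h
# Demonstration of the expansion itself, with placeholder files:
touch foo.cpp bar.cpp bar.h baz.cpp
echo *.cpp
```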

In my (old-fashioned and stuck-in-my-ways) view, something that looks like a command-line compile command, analogous to cc, would be great. With the syntax you propose, I need not worry about populating .gitignore (or .gitonly). I think (hope!) others will also find the idea of naming the files really nice. And that is good enough, really, especially coupled with a non-zero exit status on failure.

But there is no end to my requests! Why can’t the files be in different directories? E.g.

spark cloud flash 123123123 ../lib/libtemp.cpp ../include/libtemp.h boiler.ino

or, so that our own includes can use #include <foo.h> rather than #include "foo.h", how about cc’s -I (and maybe -L) options? OK, that’s possibly a step too far. But I would use that feature.