Best way to setup Spark project (boilerplate)?

I have been using my Spark for about a month now and have gotten some fun things working, but I am running into issues with compiling (via the command line). I have also searched for this as best I can, so sorry if this is a duplicate.

I want to know the best way to set up a Spark project, one that can be compiled with spark compile.

Here is my current setup; I copied the Makefile ideas from another repo.

But now I am trying to include the newer version of the ArduinoJson library, whose structure has changed, and I cannot get things working again. So I am very curious about the actual ideal way to do this, and I cannot find any information about it.

So I would love a boilerplate project for Spark, specifically one that uses multiple libraries, especially libraries written for Arduino that have a more complex structure. An example project or a walkthrough would be great. I feel really in the dark about this.

I come from a web application background, so the make workflow, though I understand the basics, is kind of foreign to me; the more hand-holding the better.

For reference, the Spark example repo is really helpful and explicit but I feel like this is a different need.


Makefiles are not used when compiling in the cloud.

You can simply place the libraries in one project folder, #include them in your project, then compile in the :cloud: :wink:

Thanks for the response @kennethlimcp.

This does not seem to work. I am pretty sure I need to put the paths to things in a spark.include file, otherwise I get a file-not-found error when trying to include a file. But I am not sure what the actual mechanics of the spark.include and #include statements are, or whether I need to include all the library files. Is there documentation about spark.include and spark.ignore?

@zzolo, it’s nothing too hard so let me try to explain simply :wink:

1.) It is most likely a bug in 0.4.6, due to a new change to spark compile; it has been fixed, but the fix has not yet been released.

You can use npm install -g spark-cli@0.4.5 for now to revert one minor version, and use the spark compile dir_folder_name command.

2.) spark.include is not necessarily required, since ALL files in a directory are pushed by default.

3.) Docs for Spark-cli can be found here and should answer some questions of yours :wink:

4.) #include is a pretty standard C programming construct. For example, #include "SD.h" will pull the SD.h file in during compilation of your code.

You can paste the error you get when compiling and I'll see what's wrong with it.

Let me know how it goes :smiley:
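On point 2: if you do want an explicit spark.include, my understanding from the CLI docs is that it simply lists extra file paths to push, one per line. I'm not certain about wildcard support, so a hypothetical one for your layout, listing files explicitly (paths are illustrative), might look like:

```text
lumiere.ino
lib/HttpClient/HttpClient.h
lib/ArduinoJson/include/ArduinoJson.h
lib/ArduinoJson/include/ArduinoJson/JsonArray.hpp
lib/SparkCore-NeoPixel/neopixel.h
```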

Thanks again for the reply and the help. I have downgraded the Spark CLI to 0.4.5. I do know how to use the command line just fine; the problem is that I am not sure of the ideal way to set this up and where I might have errors.

Here is my current code:

For reference, here is the ArduinoJson library:

Here is the error I get when I run spark compile . lumiere.bin:

spark compile . lumiere.bin
attempting to compile firmware 
pushing file: lumiere.ino
pushing file: lib/HttpClient/HttpClient.h
pushing file: lib/ArduinoJson/include/ArduinoJson.h
pushing file: lib/ArduinoJson/include/ArduinoJson/JsonArray.hpp
pushing file: lib/SparkCore-NeoPixel/neopixel.h
make: *** No rule to make target `JsonArray.o', needed by `5330d0d2072706a14331ab1e2d255c28cbe6d3b1a407b869b4fdd32502c4.elf'.  Stop.

Compile failed -  compile failed 

Just a random thought: have you checked whether or not the ArduinoJson library is directly compatible with the Spark? Although Arduino and Spark share a lot of similarities, there might be some device-specific differences which require the library to be ported. Perhaps @peekay123 can take a look at it, since I know he has quite some experience porting libraries(?)

That’s an excellent point, @Moors7, and honestly I don’t actually know. But the previous version of the library worked fine, and it seems like my current problem is getting it included in the first place. I am no expert on this stuff, though.

@Moors7 and @zzolo, the ArduinoJson library should be good out of the box. However, the .hpp files refer to includes in sub-directories, which may not be found by the CLI. You may want to flatten out the directories or list those sub-directories in your spark.include file.
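For anyone unsure what "flattening" means here, this is a hypothetical sketch: recreate a nested include tree, then copy every .hpp up to the project root so a bare #include "JsonArray.hpp" resolves without a subdirectory path.

```shell
# Illustrative only: fake a nested layout like ArduinoJson's, then
# copy the nested headers into the top-level project directory.
mkdir -p lib/ArduinoJson/include/ArduinoJson
touch lib/ArduinoJson/include/ArduinoJson.h
touch lib/ArduinoJson/include/ArduinoJson/JsonArray.hpp
find lib/ArduinoJson/include -name '*.hpp' -exec cp {} . \;
ls JsonArray.hpp   # the header now sits at the top level
```

One caveat: the library's headers also include each other by relative path, so after flattening you may need to edit those internal #include lines to match.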

What does this error mean? Whether I include all the files or flatten them out, I still get this:

make: *** No rule to make target `JsonArray.o', needed by `5330d0d2072706a14331ab1e2d255c28cbe6d3b1a407b869b4fdd32502c4.elf'.  Stop.

I’m guessing the .hpp file might be the culprit.

Let me try to compile it :slight_smile:

Thanks, @kennethlimcp; any luck?

Any updates? I’m getting a similar error with the same library (ArduinoJson). Thanks!

Hey @ubergeek82. No luck on my part. I moved my efforts to using an Arduino Yun instead. I would like to get the Spark to work, but I am stuck as well.

My current work updating the AJSON library is here:

The Spark makefile just does not have a rule for turning a .hpp file into a .o file. If you rename the file to .cpp, it should work.

If you build locally, you can change the makefile easily to include .hpp files.
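For the local-build route, the change would be a pattern rule along these lines. This is an illustrative sketch, not the actual core-firmware makefile (whose variable names differ):

```make
# Illustrative GNU Make pattern rule: compile .hpp files the same way
# .cpp files are compiled. The -x c++ flag is needed because otherwise
# g++ would emit a precompiled header (.gch) instead of an object file.
%.o: %.hpp
	$(CXX) $(CXXFLAGS) -x c++ -c $< -o $@
```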

@bko that worked, thank you!!!

The 30-LED NeoPixel strips finally dropped in price, so I’m back at it. I had this working on a Core (for the most part), but have since moved to a Photon and am running into some problems. I did a simple copy-paste from the original GitHub repo, only to error out at the

#include "JsonParser.h"

Sooo… I included all of the ArduinoJson lib files as tabs in the web IDE, thinking that should cover it. I will humbly admit I’m a little lost, and I’d appreciate any direction/comments/feedback. My latest error output is below.

willing to switch to the CLI…

jsmn.cpp: In function 'jsmntok_t* jsmn_alloc_token(jsmn_parser*, jsmntok_t*, size_t)':
jsmn.cpp:11:25: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
  if (parser->toknext >= num_tokens) {
HttpClient/HttpClient.cpp: In member function 'void HttpClient::request(http_request_t&, http_response_t&, http_header_t*, const char*)':
HttpClient/HttpClient.cpp:176:19: warning: unused variable 'firstRead' [-Wunused-variable]
     unsigned long firstRead = millis();
In file included from SparkJson/./ArduinoJson.h:7:0,
                 from SparkJson/SparkJson.h:13,
                 from SparkJson/SparkJson.cpp:1:
SparkJson/././DynamicJsonBuffer.h: In destructor 'ArduinoJson::DynamicJsonBuffer::~DynamicJsonBuffer()':
SparkJson/././DynamicJsonBuffer.h:20:33: warning: deleting object of polymorphic class type 'ArduinoJson::DynamicJsonBuffer' which has non-virtual destructor might cause undefined behaviour [-Wdelete-non-virtual-dtor]
   ~DynamicJsonBuffer() { delete _next; }
../../../build/target/user/platform-6/libuser.a(JSMNSpark.o): In function `jsmn_init(jsmn_parser*)':
JSMNSpark/JSMNSpark.cpp:147: multiple definition of `jsmn_parse(jsmn_parser*, char const*, jsmntok_t*, unsigned int)'
../../../build/target/user/platform-6/libuser.a(jsmn.o):jsmn.cpp:135: first defined here
../../../build/target/user/platform-6/libuser.a(JSMNSpark.o): In function `jsmn_init(jsmn_parser*)':
JSMNSpark/JSMNSpark.cpp:263: multiple definition of `jsmn_init(jsmn_parser*)'
../../../build/target/user/platform-6/libuser.a(jsmn.o):jsmn.cpp:251: first defined here
collect2: error: ld returned 1 exit status
make: *** [96f1257404a962825da258c799f8ecbf27524770f0c81b2df0b38f3f6a61.elf] Error 1
Error: Could not compile. Please review your code.

You seem to have two libraries imported that both define the same functions, and the linker can’t decide which definition to use.

Oh sweet goodness… thanks, @ScruffR!
I had been popping in some various libraries to see if I could get them to work, and completely forgot to remove them after manually adding the ones I needed.
(for others / future reference)
Wow, that was easy… I just clicked on <> to view the stats of the current app, then noticed the Included Libraries list had a few too many. Removed the unnecessary ones, and it’s working like a champ now.

-Much obliged.