Can I get serious about using a Spark Core?

Looking at using the Spark Core in my next relatively serious project since it fits nicely, but a few things are unclear. The following questions might come off a bit snarky, but I'm honestly asking out of my ignorance of how the Spark ecosystem currently works. I want to keep from clouding my judgment on which micro platform I choose to base my new build on. I'm leaning towards sticking with the Arduino Micro because of my familiarity with it.

Libraries? I want to use Adafruit’s 9DOF breakout to prototype, for example; how hard is it to port libraries? I just tried setting one up and it’s looking for things like Arduino.h and WProgram.h. I have little time/skill to build a library from scratch.

GitHub? I’ve done some test code, but I see no simple way to use version control. Cut and paste?

Find and replace? Is there a simple way to use the text editor of my choice? Are we still talking cut and paste?

inof8or, the Spark has an amazingly supportive community, ready to help! Porting of libraries is a common request and many users are more than happy to do the port.

The Spark’s web IDE is still being developed, with new features added continually. Many “serious” developers choose the Spark CLI (command line interface) or a local build chain, allowing them to use their own version control tools and favorite editor. There are tutorials on using Eclipse and NetBeans for building locally as well.
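A minimal sketch of what that CLI-plus-git workflow can look like (directory and file names here are made up, and the spark-cli compile step is shown commented out since it needs an account and a core; check `spark help` for the exact command names in your version):

```shell
# Sketch of a CLI-based workflow: the project lives in git, and the web IDE
# is bypassed entirely. Run in a scratch directory with made-up names.
cd "$(mktemp -d)"
git init -q 9dof-project
cd 9dof-project
git config user.email "you@example.com"   # local identity for the demo
git config user.name  "you"
echo '// firmware goes here' > app.cpp
git add app.cpp
git commit -qm "initial import"
# spark cloud compile .    # then hand the directory to spark-cli
git log --oneline          # normal git history, normal editor, no cut-and-paste
```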

Eventually, there will be GitHub integration for libraries and user code. Until then, there are excellent options you can choose from to get you going. :smile:


inof8or, I quickly looked at Adafruit’s unified sensor library and I like it. I already ported an LSM303 library for a member, but I prefer Adafruit’s. If you want, I can take a shot at porting it, but you will have to test it :wink:

Just stumbled across the CLI option right before you replied! I’m going to install it and see how it goes.

I’m going for building an OSHW project, so reducing barriers is a big thing. I’m on Ubuntu; I assume the point of building the CLI/IDE on Node is so it plays well with every OS, right?

Please! I prefer it over Pololu’s library for the LSM303. It will probably be better for the project you ported it for. I love Adafruit because they really care about producing usable and clean libraries.

Ok, give me some time and I will work on that and post it on my github when it’s ready to test :smile:


inof8or, I have everything ported but now I need to compile everything to make sure that works. I will be setting that up tomorrow or so and if all goes well, I will post on my github. :smile:


This is great, thank you, I really appreciate you taking the time! You have probably already noticed this, but I will bring it up anyway just so the scope of this is out there. The unified sensors library seems to just set the parameters for conversions, which is derived from the way Android does it. Then there are libraries for the individual sensors: in this case, for the 9DOF, one for the LSM303 (accel/compass), another for the L3GD20 (gyro), and a 9DOF library for combining sensor data (IMU). Finally, the Wire library is needed, but I assume this has already been done or there is some equivalent of it, right?

Regardless, I recognize this is probably a bit of an undertaking; I’ll work on getting you the gear if you are going for it. I would love to see how it’s done so I can understand a bit myself. If you could upload to git from the point of the original libraries so I can pull and look at diffs, that would be awesome. That way I might be able to help eventually.

Missed this before my last post; you’re fast!

Got the CLI working a minute ago! Super excited! The web IDE is nice, but it’s hard to replicate what everybody has in their local development environments. The Spark CLI was a good idea, and it’s relatively self-explanatory too, with good help prompts.



We are pushing big time on the Spark CLI. If you look at the GitHub issues, I added 3 enhancements just last night, and more will be added as new features for :spark: in general get rolled out.

If you get yourself a :spark: core, you are signing up for an adventure that doesn’t just stop at the hardware! New stuff happens, the web gets better, and you learn more than you can ever imagine :smile:

Hope this helps!


I’m a little embarrassed to say this, but I backed the Kickstarter and I’m just now considering using it for my project.

So, I’ve had one for a while :smiley: I’ve just been a bit busy developing on the Arduino Micro and had some firmware hangups with the Spark initially. Recently, though, I fried the poor little Micro. I’m considering just using the Spark instead of buying a new Micro.

If my project is successful, though, it will mean a lot of interest in buying cores to replicate it. I want to be sure folks greener than myself can pick it up, replicate my results, and modify and contribute improvements. I’ll explain the project in a different thread. I’m probably going to need some help, much like peekay123 has offered. I’ll have to see what I can do to get capable and interested contributors all the necessary hardware.

inof8or, I completed the testing of the ported libraries and all seems to work well. I posted the entire library on my github. You will need to pull the files you need together to create a project. I did my testing by locally compiling the code instead of using the web IDE. Take a look and let me know if you need assistance :smile:

Note that for I2C, there must be pull-up resistors on the SDA and SCL lines for it to work. I am not sure if these are present on the Adafruit board.

Thanks peekay! I just tried to “cloud” compile with the CLI:

Adafruit_9DOF.cpp:18:18: fatal error: Math.h: No such file or directory
compilation terminated.
make: *** [Adafruit_9DOF.o] Error 1

Seems to want math.h. Isn’t this part of GCC?

I’m a little unclear on how the cloud compiler is interpreting the build path, I guess. I put all the necessary files in the same directory and pointed the “cloud” compiler there. Probably the wrong way to set things up… Is it necessary? It’s less than optimal, anyhow.

inof8or, I did not try it with the spark cli but I will give it a shot. I will copy what I did with my local compile for comparison.

UPDATE: I found a few glitches in the libraries with the syntax of some includes. You may want to grab the files again.

inof8or, I got the spark cli compile to work after the library glitches were fixed. Here is the list of files I put in a single directory (9DOF) to compile:

The last file is Adafruit_9DOF-master\examples\tester\tester.pde renamed to test.cpp.

Let me know how it goes :smile:


I still get the same error; it’s looking for math.h. Must be something wrong with my setup…

I’m a bit fried from so much copying and clicking; I need to figure out a better way to deal with the workflow of this. I’ll have a better look tomorrow with fresh eyes.

Argggghhh!!! I missed some changes to tester.cpp (formerly .pde)!

Are you using the Spark CLI? It is the simplest and fastest way to do it. After the fixes, everything compiled fine; I did not get any math.h errors, however. Also, did you clone my git or just copy from it? If you cloned it, then do a “git pull” to get the latest files. Keep me posted :smile:

Success!! Just got it working using the spark cli!

For some reason, when cutting and pasting to put the proper things in the build path, second copies of files with a trailing tilde were created, leading to multiple definition errors. I think my text editor was creating these. The math.h dependency I never figured out, though; it must have been a fluke.
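For anyone who hits the same thing: many editors save backup copies with a trailing tilde (e.g. `app.cpp~`), and if those sit in the build directory they get picked up and compiled too. A quick cleanup sketch (scratch directory and made-up file names):

```shell
# Demonstrate the editor-backup problem in a scratch directory.
cd "$(mktemp -d)"
touch app.cpp app.cpp~        # the "~" copy is an editor backup file
rm -f -- *~                   # delete backups so the compiler ignores them
ls                            # only app.cpp remains
```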

Is there a way to direct the include/build path to a “library” folder as opposed to copying everything into one directory?

Thank you for doing this, peekay. The 9DOF was mainly what I was worried about getting to work.