Couple of questions on how the Spark Core works

I am thinking of buying a Spark Core, but I wanted to know a couple of things about it first.

  1. In what way(s) can a mobile device like an iPhone talk to the REST API? Is there an iPhone app, or does the Core host a web server?
     How does it compare to this YouTube video of REST with an Arduino?

  2. Can any code that is designed for an Arduino work with the Spark Core as well? For example, this.

  3. What do you mean when you say REST is automatically built in to the Core, and how does the cloud come into this?


I’ll try to answer to the best of my abilities:

  1. REST is a web convention. Your device (iPhone) can access these endpoints through a web browser (jQuery, Python, PHP, HTML, stuff like that), or through native apps, written in Objective-C for instance. Most programming languages can handle REST stuff. There is an iPhone app for the Spark Core, but this is mostly for configuring the WiFi and for doing some very basic testing (toggling/reading pins). Like the device in the video, the Spark also contains a CC3000 WiFi module which can connect to the internet, so I figure the general idea is mostly the same.

  2. The Spark should be considerably more powerful than your Arduino due to its faster processor. It should also be able to run most, if not all, Arduino code. However, since the Spark is based on a different architecture (ARM), some modifications might be required. Hardware-specific Arduino code in particular requires some extra attention. Some libraries on the Spark are included by default (OneWire, for example), while others have to be included manually. These are some of the things you’ve got to watch out for when porting code. Luckily, we’ve got a very helpful community that is, more often than not, more than willing to help you out. Maybe @peekay123 can tell you some more about Arduino vs. Spark compatibility, seeing as he’s one of those people who’s constantly porting all kinds of awesome stuff.

  3. That means that you won’t have to worry about any kind of connectivity structure. Everything is set up for you, and all you have to do to trigger a function is call a certain URL using your credentials (access token / Core ID). The URL goes to the Spark Cloud, which will then call the specified function on your Core. This is a fully encrypted connection, which ensures that you don’t have to worry about your privacy, or about someone controlling your appliances. The Cloud provides some nifty features, like the REST API, Server-Sent Events, and synchronised time. All in all, a really neat service. If you’re worried about being locked in, don’t be. Not only is there a local cloud available, which you can host yourself, but you can still connect with your Core via traditional means: TCP, UDP, stuff like that.
    You can check out the Docs for some more info about how the REST stuff is integrated with the Cloud, and maybe some examples to give you an idea of what it will look like in action. The Tutorial section is also a great place to read up on and learn some new things. There are some nice examples of web pages to be found there, which can interact with your Core via the Cloud/REST.
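To make the "call a certain URL" part concrete, here is a rough Python sketch of what such a REST call looks like from any client (phone app, web page, script). The device ID, function name, and token below are placeholders, not real credentials, and I'm assuming the classic `api.spark.io` endpoint shape from the docs:

```python
# Sketch of calling a function on a Core through the Spark Cloud REST API.
# DEVICE_ID, function name, and ACCESS_TOKEN below are placeholders.
import urllib.parse
import urllib.request

API_BASE = "https://api.spark.io/v1/devices"

def build_request(device_id, function_name, arg, access_token):
    """Build the POST request that would trigger a function on the Core."""
    url = f"{API_BASE}/{device_id}/{function_name}"
    data = urllib.parse.urlencode(
        {"access_token": access_token, "params": arg}
    ).encode()
    return urllib.request.Request(url, data=data, method="POST")

req = build_request("0123456789abcdef", "led", "on", "my-token")
print(req.full_url)
# urllib.request.urlopen(req) would actually send it -- that needs a real
# Core online and valid credentials, so it is left out here.
```

The point is just that there is nothing exotic involved: any language that can make an HTTPS POST can control the Core.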

I am by no means an expert on all things Spark/Arduino/Web/REST/Cloud related, but I think that makes it even better. Without understanding all of the above, I’m still able to make some really neat things, and learn a thing or two (thousand) on the way. The Spark is great because it works out of the box. You plug it in, connect with the app, and you’re ready to control stuff. You don’t have to worry about any of the connectivity stuff; it’s just there.

Final suggestion: BUY ONE, you won’t regret it ;)! (I know I haven’t so far…)

(Tagging some people for confirmation of the above, or possible corrections if I’m mistaken: @kennethlimcp, @Dave, @bko. <-- they know stuff. That should suffice :blush:)


Thank you so much! The reason I didn’t just immediately get a Spark Core is that I want to be able to run some projects showcased on the Arduino on the Core. If I’m certain I can, I will buy it right now. I’m interested in knowing what types of modifications the code would need.


Just go for it. Even if that boring old Arduino stuff doesn’t work (unlikely), you’ll come up with so many other ideas, which will make you eager for more Cores. The modifications depend on how the code is written, and how complex it is. A “blink an LED” sketch should work just fine on both. As soon as microcontroller-specific functions are used, you’ll probably have to edit them for the Core. Some modifications are as easy as adding #include "application.h", while others require more extensive changes. Like I said, @peekay123 should be able to tell you more about that, and @BDub might as well. They’re responsible for quite a few library ports and have a better understanding of what it takes to make certain stuff Spark compatible. In any case, if you’ve got problems with porting anything, there is a community out here more than willing to help out! It’s the most friendly and helpful bunch of people I’ve met on the great big Interwebz so far!
So don’t worry about compatibility, just get one and enjoy. The longer you wait to order one, the longer you’ll have to wait before the fun can begin :wink:

Wow, thanks for the kudos @Moors7! A lot of Arduino code can be ported to the Spark with little effort, but as @Moors7 pointed out, hardware-specific code requires special attention. A lot of the toughest libraries have already been ported, and more are being ported daily. The example code you pointed to can be ported with changes to the pin designations and little else. There may be adjustments needed on the hardware side due to the Spark’s lower 3.3 V operating voltage as well.

However, that example seems pretty lame for a Spark that has amazing Cloud connectivity and more processing speed! Think bigger! :stuck_out_tongue:


Thanks guys! It looks like I should get the Spark then.


Go ahead and get yourself a Spark Core. I’ve written a lot of code in pure Wiring (the Arduino language) and it all works fine on the Core. There are a few minor things to look out for – such as the fact that an int on a Core is 32 bits (the same as a long), while an int is 16 bits on an Arduino (and a long is 32 bits). But this is more of an issue if you declare ints on a Core for 32-bit math and then want to port your code back to an Arduino.
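The int-width difference is easy to demonstrate. As a sketch, the following Python snippet uses the standard `ctypes` module to mimic 16-bit (Arduino) and 32-bit (Core) int arithmetic; the same `32767 + 1` wraps around on one and not the other:

```python
# Mimic fixed-width C integer arithmetic with ctypes.
# c_int16 behaves like an Arduino int, c_int32 like a Core int.
import ctypes

def add_as_int16(a, b):
    """Add like a 16-bit Arduino int (wraps past 32767)."""
    return ctypes.c_int16(a + b).value

def add_as_int32(a, b):
    """Add like a 32-bit Core int."""
    return ctypes.c_int32(a + b).value

print(add_as_int16(32767, 1))  # -32768: overflows on an Arduino
print(add_as_int32(32767, 1))  # 32768: fine on a Core
```

So a counter that happily runs past 32767 on the Core will silently go negative when the same sketch is moved back to an Arduino.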

From the software point of view, the main issue in going from Arduino to Core is the libraries. As has been mentioned above, lots of libraries have been ported from Arduino to Spark. You see only a few of the “official” ones in the online firmware documentation. However, when you sign up for your free Spark account and open the web IDE, you can see a whole lot of other libraries that have been ported. So make sure that any Arduino projects of yours that use libraries have the necessary libraries ported over to Spark. Writing and porting libraries is not for the rank beginner, but if you need a library ported, post your request on this site and I’m sure that someone will take up your cause!

On the hardware side, pay attention to the fact that Core I/O is at 3.3 volts and may not be compatible with 5 volt devices. The “D” pins on the Core are 5 volt tolerant, however. I have hooked up some 5 volt 315 MHz and 433 MHz receiver modules to the Core using the VIN pin to supply about 4.4 volts as Vcc for these modules. When powering the Core from USB (5 volts), VIN will supply 5 volts minus one diode drop. The data pin on these 5 volt modules swings up to the module’s Vcc, and I had no problems connecting them to the D0 and D1 pins of the Core, even though that high level is above 3.3 volts.

If you want or need to level convert, you can make a simple level converter out of a transistor (e.g. 2N2222) and two resistors (47K will do nicely). Connect one resistor to the base of the transistor, the other resistor to the collector, and ground the emitter.

To convert 3.3 volts from the Core to 5 volts for external logic, connect the free end of the base resistor to an I/O pin on the Core, connect the free end of the collector resistor to +5 volts, and take the output to the external logic from the collector of the transistor. Make sure that your 5 volt supply’s ground is connected to the Core ground!

For inward conversion, the input from the outside world goes to the free end of the base resistor, and the free end of the collector resistor is connected to the Spark’s 3.3 volts; you will get 0 and 3.3 volts into a Spark I/O pin with any external DC voltage greater than about 1.7 volts. (Again, make sure that the grounds of the two power supplies are connected together.)

You can also use an opto-isolator in lieu of a transistor if you need to isolate power supplies (not connect their grounds together), albeit you will need smaller resistors to draw more current to drive the opto-isolator.
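As a quick sanity check on those 47K values, a few lines of arithmetic show why they work (the 0.7 V base-emitter drop and hFE of roughly 100 are assumed typical figures for a 2N2222, not measured values):

```python
# Rough numbers for the single-transistor 3.3 V -> 5 V level shifter above.
# All component values and transistor parameters are assumed typical figures.
v_drive = 3.3    # Core output-high voltage, volts
v_be = 0.7       # assumed base-emitter drop of a 2N2222, volts
r_base = 47e3    # base resistor, ohms
r_coll = 47e3    # collector pull-up resistor, ohms
v_supply = 5.0   # external logic supply, volts

i_base = (v_drive - v_be) / r_base   # current the Core pin must source
i_coll_sat = v_supply / r_coll       # collector current needed to pull low
gain_needed = i_coll_sat / i_base    # current gain required to saturate

print(f"base current: {i_base * 1e6:.1f} uA")        # ~55 uA, trivial load
print(f"gain needed to saturate: {gain_needed:.1f}")  # ~2, far below hFE ~100
```

So the Core pin only has to source about 55 microamps, and even a weak transistor saturates with huge margin, which is why the exact resistor values are not critical.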

There are also voltage-converter chips for multiple conversions and, of course, the Spark Shield Shield for Arduino compatibility – both voltage and pinout.

I’ve got a bunch of Arduinos around here, but I’ll be developing for Spark (the Core now, the Photon in March) in the future. Why not – the Spark is cheaper, faster, has more memory, and offers Internet connectivity. What more could you ask for?

P.S. One thing that you do get with Arduino is the Serial Monitor in the IDE, which is not available in the Spark web IDE. However, you can debug the Spark using serial I/O if you have a terminal emulator program on your development machine. I use PuTTY on Windows – it works just fine, as long as you open PuTTY only AFTER your Spark Core has finished flashing your latest firmware.

The Spark Dev (offline IDE) does have a Serial Monitor :wink: Get it here:
