I’m currently pursuing a commercial project and wanted to get everyone’s thoughts on the Spark Core vs the Electric Imp. I like both, but am intrigued by the open-source nature of the Spark Core vs the Imp. It would be nice not to have major dependencies for my product permanently tied to a cloud service I don’t have much control over (and have to pay for). Forgive me if any of my questions below seem amateurish - I’m still trying to wrap my head around this stuff and have a lot of gaps in my knowledge.
Can anyone help me understand the key limitations of the following:
1.) 20KB vs 60+KB RAM - is this as big of a deal as it looks?
2.) Electric Imp Agents - I’m a little confused about how the Spark Core doesn’t have something like this - I thought this was the “Cloud” portion of the Spark Core/Imp. How would I create this functionality using the Spark Core, or is that even possible?
3.) Security - AES for the Spark Core vs. TLS for the Imp; is the Spark Core secure enough in comparison?
4.) Wiring vs. Squirrel - any advantages/disadvantages here?
Has a timeline been mentioned for the next, more powerful iteration of the Spark Core? My go-to-market window is small, which may end up forcing my hand in choosing the Imp.
I can only offer some ideas, but @zach is the best person for this!
RAM: there are cases where users run out of RAM on the Core, depending on what they are trying to do. If a project hits memory limits with a bigger RAM, it will definitely hit them on a smaller one.
So bigger RAM = better in most cases.
The Core is built with security in mind. There’s also key rotation (not sure of the exact term) for messages between the Cloud and the Core, which makes security stronger. Not sure about the Imp.
Wiring vs. Squirrel: I guess it’s about which IDE/language you are more comfortable developing in. But maybe I’m wrong.
There’s been some discussion recently about commercial-purpose Cores, so let’s wait for the team to fill us in. What differences are you hoping to see? Otherwise, a customised shield add-on might be able to do the job.
Thanks for the quick reply! It’s hard for me to know my RAM requirements at this point - I will mainly be switching valves, but on a schedule that is loaded from the cloud. I will also need the unit to report data back to the cloud - such as sensors reporting leaks so I can message the end customer. These would be pressure-based, so I would need to establish a baseline and then report any deviation.
As far as differences - mainly I’d like to see more RAM, and possibly “Imp Agent”-style functionality on the Spark Core. I’m sure there will be other things as I progress, but that’s all I have for now given the extent of my knowledge.
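To make the leak-detection idea concrete, here’s a rough sketch in plain C++ (the struct name, the tolerance value, and the units are all just placeholders I made up, not a real implementation - on the Core this would live inside the Wiring loop()):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Hypothetical leak detector: the baseline is the average of a
// calibration window of pressure readings; any later reading that
// deviates by more than `tolerance` is flagged so the device could
// report it to the cloud.
struct LeakDetector {
    double baseline = 0.0;
    double tolerance;   // allowed deviation in PSI (assumed value)

    explicit LeakDetector(double tol) : tolerance(tol) {}

    // Establish the baseline from an initial calibration window.
    void calibrate(const std::vector<double>& samples) {
        double sum = 0.0;
        for (double s : samples) sum += s;
        baseline = sum / samples.size();
    }

    // True when a new reading deviates too far from the baseline.
    bool isLeak(double reading) const {
        return std::fabs(reading - baseline) > tolerance;
    }
};
```

The RAM cost of something like this is a handful of doubles, which is why a schedule-plus-sensors workload by itself shouldn’t stress 20KB.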
Hi @Jag,
I'm not really familiar with the Electric Imp and its Agent feature, but have you had a look at the recently added Publish feature of the Core?
As I understand it, the Agent is not quite the same, but for what I would use it for, I could also use Spark.publish().
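Just to illustrate the publish/subscribe shape in plain C++ (this is only a toy in-process event bus I made up for illustration - the real Spark.publish() sends the event through the Cloud, and the names here are not the actual firmware API):

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Toy event bus mimicking the shape of Spark.publish()/Spark.subscribe():
// named events carry a small string payload to whoever registered a
// handler for that name. Purely illustrative; the real calls go through
// the Spark Cloud, not an in-process map.
class EventBus {
    std::map<std::string,
             std::vector<std::function<void(std::string)>>> handlers;
public:
    void subscribe(const std::string& name,
                   std::function<void(std::string)> handler) {
        handlers[name].push_back(std::move(handler));
    }
    void publish(const std::string& name, const std::string& data) {
        for (auto& h : handlers[name]) h(data);
    }
};
```

So a Core could publish a "leak" event with the pressure reading as the payload, and a listening app reacts to it.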
From what I got of your use-case description, I wouldn't immediately see a RAM issue - unless you do some heavy processing or use some RAM-hungry libs.
One plus for the Core is the option to either run your own Cloud (once it's open sourced) or run completely Cloud-detached by implementing your own TCP/UDP communication.
Another big plus, I'd think, is this very forum - I'm not sure whether Electric Imp has a comparably agile community helping out when you happen to get stuck.
I completely agree -
my vote is for open source and the forum.
No other forum has grown this much in such a short time!
If you are limited by RAM and are working on a complicated project, you could use another microcontroller as a master.
For simple projects, just the Core; for other projects, Spark + MCU.
I think the price is good, unless you're going to mass production.
I think the link-level security is the same AES in both cases. The hard part of doing TLS on a small micro is certificate management. I don’t know enough about the Imp to say how it manages certs, but I can say that on Spark, they have chosen a good, well-founded public-key crypto model that is still lightweight enough not to spend all the resources just doing cert and key management.
RAM can be tight on the Spark Core right now, but the team has not spent much time optimizing, so there is room for improvement. For things like your valve schedule - let’s say it is 50K bytes for some reason - you can add a $2 external SPI 128Kx8 RAM. Or you could add an SD card. Or use the on-board flash, which has some problems right now that will get worked out. RAM is only a problem if you have no room or budget for external expansion. You might find that logging data to an SD card is a good feature for your project.
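To sketch why a big schedule doesn’t have to live in RAM: you can parse one entry at a time from external storage and act on it. This is plain desktop-style C++ (std::istringstream just stands in for an SD-card file handle, and the "HH:MM valve state" entry format is invented for the example):

```cpp
#include <istream>
#include <sstream>
#include <string>

// One schedule entry: a time, a valve number, and the desired state.
struct ScheduleEntry {
    std::string time;   // e.g. "06:00"
    int valve;          // valve index
    int state;          // 1 = open, 0 = closed
};

// Stream the next entry from storage; returns false at end of data.
// Only one entry is ever held in RAM, no matter how big the schedule.
bool nextEntry(std::istream& in, ScheduleEntry& e) {
    return static_cast<bool>(in >> e.time >> e.valve >> e.state);
}
```

The same loop shape works whether the stream is an SD card, external SPI RAM, or a chunked download from the cloud.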
The language for Spark is Wiring, but you have access to all of C++. Particularly with local builds, you can do anything, including turning off the Arduino-style Wiring preprocessor and writing straight C/C++ if you want to. It is the standard ARM gcc cross-compile flow that you would use for any embedded ARM project.
This is a question we get asked sometimes, happy to help provide extra info:
1.) The Spark Core is designed to be easy to develop and prototype on, as well as something you can bring to scale when going to production. The current chip has 20KB of RAM, but anything you write on the Core now will be easily portable to future models. If you’re just tossing a few values around, it tends to be more than enough. If you’re driving large displays, you might have to get creative at the moment.
2.) This is something we’ve started to play with, I would not be surprised if you see this sometime this year.
3.) Security on the Core is something we spend a lot of time thinking about: our protocols are open sourced, each Core gets a unique public/private keypair, and the Core maintains a 128-bit AES-CBC session with the cloud while it’s online. You also get this same level of security when you’re running the local cloud (still in development).
4.) On the Core you have free rein to modify everything down to the bootloader, and that code is all open sourced. This means you don’t pay us a monthly subscription, you can run your own server, and your code isn’t sandboxed. It also means you have the benefit of an awesome community.
5.) You can communicate with your Core directly without using somebody else’s servers - if you just want to move a bunch of data around locally or control things locally, you can!
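As a rough illustration of that local, cloud-free control (a desktop C++ sketch, not Core firmware - the loopback sockets stand in for a TCP server on a real Core, and the "on" → "ok:on" protocol is something I invented for the example):

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <string>

// One round trip between a "controller app" and a "Core", both on
// localhost in a single process. Small messages on loopback are
// buffered by the kernel, so no threads are needed.
std::string localExchange(uint16_t port, const std::string& cmd) {
    // "Core" side: listen for a controller on the local network.
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    int opt = 1;
    setsockopt(srv, SOL_SOCKET, SO_REUSEADDR, &opt, sizeof(opt));
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = htons(port);
    if (bind(srv, (sockaddr*)&addr, sizeof(addr)) != 0) return "bind failed";
    listen(srv, 1);

    // "App" side: connect and send a command. On loopback, connect
    // completes as soon as the kernel queues it on the listen backlog.
    int cli = socket(AF_INET, SOCK_STREAM, 0);
    connect(cli, (sockaddr*)&addr, sizeof(addr));
    (void)!write(cli, cmd.c_str(), cmd.size());

    // Back on the "Core": accept, read the command, acknowledge it.
    int conn = accept(srv, nullptr, nullptr);
    char buf[16] = {};
    (void)!read(conn, buf, sizeof(buf) - 1);
    std::string reply = "ok:" + std::string(buf);
    (void)!write(conn, reply.c_str(), reply.size());

    // The app reads the acknowledgement back.
    char rbuf[32] = {};
    (void)!read(cli, rbuf, sizeof(rbuf) - 1);
    close(conn); close(cli); close(srv);
    return rbuf;
}
```

No third-party server ever sees the traffic - the whole exchange stays on your own network.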
I know this won’t be that relevant to you since you seem very competent, but for a newbie like myself: I never touched my Imp after I got my Spark Core. The main reason is that it couldn’t support the Arduino language, which all of the newbie videos and tutorials out there are written in. The idea of learning Squirrel when I had been trying so hard to learn Arduino didn’t sound appealing. I have had my Imp since day one, and the closest I got was turning on one pin. With the Spark I can do anything, it seems. Don’t get me wrong, I am still a bit lost without understanding REST APIs - the Imp made things easier with its plugins and Agents. Again, just a newbie perspective.
The Agent is one of the things I really like about the Electric Imp. Instead of having the microcontroller do all the work of calling a URL and extracting the things you want from it, the Agent can do it all.
An example is my YouTube follower counter. It calls the YouTube API and gets everything returned; it then finds only the part I want to use, which is the number, and sends that number to the microcontroller. In my case it is below 100, so I only send two characters to the microcontroller itself.
All this was done while the microcontroller was doing something else, and it was only interrupted by the two characters it got from the Agent.
It’s the same the other way around: you can set up all the functions in the Agent to make the calls to different things, and only have the microcontroller tell the Agent to execute the function.
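If it helps, the Agent’s extract-the-number step could be sketched like this in plain C++ (the real Agent is written in Squirrel, and the JSON field name here is just an assumption about the API response, not the actual one):

```cpp
#include <string>

// The Agent's job in the follower-counter example: take the big API
// response, pull out just the count, and forward only those few
// characters to the microcontroller. Naive string search, no JSON
// library, purely for illustration.
std::string extractCount(const std::string& response) {
    const std::string key = "\"subscriberCount\":\"";
    std::string::size_type pos = response.find(key);
    if (pos == std::string::npos) return "";
    pos += key.size();
    std::string::size_type end = response.find('"', pos);
    return response.substr(pos, end - pos);   // e.g. "87": two chars
}
```

The device never parses the full response; it only ever receives the couple of characters the Agent forwards.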
This is a tricky one - each language has its own strengths and weaknesses. With Squirrel I absolutely love how you can just start timers, instead of checking millis() and calculating the elapsed time. But I don't like how each pin is set to output/input and high/low; on the other hand, I again love how easy it is to define pins for I2C and UART communication, and to change the PWM frequency and such.
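For comparison, the millis()-checking pattern looks roughly like this in plain C++ (the timestamps passed to ready() stand in for the Wiring millis() call, and the class is just an illustration of the idiom, not a library):

```cpp
#include <cstdint>

// The poll-millis() idiom: instead of a Squirrel-style one-shot timer,
// you compare a millisecond counter against the last firing time on
// every pass through loop().
struct IntervalTimer {
    uint32_t last = 0;
    uint32_t interval;   // period in milliseconds

    explicit IntervalTimer(uint32_t ms) : interval(ms) {}

    // Returns true once per elapsed interval; call it every loop().
    // Unsigned subtraction keeps this correct across counter wrap.
    bool ready(uint32_t now) {
        if (now - last >= interval) {
            last = now;
            return true;
        }
        return false;
    }
};
```

With Squirrel you'd instead register a callback once and never poll, which is the convenience being praised above.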