Getting Spark to use API info

I’d like to use a Spark to display info from an API (weather, popular Twitter hashtags, exercise amount from Moves) by moving a servo, flashing an LED, or showing something on a display. The hardware part I get, but what do I need to learn on the software side to do this? I’ve heard the following mentioned:


I get the impression I don’t need all of the above, just some combination, and not necessarily an expert understanding of each either. I still don’t know where to start, though, or which tutorials are relevant and useful.
Could someone point me in the right direction and/or explain what each bit does in a general, overall sense? It would be much appreciated; I’m a bit stuck now.

Webhooks should do the job perfectly. :smile:

Thanks, messing around with it now.

I’ve seen lots of comments about people “waiting for webhooks”. What are they waiting for and who are they waiting for them from? :smile:

Waiting for it to be released on the :spark: cloud :slight_smile:

The lack of an https client is a big limitation (and a surprise to me!). I was really excited about having a wifi-enabled dev board, but you can’t do much without https.

I don’t fancy the lack of security in using a 3rd-party proxy, and I don’t really want my computer (e.g. a Raspberry Pi) to be a dependency of the Spark; that kind of defeats the purpose. That’s what I already do with my Arduino, and I didn’t expect it from the Spark Core.

So essentially all you can do is read unauthenticated content from the web (e.g. an RSS feed); you can’t use any APIs that need auth (e.g. OAuth2), and you can’t publish content (e.g. send a tweet).

What are people doing to get around this, or is everyone just using Spark.publish() and then curl’ing from a server?
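For concreteness, the Spark.publish()-then-curl pattern boils down to reading the Server-Sent Events stream that the Spark cloud exposes. Here’s a minimal Python sketch of the server side; the endpoint URL and the JSON payload shape are assumptions based on my reading of the docs of the era, so adjust for your setup:

```python
import json

# Hypothetical event-stream endpoint of the era -- adjust to your cloud.
EVENTS_URL = "https://api.spark.io/v1/events?access_token=YOUR_TOKEN"

def parse_sse(lines):
    """Parse Server-Sent Events lines into (event_name, payload) pairs.

    The event stream sends blocks like:
        event: temperature
        data: {"data":"21.5","coreid":"abc123"}
    separated by blank lines.
    """
    event_name = None
    for line in lines:
        line = line.strip()
        if line.startswith("event:"):
            event_name = line[len("event:"):].strip()
        elif line.startswith("data:") and event_name is not None:
            payload = json.loads(line[len("data:"):].strip())
            yield event_name, payload
            event_name = None

# Demo on a canned stream; in real use, iterate over the HTTP response
# from EVENTS_URL (e.g. urllib.request.urlopen(EVENTS_URL)).
sample = [
    "event: temperature",
    'data: {"data":"21.5","coreid":"abc123"}',
    "",
]
for name, payload in parse_sse(sample):
    print(name, payload["data"])  # temperature 21.5
```

From there the server can do whatever authenticated https work the core itself can’t.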

Hi @sej7278,

I think a lot of us are waiting for the feature the Spark team is calling “webhooks”, where the Spark cloud acts as the proxy for https transactions on the Internet. Since the only secure connection from the core to the Internet is to the Spark cloud, this makes a lot of sense. The idea is that you define an action to be taken by the cloud server when your core does a Spark.publish(), or an action to be taken on the core via Spark.function() or Spark.variable() when another host hits an API on the Spark cloud server. The security implications here are important, so the team wants to get it right.
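The inbound half of that flow already works today: another host can hit the cloud REST API to trigger a Spark.function() on the core. A rough Python sketch — the URL pattern and parameter names here are my recollection of the docs, so treat them as assumptions:

```python
from urllib.parse import urlencode
from urllib.request import Request, urlopen

API_BASE = "https://api.spark.io/v1"  # cloud endpoint of the era

def call_core_function(device_id, func_name, arg, access_token):
    """Build the POST request that asks the Spark cloud to run a
    Spark.function() registered on the core. Pass the returned
    Request to urlopen() to actually fire it."""
    url = "%s/devices/%s/%s" % (API_BASE, device_id, func_name)
    body = urlencode({"access_token": access_token, "args": arg}).encode()
    return Request(url, data=body)  # data= makes it a POST

req = call_core_function("0123456789abcdef", "setLed", "on", "YOUR_TOKEN")
print(req.get_full_url())
# In real use: json.load(urlopen(req))["return_value"]
```

Webhooks would add the outbound half, so a Spark.publish() can make the cloud call out to some https service on your behalf.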

The problem with https is not the link crypto per se, since the Spark cloud connection is AES-encrypted. The problem is certificate management and checking with the small resources available: in my estimation, there would be no RAM left if even one certificate had to be stored and checked on the core.

If you don’t like the idea of the Spark cloud being your proxy, the team has committed to providing a Spark cloud you can run on your own server instead so you control it.

There are a lot of people building services around Spark.variable(), like the service that @kareem613 started. Folks have also figured out how to use Google apps to pull data from Spark.variable() into a Google Docs spreadsheet. Lots of stuff can be done that way too. I have put together a bunch of tutorials on how to use the Spark cloud API from javascript in a browser, which makes sense as long as you don’t put your access_token out on the net. I use these html/javascript webpages on my computers and phone via a service like Dropbox, where I can load up a private web page and see and control my cores.

So look around: people are doing cool stuff and have found ways to not be limited by the lack of an https client until the webhooks feature is fully baked!


I’ve just knocked up a TCP server in Python which runs on my Raspberry Pi and tweets whatever input it gets on the LAN from the Spark. It’s not great, but it’s more secure than using an API proxy service; I’m no cloud fan.
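For anyone wanting to do something similar, the skeleton of such a bridge is small with Python’s standard library. This is a sketch, not the poster’s actual code; the tweet-posting part is left as a placeholder callback, since that needs a Twitter library and credentials:

```python
import socketserver

class SparkHandler(socketserver.StreamRequestHandler):
    """Accept one line from the Spark core on the LAN and hand it off."""
    def handle(self):
        line = self.rfile.readline().decode().strip()
        self.server.on_message(line)   # e.g. send it as a tweet
        self.wfile.write(b"OK\n")      # simple ack back to the core

def make_server(host, port, on_message):
    """Build the TCP server; call .serve_forever() on it to run."""
    srv = socketserver.TCPServer((host, port), SparkHandler)
    srv.on_message = on_message        # stash the callback on the server
    return srv

# Usage (blocks forever):
#   make_server("0.0.0.0", 5000, tweet_it).serve_forever()
# where tweet_it is your own handler that posts to Twitter.
```

The core then just opens a TCPClient connection to the Pi’s LAN address and writes a line.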

I could probably tweak it to scrape pages and send stuff back to the Spark too; I just didn’t expect to have to do that. I assumed an ARM would have the grunt to handle SSL, but it’s the RAM that’s lacking, I guess.

So to a degree the Spark is not much better than an Arduino with a wifi/ethernet shield or radio module; it still needs a real PC to do its grunt work.

This is sort-of true in your use case, because you don’t want to use the Spark cloud. With the Spark cloud, it is much more powerful. The Spark still has more RAM (even though folks are running out), runs at a much higher clock rate, is physically smaller, can be lower power, etc. I don’t have to have another computer scrape web pages for my Spark; I can do it all on the core.

The great thing is that you get to decide! Don’t like the cloud? Don’t use it.

Not if the page uses https; you still need the PC to do it for you, and let’s face it, in 2014 anything useful is on https.

I’m not so much against using the Spark cloud if they get the security right; I certainly don’t want to use some random 3rd-party cloud service, though.
