Atomiot (formerly Spark Tools) - Monitor Spark projects with zero code… and other stuff

The Spark team has made integrating with hardware so easy through the cloud API. All I was missing was a quick way to work generically with any Spark project I made without constantly playing with curl or writing new code.
I also needed an app to monitor my home air environment project, which needs to log data and graph it.
So I built this web app in my spare time over the last few weeks. I figured the features I needed for myself could be useful for the community here as well.

Here it is. Spark Tools.

You can read variables and execute functions. You can create schedules to log any variable at various intervals, then plot the series of data in a graph.

Here’s a sample graph of temp/humidity in my living room.

Try it out on any connected Spark now. You’ll be logging data within two minutes!
http://sccb.azurewebsites.net/

Any feedback is appreciated. Especially if you find bugs!

I know Xively and Plot.ly can do some of this, but a few things didn’t work for me:

  1. They’re primarily for commercial uses
  2. They’re priced way too high for DIY projects
  3. They don’t take advantage of the amazing web integration the Spark team has created

I open sourced the .NET client for the cloud API here.


Cool stuff @kareem613!

I’m also more into C# than Java/Android because of what we use at work, so your .NET API has made my day :wink:

Thanks for open sourcing it!!! :thumbsup:


Nice! Yay C# and HighCharts! :slight_smile:

Hi, this tool seems great. I also want to extract data from the Spark Core, but I don’t want to play with Google scripts, so your solution looks great!
However, it’s not working for me: when I browse, no variables are shown. Do I have to load particular code on the Spark Core?

Hi @Fabien,

I think you’d need to make sure you have some variables exposed in your firmware (reference here: http://docs.spark.io/#/firmware/data-and-control-spark-variable).
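For anyone hitting the same thing, a minimal sketch exposing a variable looks roughly like this. This is a device sketch (not something runnable on its own), and the variable name and reading are just placeholders:

```cpp
// Minimal firmware sketch exposing a cloud-readable variable.
// "temperature" is a placeholder name; Spark.variable registers it so
// GET /v1/devices/{device_id}/temperature returns the current value.
double temperature = 0.0;

void setup() {
    Spark.variable("temperature", &temperature, DOUBLE);
}

void loop() {
    // Replace this with however your project measures its value.
    temperature = analogRead(A0) * 3.3 / 4095.0;
    delay(1000);
}
```

Once that's flashed, the variable should show up when the tool browses the device.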

Thanks!
David

That’s exactly it, Dave. Did you do that, @Fabien?
I don’t see errors in my error log.

I should probably link to documentation from the site.

I had declared the variables. However, after a few flashes the tool could finally browse the Spark, and I could set up a schedule! That’s exactly what I wanted to do, and it’s pretty simple. Thanks, Kareem!
One thing, though: the curve is not continuous, and I don’t really get why. See the picture below. Is it something linked to Highcharts, or the data?

The gaps show when it couldn’t access the device, e.g. if your device is offline for an hour, you’ll see a gap.
You can see the last error from the monitoring on the schedules page: hover over the last failure time.


That may not have been the issue, @Fabien. I think there was a bug in determining whether a schedule was due to query the device; I was seeing consistent 10-minute gaps.
I’ve made the adjustment. Aside from times when the device is offline, the data should be smoother now.

Yes, it is much better now. Thanks!
One more thing: is there a way to retrieve the data? I could only see it as a graph, but I couldn’t find a basic table with the data. It would be useful for debugging/analysis.

Good idea. A CSV download, or a table in the UI? Which do you think is best?

Just now saw this post. I’ll give it a try as soon as I have some time.

I’m still using the Google Drive Graphing.

I believe we could download the data as CSV, the same way we can download the picture.

Just wanted to chime in and say how awesome this is! Go @kareem613!!!


@kareem613 This is awesome!

Is there a way to schedule a function call with static params and log the result? That would be nice for cores running the Tinker firmware. Then they could run an analogRead(A0) call and log the result instead of writing custom Spark Core firmware to expose a variable.

Good idea @Hypnopompia . I’ll add it to my backlog.


Cool. That’s what I was thinking. On the todo list!

Here’s a related idea then… If the user could do an analogRead on A0 with the tinker firmware, and then specify that they have something like a TMP36 temperature sensor hooked up to it, you could convert the analog reading to a temperature value before you log it.

You could support different kinds of sensors that output a simple analog/digital reading, and do the conversion from raw reading to more useful data on the server.
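As a sketch of that server-side conversion, here's what the TMP36 case could look like. The constants are assumptions: a 12-bit ADC (0-4095) with a 3.3 V reference, which matches the Spark Core's analog inputs, and the standard TMP36 transfer function of a 0.5 V offset with 10 mV per °C:

```cpp
// Convert a raw Spark Core ADC reading into degrees Celsius for a TMP36.
// Assumes a 12-bit ADC (0-4095) with a 3.3 V reference; adjust both
// constants for other boards or sensors.
double tmp36Celsius(int raw) {
    double volts = raw * 3.3 / 4095.0; // ADC counts -> volts
    return (volts - 0.5) * 100.0;      // subtract 0.5 V offset, 10 mV per degree
}
```

So a raw reading around 930 (about 0.75 V) would log as roughly 25 °C. The same shape of lookup-plus-formula would work for other simple sensors.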

The end result there is a log of temp data. This was my original purpose for this web service.
Spark variables are perfect for this.

Is there a benefit of functions over variables for this purpose?

I think the only benefit is that it doesn’t require someone to write any firmware code. If your Spark Tools goal is to monitor Spark projects with zero code, I think it makes sense to use functions over variables so that new users can use the Tinker firmware.

Once they get settled in and are ready to write custom firmware code, then using variables would probably make more sense.
