The Spark team has made integrating with hardware incredibly easy through the cloud API. All I was missing was a quick way to work generically with any Spark project I made without constantly playing with curl or writing new code.
I also needed an app for my home air-environment project, which has to log data and graph it.
So I built this web app in my spare time over the last few weeks. I figured the features I needed for myself could be useful for the community here as well.
You can read variables and execute functions. You can also create schedules to log any variable at various intervals, then plot the logged data series in a graph.
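Anything your firmware registers with the cloud will show up in the tool. For reference, here's a minimal sketch, assuming the standard Spark.variable()/Spark.function() firmware API (the names and the dummy reading are just illustrative):

```cpp
// Minimal sketch exposing one variable and one function to the cloud.
// The names "temperature" and "setLed" are only examples.
double temperature = 0.0;

int setLed(String command);

void setup() {
    pinMode(D7, OUTPUT);
    // Register with the Spark cloud so these show up in the tool.
    Spark.variable("temperature", &temperature, DOUBLE);
    Spark.function("setLed", setLed);
}

void loop() {
    // Placeholder: swap in a real sensor reading here.
    temperature = analogRead(A0) * 3.3 / 4095.0;
    delay(1000);
}

// Cloud-callable function: drives the on-board LED on D7.
// Returns 1 on success, -1 on a bad command.
int setLed(String command) {
    if (command == "on")  { digitalWrite(D7, HIGH); return 1; }
    if (command == "off") { digitalWrite(D7, LOW);  return 1; }
    return -1;
}
```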
Here’s a sample graph of temp/humidity in my living room.
Hi, this tool looks great. I also want to pull data off my Spark Core, but I don't want to play with Google scripts, so your solution seems ideal!
However, it's not working for me: when I browse, no variables are shown. Do I have to flash particular code to the Spark Core?
I had declared the variables. However, after a few flashes the tool could finally browse the Spark, and I could set up a schedule! That's exactly what I wanted to do, and it's pretty simple. Thanks, Kareem!
One thing, though: the curve isn't continuous for some reason, and I don't really get why. See the picture below. Is it something related to Highcharts, or to the data?
The gaps appear when the tool couldn't reach the device. For example, if your device is offline for an hour, you'll see a gap.
You can see the last monitoring error on the schedules page: hover over the last-failure time.
That may not have been the issue, @Fabien. I think there was a bug in determining whether a schedule was due to query the device; I was seeing consistent 10-minute gaps.
I made the adjustment. Aside from times when the device is offline, the data should be smoother now.
Yes, it's much better now. Thanks!
One more point: is there a way to retrieve the data? I could only see it as a graph, but I couldn't find a basic table with the data. That might be useful for debugging/analysis.
Is there a way to schedule a function call w/ static params and log the result? That would be nice for cores running the tinker firmware: they could run an analogRead(A0) call and log the result instead of writing custom Spark Core firmware to expose a variable.
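For context, tinker works through cloud functions rather than variables. A simplified sketch of an analogread-style function (not the actual tinker source; the pin parsing here is a bare-bones assumption) looks like:

```cpp
// Simplified, tinker-style cloud function: takes a pin name such as
// "A0" as its string parameter and returns the raw ADC reading, so a
// scheduler could call it with a static param and log the return value.
// NOTE: an illustrative sketch, not the real tinker source.
int readAnalog(String pin) {
    if (pin.length() != 2 || pin.charAt(0) != 'A') return -1;
    int number = pin.charAt(1) - '0';
    if (number < 0 || number > 7) return -1;
    return analogRead(A0 + number); // A0..A7 are consecutive pin ids on the Core
}

void setup() {
    Spark.function("analogread", readAnalog);
}

void loop() {
}
```

Since a cloud function returns an int, logging the return value would cover this use case with no custom firmware beyond tinker itself.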
Here’s a related idea then… If the user could do an analogRead on A0 with the tinker firmware, and then specify that they have something like a TMP36 temperature sensor hooked up to it, you could convert the analog reading to a temperature value before you log it.
You could support different kinds of sensors that output a simple analog/digital reading and convert the raw reading into more useful data on the server.
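The TMP36 conversion itself is only a couple of lines; here's a sketch, assuming the Core's 12-bit ADC and 3.3 V reference (the math is the same wherever it runs, server-side or not):

```cpp
// TMP36 conversion sketch: raw ADC count -> degrees Celsius.
// Assumes a 12-bit ADC (0..4095) with a 3.3 V reference, and the
// TMP36's 500 mV offset at 0 degrees C plus 10 mV per degree.
double tmp36ToCelsius(int rawReading) {
    double volts = rawReading * 3.3 / 4095.0; // counts -> volts
    return (volts - 0.5) * 100.0;             // volts -> degrees C
}
```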
I think the only benefit is that it doesn't require someone to write any firmware code. If your Spark Tools goal is to monitor Spark projects with zero code, I think it makes sense to use functions over variables so that new users can stick with the tinker firmware.
Once they've settled in and are ready to write custom firmware, using variables would probably make more sense.