Best framework for data logging

I'm trying to brainstorm the most efficient layout for my Particle project. I've got a Spark Core that reads sound intensity every minute. It also has a switch to turn the measurement on or off and start a new measurement session (a session typically lasts around 18 hours).
So I would like to store an integer value every minute on a server, so that a user can go to that server in a web browser, see the different sessions, and select one to visualise it in a graph.
In the future I plan to let multiple users use multiple Sparks to acquire and visualise data.
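
For context, the acquisition side currently looks roughly like this (the pin choices are just placeholders):

```cpp
// Rough shape of the acquisition firmware: sample sound intensity once a
// minute while the session switch is on. A0/D0 are placeholder pins.
const int soundPin = A0;
const int switchPin = D0;
unsigned long lastSample = 0;

void setup() {
    pinMode(switchPin, INPUT_PULLUP);  // switch pulls the pin low when on
}

void loop() {
    bool sessionRunning = (digitalRead(switchPin) == LOW);
    if (sessionRunning && millis() - lastSample >= 60000UL) {
        lastSample = millis();
        int level = analogRead(soundPin);  // 12-bit reading, 0-4095
        // This is the value I want to get onto the server once a minute.
        Spark.publish("sound-level", String(level));
    }
}
```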

I already came across a few solutions on this forum, but I'm wondering whether I've covered all the options.

1. Put the Spark data in a Google Doc. Not really an option here, since it doesn't scale to multiple users/multiple Cores.

2. Push the Spark data directly into a MySQL database on the server, as in https://particle.hackster.io/yleguesse/spark-non-invasive-smart-electricity-meter. This sounds like a good idea, although it's only one-way communication from the Spark to the server (see the rough sketch after this list).

3. Set up a Node.js server with MongoDB to query the Spark. This also sounds like a good idea and facilitates bi-directional communication. However, I'm wondering whether it's as stable as the MySQL example?

4. In the last example, maybe use InfluxDB (http://influxdb.com/) instead of MongoDB.
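
Here's roughly what the MySQL push (option 2) could look like on the Spark side: the firmware POSTs each reading to a small server-side script that does the actual INSERT. The host and `insert.php` endpoint are placeholders of mine, not taken from the hackster project.

```cpp
// Rough sketch of option 2: POST each reading to a server-side script
// (insert.php is a placeholder) that writes the row into MySQL.
TCPClient client;

void postReading(int level) {
    if (client.connect("example.com", 80)) {  // placeholder host
        String body = "value=" + String(level);
        client.println("POST /insert.php HTTP/1.1");
        client.println("Host: example.com");
        client.println("Content-Type: application/x-www-form-urlencoded");
        client.println("Content-Length: " + String(body.length()));
        client.println("Connection: close");
        client.println();
        client.print(body);
        client.stop();
    }
}
```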

Any more thoughts on this?

Thanks!

I’ve been using Graphite/Carbon + StatsD for a while. It stores the metrics to disk. The graphing and calculation capabilities of Graphite are utterly amazing. I’ve even written a simple spark-carbon-library to push metrics from your Core/Photon directly into Carbon. (I should probably do one for StatsD as well).
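
For anyone curious, the protocol under the hood is trivial: Carbon's plaintext listener takes one line per metric over TCP port 2003. A minimal no-library sketch (the host name below is a placeholder):

```cpp
// Minimal sketch of talking to Carbon's plaintext listener directly.
TCPClient carbon;

void pushToCarbon(const char* path, int value) {
    if (carbon.connect("graphite.local", 2003)) {  // 2003 = Carbon plaintext port
        // One line per metric: "<path> <value> <unix-timestamp>\n"
        carbon.print(path);
        carbon.print(" ");
        carbon.print(value);
        carbon.print(" ");
        carbon.print(Time.now());  // epoch seconds, synced from the Cloud
        carbon.print("\n");
        carbon.stop();
    }
}
```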

The StatsD part is optional (but still available). The main thing I like about StatsD is that it can do “gauges” that will store the last known value of a metric and submit it every X seconds to Carbon so that you can have a continuous line graph instead of a bunch of dots.
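
For what it's worth, a gauge is easy to emit straight from firmware as well; StatsD's line format for a gauge is just `name:value|g` sent over UDP. A sketch, with a placeholder StatsD address (8125 is StatsD's default port):

```cpp
// Sketch of submitting a StatsD gauge from the firmware over UDP.
UDP udp;

void sendGauge(const char* name, int value) {
    IPAddress statsd(192, 168, 1, 10);  // placeholder address
    String packet = String(name) + ":" + String(value) + "|g";
    udp.begin(8888);  // arbitrary local port
    udp.beginPacket(statsd, 8125);
    udp.write((const unsigned char*)packet.c_str(), packet.length());
    udp.endPacket();
    udp.stop();
}
```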

Another option might be a Redis database. Someone wrote a Redis library a while back, but it hasn't been ported to the web IDE. I haven't used it personally, so I can't offer much feedback beyond that.

I tried InfluxDB in the past and wasn't pleased with how poorly it scaled once it had tens (or was it hundreds?) of thousands of records. I probably wasn't doing any sort of proper roll-up of old stats, so my opinion here may be tainted by my own lack of effort. :wink:

There’s also SparkFun’s Phant to store data. There are a few different libraries available in the web IDE that may help: PietteTech_Phant, phant, Spark_Phant_Library. I haven’t used Phant or the libraries, so some extra leg work may need to be done to evaluate all of the options.

I'm personally planning on sending all my metric data from my Cores/Photons using Spark.publish() and writing a listener/relay in Node.js that watches for those messages and relays them into Graphite. That way, whenever I add a new device, there's zero additional server-side configuration to pick up the new metrics. And maybe I can write a plug-in for Freeboard that loads historical data and plots new incoming metrics in real-time.
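
The firmware side of that plan would be something like the sketch below; the `metric` event name and `name:value` payload convention are just what I have in mind, not anything standardized:

```cpp
// Publish a generic "metric" event whose payload carries the metric name and
// value. The relay can prepend the device ID to build the Graphite path
// (e.g. "devices.<deviceID>.soundlevel"), so new devices need no server config.
void publishMetric(const char* name, int value) {
    Spark.publish("metric", String(name) + ":" + String(value), 60, PRIVATE);
}
```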

I’ve been researching this stuff off and on for a while and still haven’t found an easy solution that satisfies all my needs (fast, cheap, high-resolution, and no hard caps on metrics or storage).

This looks amazing, thanks for the writeup.

I’m using dataloop.io for my monitoring at the moment, so they might be a solution to your data storage requirements?

I've just received my Electron through the post, so if I get a chance I'll try and get your carbon library working with that over the next few weeks :slight_smile:

Just found this a few days ago; it works pretty well:
http://ubidots.com/docs/devices/particlePhoton.html

@wgbartley, thought I'd see if you had any updates on your methods since you posted this?

Your post gave me a TON of ideas. Thanks!
