Collecting and storing data from the cloud through SSE/Spark.publish

I’m currently working on a project that puts a Core into deep sleep for 15 minutes, then wakes it up to take a reading and send it to the Particle server via Spark.publish. My goal is to collect the data in a manner that scales to more devices, so a simple dashboard view of device readings doesn’t exactly make sense. I need better visuals, whether on a map or some other chart where a large number of devices can be seen and checked on at a glance.
The problem with using Spark variables and sending GET requests to the Core is that the Core is only awake for a short period, and the timing isn’t consistent because sometimes most of that period is spent connecting to WiFi and the cloud. Is there a cloud-based service or site out there that anyone has had experience with that could handle a situation like this (using SSE to receive and log data from published events)? I’ve spent a lot of time looking for a solution that isn’t way over my head (I’m not a computer scientist or professional programmer by any means) and haven’t come up with anything. If anyone has a similar project to mine, I’d love to hear about it and how you got it working, especially on the web side.
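To make the SSE idea concrete, here is a rough sketch (in Python, under stated assumptions) of what a server-side logger for published events could look like. The endpoint URL and access token are placeholders, and the parsing is simplified to the `event:`/`data:` line pairs the Spark stream emits; treat it as a starting point rather than a finished implementation.

```python
import csv
import json
import urllib.request


def parse_sse_event(lines):
    """Parse one SSE event (a list of 'field: value' lines) into a dict.

    The Spark event stream sends an 'event:' line naming the published
    event, followed by a 'data:' line whose value is a JSON object with
    the payload string, timestamp, and core ID.
    """
    event = {}
    for line in lines:
        if line.startswith("event:"):
            event["name"] = line[len("event:"):].strip()
        elif line.startswith("data:"):
            event.update(json.loads(line[len("data:"):].strip()))
    return event


def log_event(event, path="readings.csv"):
    """Append one parsed event to a CSV file, one row per reading."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([event.get("published_at"),
                                event.get("coreid"),
                                event.get("name"),
                                event.get("data")])


def stream_events(url):
    """Read the SSE stream and log each complete event.

    A blank line marks the end of one SSE event. The URL (including the
    access token) is a placeholder; check the Spark docs for the real
    endpoint for your account.
    """
    with urllib.request.urlopen(url) as stream:
        buffer = []
        for raw in stream:
            line = raw.decode("utf-8").rstrip("\r\n")
            if line:
                buffer.append(line)
            elif buffer:
                log_event(parse_sse_event(buffer))
                buffer = []
```

Since the Core sleeps most of the time, a listener like this would run on an always-on machine (a cheap VPS or a Raspberry Pi), catching the burst of events whenever a device wakes and publishes.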

Hi @mwbrady

In addition to the Spark dashboard, have you seen my tutorial on multiple cores using publish?

This is a great way to get started with your published events, since each device gets its own row in a table, like in a spreadsheet. It makes debugging easy.

Ask questions! Folks are here to help!

I actually hadn’t seen that post yet. It looks like it could be a pretty good temporary solution for me. It sounds like it updates dynamically, though, so there wouldn’t be a log of events at all. I’m really looking to visualize the data in an intuitive way, because a plain list of devices showing sensor readings will end up messy and take longer to analyze.
I’m really looking for a way to log data in real time so it can feed a web app that reads from a spreadsheet, syncing at set intervals. That, or a cloud API that turns the published data directly into a visual, so the user isn’t just staring at numbers. My original thought was to associate each device’s ID with a GPS location and then, as readings came through the cloud, build a kind of heat map from each device’s quantitative data. One site I looked at, CartoDB, seemed to have capabilities for this, but either I’m wrong or I just don’t have the knowledge to implement it.
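The heat-map idea above mostly comes down to joining readings with coordinates and exporting a format a mapping tool understands. As a sketch, here is how logged readings could be turned into GeoJSON, which CartoDB and most mapping tools can import and then color or size points by a numeric property. The device-to-location table here is hypothetical; in practice it would live in your own database or config.

```python
# Hypothetical lookup of device ID -> (longitude, latitude).
DEVICE_LOCATIONS = {
    "core_a": (-71.06, 42.36),
    "core_b": (-71.09, 42.34),
}


def readings_to_geojson(readings):
    """Turn {device_id: latest_reading} into a GeoJSON FeatureCollection.

    Each device becomes a Point feature carrying its reading as a
    property, so the map layer can style points by value for a
    heat-map-like view of the fleet.
    """
    features = []
    for device_id, value in readings.items():
        lon, lat = DEVICE_LOCATIONS[device_id]
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {"device": device_id, "reading": value},
        })
    return {"type": "FeatureCollection", "features": features}
```

Writing this dict out with `json.dump` gives a file that can be uploaded to a CartoDB table directly, or regenerated on a schedule as new readings arrive.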
I’m now realizing that pushing data from the Core to the cloud and getting involved with servers and APIs is a whole different animal compared to the relative simplicity of the Spark firmware.

Hi @mwbrady

I think that webhooks will be the way to go for you. Here’s the doc:

This will allow you to post your data to a database which your web app would then query.
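To illustrate the "post to a database, then query it" flow, here is a minimal sketch using SQLite from Python. The payload shape assumes the fields a webhook would forward from a published event (timestamp, core ID, event name, data); the table layout and function names are just illustrative.

```python
import sqlite3


def open_db(path=":memory:"):
    """Create (or open) a SQLite database with one table of readings."""
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS readings (
                        published_at TEXT,
                        coreid TEXT,
                        event TEXT,
                        data TEXT)""")
    return conn


def store_event(conn, payload):
    """Insert one webhook payload (a dict parsed from the POSTed JSON)."""
    conn.execute("INSERT INTO readings VALUES (?, ?, ?, ?)",
                 (payload.get("published_at"), payload.get("coreid"),
                  payload.get("event"), payload.get("data")))
    conn.commit()


def latest_per_device(conn):
    """What a web app might query: the most recent reading per device.

    Note: selecting bare columns alongside MAX() and relying on them to
    come from the max row is a SQLite-specific behavior.
    """
    return conn.execute(
        """SELECT coreid, data, MAX(published_at)
           FROM readings GROUP BY coreid""").fetchall()
```

The `store_event` call would sit inside whatever small web endpoint receives the webhook's POST; the web app then only ever talks to the database, so it doesn't matter that the Core itself is asleep most of the time.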


Do you know of any sites with good tutorials/documentation on webhooks in conjunction with databases that might be helpful or are there some novice-friendly web app development sites that I could start digging through? I think I understand the basic use of webhooks but going out and implementing them with a database and app is just beyond the edge of my coding experience.

I’m trying to create a product that publishes sensor data to a service like PubNub, which offers a wide range of opportunities for data manipulation and analysis, but I’m having trouble figuring out how to get this working with the Spark Core. I know this can be done with an Arduino and a WiFi shield, and there are tutorials on PubNub’s site and other database sites showing how, but I’m somewhat stubbornly trying to get it working with the Spark Core instead. For one thing, the Core is a lot less expensive and smaller than the Arduino, and the firmware seems simpler because internet connectivity is already built in. Does anyone have suggestions for how to do this before I switch platforms?
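One avenue worth checking: PubNub exposes a plain HTTP REST endpoint for publishing, so the Core could publish with an ordinary GET request (via TCPClient or the HTTP library) rather than a native SDK. As a sketch, the helper below shows how that request URL is assembled in Python; the path format follows PubNub's documented REST publish API, but the hostname and auth details should be verified against their current docs, and the keys here are placeholders.

```python
import json
import urllib.parse


def pubnub_publish_url(pub_key, sub_key, channel, message):
    """Build PubNub's REST publish URL for a JSON message.

    Assumed path format (check PubNub's docs):
    /publish/<pub-key>/<sub-key>/0/<channel>/0/<url-encoded JSON>
    """
    payload = urllib.parse.quote(json.dumps(message), safe="")
    return ("http://pubsub.pubnub.com/publish/"
            f"{pub_key}/{sub_key}/0/{channel}/0/{payload}")
```

On the Core, the same string would be built with `sprintf` or `String` concatenation and sent as the request path; the point is only that no PubNub library is required if their REST interface is reachable over plain HTTP.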


Here is an example you can use as a starting point: see the link for Logging temperature data using the Spark Core. It uses Ubidots, a cloud service for storing data from sensors. I don’t know how PubNub compares with Ubidots, but it seems like a good place to start. The project measures temperature and posts it to the Ubidots web page.

Keep in mind that the project above uses the HTTP library rather than the newer webhooks. It might be worthwhile giving both a shot to see which you prefer. The Particle docs linked to earlier contain a webhook example as well. Depending on the service you’re trying to use, the API endpoints and the way you structure your data might need to be adjusted, but the principle behind it stays the same, so do try the docs example :smile:
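For reference, a webhook definition is just a small JSON file registered with the cloud. Something along these lines, where the event name and target URL are placeholders and the exact field names should be double-checked against the docs example mentioned above:

```json
{
  "event": "temperature",
  "url": "https://example.com/ingest",
  "requestType": "POST",
  "mydevices": true
}
```

Once registered, the hook fires whenever one of your devices publishes an event whose name matches `event`, and POSTs the payload to `url`, which is where the database insert discussed earlier in the thread would happen.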