[Solved] Long run datalogger

Hello everyone!

I need your help with a datalogger I am building with a Photon for my internship:

This datalogger measures some sensor values 10 times per hour (so one measurement is published every 6 minutes).
The specification says that the console log will be used to visualize the latest sensor measurement via the Particle.publish() command.
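
For context, the publishing side looks roughly like this (a minimal sketch; the pin and event name are placeholders, not my real code):

```cpp
// Minimal sketch of the 6-minute publish loop described above.
// The pin (A0) and event name ("sensor_reading") are placeholders.
const unsigned long PUBLISH_INTERVAL_MS = 6UL * 60UL * 1000UL; // 10 publishes/hour

unsigned long lastPublish = 0;

void setup() {
}

void loop() {
    if (millis() - lastPublish >= PUBLISH_INTERVAL_MS) {
        lastPublish = millis();
        int reading = analogRead(A0); // stand-in for the real sensor read
        // publish() returns false if the device is not cloud-connected
        Particle.publish("sensor_reading", String(reading), PRIVATE);
    }
}
```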

But the console only displays events that ARE being published, not events that WERE published while the console was not open.

I saw in the docs (https://docs.particle.io/reference/firmware/electron/#particle-publish-):

Calling Particle.publish() when the device is not connected to the cloud will not result in an event being published. This is indicated by the return success code of false.
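
If I read that correctly, the success code can be checked like this (a sketch, assuming a single int reading; the event name is a placeholder):

```cpp
// Sketch: skip the publish while offline and report the success code
// the docs mention (event name is a placeholder).
bool publishReading(int reading) {
    if (!Particle.connected()) {
        return false; // Particle.publish() would return false here anyway
    }
    return Particle.publish("sensor_reading", String(reading), PRIVATE);
}
```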

But my events are actually published, since my Photon is connected to the cloud; the problem is just that my console is not open when the latest event is posted.

So, my question is: can I display the last event (or events) that were published while my console was closed, when I open the console log?

Your answers will help me a lot... Thank you!

Nope, there’s no caching of published events on the :cloud: side.

You will need to store it elsewhere.
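
One device-side option, as a sketch: keep the latest value in a cloud variable (and in emulated EEPROM so it survives a reset), then read it on demand whenever you open the console. The variable and event names here are placeholders.

```cpp
// Sketch: expose the latest reading as a cloud variable and persist it
// in emulated EEPROM so it survives a reboot (names are placeholders).
int lastReading = 0;

void setup() {
    EEPROM.get(0, lastReading);                    // restore after a reboot
    Particle.variable("lastReading", lastReading); // readable any time via the cloud
}

void recordReading(int reading) {
    lastReading = reading;
    EEPROM.put(0, lastReading);
    Particle.publish("sensor_reading", String(reading), PRIVATE);
}

void loop() {
    // call recordReading() on the 6-minute schedule
}
```

Then `particle get <device> lastReading` (or the Variables panel in the console) returns the last stored value even when no event is in flight.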

I thought about publishing the measurement every 5 seconds but only saving it every 6 minutes…

I saw the getEventStream command, but I do not think it is suited to this use case.

However, is there a way to detect when the console is open, so I can publish the last sensor reading at that moment?

Send the sensor data to Ubidots; it will be stored there, and then you can create a dashboard and display that data in graphs.
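
For example (a hedged sketch: it assumes you have created a webhook in the Particle console, here called `ubidots`, that forwards the JSON payload to Ubidots):

```cpp
// Sketch: publish a JSON payload for a console webhook (assumed to be
// named "ubidots") that forwards the value to Ubidots.
void sendToUbidots(int reading) {
    String payload = String::format("{\"value\":%d}", reading);
    Particle.publish("ubidots", payload, PRIVATE);
}
```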

So I discussed this with my boss, and he decided that we will use a Google Drive spreadsheet to log all the data: http://www.instructables.com/id/Datalogging-with-Spark-Core-Google-Drive/

Thanks anyway for the answers; they could help future users.
