One of my Photon projects gathers data from a number of different sources on a schedule. It then knits the data into JSON strings, which are exposed as Particle Variables. The JSON is downloaded, parsed, and displayed via HTML on my computer and cell phone. When the Photon rebooted, it would take a while for the data to repopulate, and that was annoying. I also wanted to log the data locally.
I recently came across a solution that doesn’t have a write-life limit and that can handle a large data set … so this is how things work today:
When my Photon updates a Particle Variable, it now publishes an MQTT message
- where the MQTT topic is the Particle Variable’s name.
- where the MQTT message is the JSON string.
- where the MQTT retain flag is set to true.
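The topic/payload/retain mapping above can be sketched in plain Python. This is only an illustration of the convention, not the Photon's actual firmware; the `variable_to_mqtt` name and the `"weather"` example data are mine:

```python
import json

def variable_to_mqtt(var_name, data):
    """Map a Particle Variable to an MQTT (topic, payload, retain) triple.

    Per the scheme in the post: the topic is the variable's name, the
    payload is the JSON string, and the retain flag is true so the broker
    keeps the last message for each topic.
    """
    topic = var_name            # topic == Particle Variable name
    payload = json.dumps(data)  # message == the JSON string
    return topic, payload, True # retain flag set to true

topic, payload, retain = variable_to_mqtt("weather", {"tempF": 72.4})
print(topic, retain)  # weather True
```

The actual publish call would hand this triple to whatever MQTT client library the firmware uses.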
On a reboot:
- my Photon subscribes to the same MQTT topics that it publishes.
- the MQTT broker automatically republishes the last message for each topic, because the retain flags were set.
- the Photon receives the messages, refreshes the Particle Variables, and unsubscribes before resuming normal operation.
- a Raspberry Pi on my home network subscribes to the same topics and logs them to a network drive.
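The boot-time restore can be modeled as a small state machine: subscribe, absorb the broker's retained replay, and unsubscribe once every expected topic has been seen. This is a minimal sketch under that assumption (the class and method names are mine, and the broker's redelivery is simulated by direct calls):

```python
class RetainedRestore:
    """Simulates the Photon's reboot sequence: the broker redelivers the
    last retained message per subscribed topic; once all Particle
    Variables are repopulated, the device unsubscribes and resumes."""

    def __init__(self, topics):
        self.pending = set(topics)   # topics still awaiting a retained message
        self.variables = {}          # Particle Variable name -> JSON string
        self.subscribed = True       # subscribed on boot

    def on_message(self, topic, payload):
        # Each retained message refreshes one Particle Variable.
        if topic in self.pending:
            self.variables[topic] = payload
            self.pending.discard(topic)
        if not self.pending:
            self.subscribed = False  # all restored: unsubscribe, resume normal operation

r = RetainedRestore(["weather", "power"])
r.on_message("weather", '{"tempF": 72.4}')
r.on_message("power", '{"watts": 310}')
print(r.subscribed)  # False: all variables repopulated
```

In real firmware, `on_message` would be the MQTT client's receive callback, and "unsubscribe" would be an actual unsubscribe call to the broker.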
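On the Raspberry Pi side, the logging step might look like the function below. In practice it would be wired into an MQTT client library's message callback (e.g. paho-mqtt's `on_message`); the function name, line format, and timestamp handling are my assumptions:

```python
import datetime

def log_message(logfile, topic, payload, when=None):
    """Append one timestamped line per received MQTT message to a log
    file (the post logs to a network drive; any path works)."""
    when = when or datetime.datetime.now(datetime.timezone.utc)
    line = f"{when.isoformat()} {topic} {payload}\n"
    with open(logfile, "a") as f:
        f.write(line)
    return line
```

Because the messages are retained, the logger also catches up on the latest value per topic whenever the Pi itself restarts and resubscribes.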
Now, nothing is lost, I have logging, and I’ve added a control topic to make sure the Photon knows where it was in the collection schedule … in case of a long power outage.