Hi everyone, and thank you Spark Particle for a great product.
I am currently working on a project where a Photon is connected to multiple sensors and actuators, and I am using Particle.publish() to send sensor data at a predetermined interval. I have also implemented a fallback where data is written to files on a memory card whenever the internet connection is unavailable (I estimate that a day generates about 15 kB of data), and this is where our problem lies: when we detect that connectivity is available again, we would like to send the stored data to Azure as quickly as possible. What strategy would be the best solution for this?
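For context, here is a rough sketch of how I pack the buffered records back into publish-sized payloads when connectivity returns. This is plain C++ (not the actual firmware), and the record format and helper name are just illustrative assumptions; it assumes each individual record fits within one payload.

```cpp
#include <cassert>
#include <string>
#include <vector>

// 622 bytes is the Particle.publish() payload limit; records are
// newline-separated lines as read back from the memory card.
constexpr size_t kMaxPayload = 622;

// Greedily pack as many records as fit into each payload,
// separated by newlines, without exceeding kMaxPayload.
std::vector<std::string> packRecords(const std::vector<std::string>& records) {
    std::vector<std::string> payloads;
    std::string current;
    for (const auto& rec : records) {
        // +1 accounts for the newline separator when appending
        size_t needed = rec.size() + (current.empty() ? 0 : 1);
        if (!current.empty() && current.size() + needed > kMaxPayload) {
            payloads.push_back(current);   // payload full, start a new one
            current.clear();
        }
        if (!current.empty()) current += '\n';
        current += rec;
    }
    if (!current.empty()) payloads.push_back(current);
    return payloads;
}
```

Each payload that comes out of this would then be handed to Particle.publish() one at a time, which is exactly where the rate limits below start to hurt.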
Given the rate limits of Particle.publish() (one publish per second on average, with bursts of up to four allowed), combined with the 622-byte payload size limit, this doesn't seem like an appropriate solution because of the time it would take to publish a day's, a week's, or even a month's worth of data.
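To put rough numbers on it, here is my back-of-envelope estimate of the minimum drain time, assuming my ~15 kB/day figure, perfectly packed 622-byte payloads, and the sustained one-publish-per-second limit (so this is a best case, with no headroom for retries or live data):

```cpp
// Minimum time (in seconds) to drain a backlog through Particle.publish(),
// assuming every payload is packed to the full 622 bytes and the
// sustained rate limit of one publish per second holds.
long minDrainSeconds(long backlogBytes) {
    const long maxPayload = 622;                                    // publish payload limit
    long publishes = (backlogBytes + maxPayload - 1) / maxPayload;  // ceiling division
    return publishes;                                               // one publish per second
}
```

By this estimate a single day (~15 kB) would need about 25 publishes, and a month (~450 kB) about 724, i.e. roughly twelve minutes of continuous back-to-back publishing under ideal conditions.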
All suggestions are greatly appreciated!