Sending large amounts of data to Azure IoT Hub

Hi everyone, and thank you Spark Particle for a great product.

I am currently working on a project where the Photon is connected to multiple sensors and actuators, and I am using the Particle.publish() method to send sensor data at a predetermined interval. I have also implemented a fallback where data is written to files on a memory card when the internet is unavailable (I estimate that a day will generate about 15 kB of data), and this is where our problem is. When we notice that internet connectivity is available again, we would like to send the stored data to Azure as fast as possible. What strategy would be the best solution for this?

Given the limitations of Particle.publish() (one publish per second, with bursts of up to four), combined with the 622-byte size limit of a STRING variable, this doesn't seem like an appropriate solution: it would take far too long to publish a day's, a week's, or even a month's worth of data.

All suggestions are greatly appreciated!

The payload limit for a publish is 255 bytes, so if the device were doing nothing else, you could publish a month's worth of data in about 30 minutes in theory?
I guess you could bypass the Particle cloud and upload it as a file over HTTPS, and then have an app at the other end that fires the contents at the IoT Hub, or directly into your storage?
I have no idea what the restrictions are in that case.
Equally, I guess it could fire the data a line at a time at a PHP script or something.

Why not just send the data in 255-byte chunks until all the memory card data has been sent?
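Something like this minimal sketch could do the splitting, assuming the stored log has been read back into a buffer (the function name and the use of std::string are just for illustration; on the device each chunk would then go out via Particle.publish()):

```cpp
#include <string>
#include <vector>

// Split a buffered sensor log into publish-sized chunks.
// 255 bytes is the Particle.publish() payload limit mentioned above.
std::vector<std::string> chunkData(const std::string& data, size_t chunkSize = 255) {
    std::vector<std::string> chunks;
    for (size_t pos = 0; pos < data.size(); pos += chunkSize) {
        chunks.push_back(data.substr(pos, chunkSize));  // last chunk may be shorter
    }
    return chunks;
}
```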

Both are legitimate suggestions. However, while sending all the data in 255-byte chunks, the Photon would be busy doing just that and unable to continue gathering data from the sensors. Reads need to happen at least once every 30-300 seconds (the exact minimum time between reads is not yet decided).

From the information I've been able to gather here on the forums, the HTTPS library cannot accommodate my needs as of now. MQTT might just be the way to go. Any thoughts?

That depends on how fast you need to read your sensors and how long your "predetermined interval" is. A single publish happens in a split second, and in between two publishes you can still do other jobs; you don't really need to hammer them out at max rate, all at once.
You can even have the publishing happen asynchronously, by checking for any unpublished data in your backlog each time through the loop.
So with your 30-300 second schedule, I don't see why that couldn't be dealt with, as long as your WiFi connection is available the majority of the time.
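A minimal sketch of that backlog idea, assuming a queue of chunks read back from the SD card and a stand-in publishChunk() in place of the real Particle.publish() call (all names here are hypothetical, not Particle API):

```cpp
#include <deque>
#include <string>

// Stand-in for the real publish; on the Photon this would be
// Particle.publish("backlog", chunk, PRIVATE), guarded by Particle.connected().
bool publishChunk(const std::string& chunk) {
    (void)chunk;   // pretend the publish always succeeds in this sketch
    return true;
}

std::deque<std::string> backlog;   // chunks read back from the memory card
unsigned long lastPublishMs = 0;

// Called once per loop() pass: sends at most one chunk per second, so the
// sensor-reading code in the rest of loop() keeps running in between.
void drainBacklog(unsigned long nowMs, bool connected) {
    if (!connected || backlog.empty()) return;
    if (nowMs - lastPublishMs < 1000) return;   // respect the 1 publish/sec limit
    if (publishChunk(backlog.front())) {
        backlog.pop_front();                    // only discard once it went out
    }
    lastPublishMs = nowMs;
}
```

The point of draining one chunk per loop pass is that sensor reads and publishing interleave naturally, instead of the device blocking for minutes while it flushes the whole card.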

MQTT is available, but (for now) only unencrypted between your device and the broker.
