High-density Data to Google Cloud Platform

I am using a Particle Photon to build a device that takes sensor readings between 10 and 100 times per second for up to 30 minutes, and I would like to store this data on Google Cloud Platform. I have followed the Particle tutorials to create a webhook integration that takes a single sensor reading published with Particle.publish() and brings it into Google Cloud Platform through Google Cloud Pub/Sub. This is working fine. However, I understand that I can only publish a piece of data about once per second using Particle.publish().
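For reference, the firmware side of that single-reading approach looks roughly like this (simplified; readSensor() and the "sensor_reading" event name are stand-ins for my actual code, and the Google Cloud Platform integration is configured separately in the Particle console to forward the event to Pub/Sub):

```cpp
#include "Particle.h"

// Simplified sketch of the single-reading publish described above.
// readSensor() and the event name are placeholders.

int readSensor() {
    return analogRead(A0);  // stand-in for the real sensor read
}

void setup() {
}

void loop() {
    char payload[64];
    snprintf(payload, sizeof(payload), "{\"value\":%d}", readSensor());

    // Forwarded to Google Cloud Pub/Sub by the integration set up in the console
    Particle.publish("sensor_reading", payload, PRIVATE);

    delay(1000);  // roughly one publish per second
}
```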

What would be the best way for me to get this high-density data into Google Cloud Platform? Thank you in advance for your help!

I don’t have experience with Google Pub/Sub, so I don’t know whether it is possible to get data into it other than through the Particle.publish() route.

If that is the only way to get the data in, I don’t know whether 10-100 readings per second is doable within the limitations of Particle.publish(); it depends on what your data looks like and whether you can combine data points on the device and disaggregate them on the Google side. The data payload for a Particle.publish() can be 255 bytes long, so if each data point is under about 25 bytes, you should be able to send 10 points per second by packing them into a single publish. Getting up to 100 points per second would require some form of compression, such as delta encoding (sending only the difference between consecutive readings) or run-length encoding (sending a repeated reading once along with how many times it repeats). It would be helpful to know what your data looks like to advise you further.
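As a rough, untested illustration of the "pack ~10 points per publish" idea (the comma-separated format, the "sensor_batch" event name, and readSensor() are all placeholders you would adapt to your data), something like this on the Photon would publish one batched event per second:

```cpp
#include "Particle.h"

// Rough sketch: sample at 10 Hz, pack 10 readings into one comma-separated
// payload (well under the 255-byte publish limit), and publish once per second.
// readSensor() and the "sensor_batch" event name are placeholders.

const int READINGS_PER_PUBLISH = 10;
const unsigned long SAMPLE_INTERVAL_MS = 100;  // 10 samples per second

char payload[256];
size_t payloadLen = 0;
int sampleCount = 0;
unsigned long lastSample = 0;

int readSensor() {
    return analogRead(A0);  // stand-in for the real sensor read
}

void setup() {
    payload[0] = '\0';
}

void loop() {
    if (millis() - lastSample >= SAMPLE_INTERVAL_MS) {
        lastSample = millis();

        // Append this reading, comma-separated, to the pending payload
        payloadLen += snprintf(payload + payloadLen, sizeof(payload) - payloadLen,
                               "%s%d", sampleCount ? "," : "", readSensor());
        sampleCount++;

        if (sampleCount >= READINGS_PER_PUBLISH) {
            // One publish per second, carrying 10 data points
            Particle.publish("sensor_batch", payload, PRIVATE);
            payload[0] = '\0';
            payloadLen = 0;
            sampleCount = 0;
        }
    }
}
```

On the Google side, a small Cloud Function subscribed to the Pub/Sub topic could split each batched message back into individual readings before writing them to storage.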
