I'm logging sensor data on the Boron. A sample consists of 90 individual readings taken 1 second apart, and each reading is time-stamped, for a total of 1528 bytes per sample. I want to sample once per hour but only upload the data every 6 hours. I can store the data on the Boron in FRAM, but I'm not sure how to get it to Google Sheets. I'd like to use webhooks if possible, but I'm not sure (1) whether that's the best way to do it and (2) how to assemble the batches to work with webhook integrations and the receiving Google Apps Script. I'm open to other ideas as well. I appreciate the community's ideas and help.
You will need something to unpack the data you've batched up, since each event will contain multiple samples. And because the data will exceed the 1024-byte payload limit per event, each upload will take multiple events. Be sure to take into account how many data operations this requires, because that affects your cost.
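As a rough sketch of that arithmetic, using only the numbers stated above (1528 bytes per sample, one sample per hour, upload every 6 hours, 1024-byte event payload) and ignoring any JSON overhead, which will push the real count higher:

```javascript
// Back-of-the-envelope data-operations estimate (numbers from the post above).
const bytesPerSample = 1528;    // 90 time-stamped readings
const samplesPerUpload = 6;     // one sample per hour, upload every 6 hours
const maxEventPayload = 1024;   // per-event publish payload limit

const bytesPerUpload = bytesPerSample * samplesPerUpload;            // 9168
const eventsPerUpload = Math.ceil(bytesPerUpload / maxEventPayload); // 9 at best
const eventsPerDay = eventsPerUpload * 4;                            // 4 uploads/day

console.log(eventsPerUpload, eventsPerDay); // 9 events per upload, 36 per day minimum
```

So even with perfect packing, that's on the order of 36 data operations per day before counting formatting overhead.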
The webhook itself cannot easily do that, but there are two common ways:
Using Logic on the Particle side: the Logic block is triggered by the event, and its JavaScript code can expand the payload to write multiple rows to Google Sheets. The traffic between Logic and Google has fewer restrictions and does not count as data operations.
Using Google Cloud Functions on the Google side: you can do the unpacking there. The advantage is that a Cloud Function can directly access the Google Sheets JavaScript API, which is an easier way to manipulate sheets than the REST API. The downside is that there may be a charge for this, though it will probably be small.
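Whichever side does the unpacking (a Logic block, a Cloud Function, or an Apps Script web app), the core transform is the same. Here's a minimal sketch, assuming a hypothetical batch shape of a JSON array of samples with `ts` (epoch seconds of the first reading) and `vals` (the readings) — those field names are made up for illustration, not part of any Particle or Google API:

```javascript
// Turn one batched payload into sheet rows.
// Assumed payload shape (hypothetical): [{ts: <epoch>, vals: [r0, r1, ...]}, ...]
function unpackBatch(body) {
  const samples = JSON.parse(body);
  const rows = [];
  for (const s of samples) {
    s.vals.forEach((v, i) => {
      // one row per reading; readings are 1 second apart
      rows.push([new Date((s.ts + i) * 1000), v]);
    });
  }
  return rows;
}

// In a Google Apps Script web app, the endpoint would look roughly like:
// function doPost(e) {
//   const rows = unpackBatch(e.postData.contents);
//   const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheets()[0];
//   rows.forEach((r) => sheet.appendRow(r));
//   return ContentService.createTextOutput("ok");
// }
```

Keeping `unpackBatch` pure (no sheet calls) makes it easy to test outside of Apps Script.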
Thanks, Rick.
Last night I successfully created 5-event batches using Particle webhooks that I could unpack in Google Sheets. I handled all the JSON formatting in firmware, so the webhook's custom JSON format was simply:
{{{PARTICLE_EVENT_VALUE}}}
However, each event was 98 bytes. I can optimize that down to 56 bytes, so I can pack 10 events into a batch without exceeding the batch size limit. That's still an improvement.
I will look at using Logic, though, because more than half of the bytes in an event are JSON formatting overhead.
Logic is particularly useful for expanding JSON. While the input is JSON, you can shrink the key names down to a single letter to save space in the publish. The Logic block can then expand the key names back to full size before sending to the cloud service, and it is not limited to a 1024-byte payload.
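For example, the key-expansion step in a Logic function could look like this (the short keys and their full names here are hypothetical — use whatever fields you actually publish):

```javascript
// Expand single-letter keys (used on the device to save publish bytes)
// back to full names before forwarding to the cloud service.
// This mapping is illustrative only, not a real schema.
const KEY_MAP = { t: "timestamp", v: "value", b: "battery" };

function expandKeys(obj) {
  const out = {};
  for (const [k, val] of Object.entries(obj)) {
    out[KEY_MAP[k] || k] = val; // unknown keys pass through unchanged
  }
  return out;
}

// expandKeys({t: 1700000000, v: 23.5})
//   → {timestamp: 1700000000, value: 23.5}
```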
What if you got rid of JSON for device-to-Logic messages in favor of comma-separated values or similar? That way you might be able to pack a lot more into one message.
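A sketch of that idea, assuming a made-up wire format of `timestamp,value` pairs separated by semicolons — the device publishes bare values and the Logic block rebuilds full JSON for the downstream service:

```javascript
// Device publishes e.g. "1700000000,23.5;1700000001,23.6"
// (timestamp,value pairs separated by ';') — far less overhead than JSON.
// The Logic block then rebuilds full JSON records for the cloud service.
function csvToRecords(payload) {
  return payload.split(";").filter(Boolean).map((pair) => {
    const [ts, value] = pair.split(",");
    return { timestamp: Number(ts), value: Number(value) };
  });
}

// csvToRecords("1700000000,23.5;1700000001,23.6")
//   → [{timestamp: 1700000000, value: 23.5},
//      {timestamp: 1700000001, value: 23.6}]
```

At roughly 13 bytes per pair instead of ~30+ for a JSON object, this could double what fits in a single publish.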
Best,