Sending data to google sheets using IFTTT

Hey, I am trying to send data to Google Sheets using a webhook. I have set up the IFTTT Particle applet, which receives event data in JSON format and sends it to the sheet. However, it is not parsing the JSON; instead it adds the whole payload to a single cell, like this:

{"sensorOne": 0.2, "sensorTwo": 0.2, "sensorThree": 0.0, "sensorFour": 0.0, "sensorFive": 0.0}.

I want each field to become its own column with the value underneath, e.g. Sensor One | 0.2, and so on.

Can you please suggest how to do this, and whether there is a way to do it without IFTTT?
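For what it's worth, the transformation I'm after is just flattening the JSON into a header row plus a value row. Sketched in Python (the field names come from the payload above):

```python
import json

# The raw payload from above, exactly as it lands in one cell
payload = ('{"sensorOne": 0.2, "sensorTwo": 0.2, "sensorThree": 0.0, '
           '"sensorFour": 0.0, "sensorFive": 0.0}')

def payload_to_row(raw):
    """Split a JSON payload into a header row and a value row for a sheet."""
    data = json.loads(raw)            # dict preserves the payload's key order
    columns = list(data.keys())
    values = [data[k] for k in columns]
    return columns, values

columns, values = payload_to_row(payload)
print(columns)  # ['sensorOne', 'sensorTwo', 'sensorThree', 'sensorFour', 'sensorFive']
print(values)   # [0.2, 0.2, 0.0, 0.0, 0.0]
```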

Secondly, in the IFTTT applet you have to select one particular device as the event source, but in my use case there are many devices. How do I address this?

Last question: my end goal is to create a report in Google Data Studio, so is there a better way to get data there apart from Google Sheets?

Thanks.

I would not use IFTTT to get data into Google Sheets; it's slow, and I have found it to be unreliable.

This method is harder to set up initially, but works great:

https://docs.particle.io/datasheets/app-notes/an011-publish-to-google-sheets/

For larger quantities of data than you can reasonably put in a spreadsheet I’d use this technique instead:

https://github.com/rickkas7/google_cloud_tutorial#google-cloud-functions-to-datastore-example

That particular example uses Google Cloud Datastore, which I'm not sure works with Data Studio. However, it's a minor code change to store the data in Google Cloud SQL instead of Datastore, and Cloud SQL is definitely supported by Data Studio.

This is even more work to set up initially, but once you get it working it would be the best approach. It's much more scalable, and once the data is in Cloud SQL you can do all sorts of things with it.
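To give a feel for the kind of change involved, here's a rough Python sketch of the Cloud Function side: decoding a Pub/Sub message and turning it into parameters for a Cloud SQL INSERT. The table and column names (`readings`, `sensor_one`, …) are made up for illustration; the `device_id` and `published_at` message attributes are, as I understand it, the ones the Particle Google Cloud integration sets.

```python
import base64
import json

# Hypothetical table for illustration:
#   readings(device_id, published_at, sensor_one, sensor_two)
INSERT_SQL = (
    "INSERT INTO readings (device_id, published_at, sensor_one, sensor_two) "
    "VALUES (%s, %s, %s, %s)"
)

def message_to_params(event):
    """Turn a Pub/Sub event (base64-encoded JSON body + attributes) into SQL params."""
    body = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    attrs = event.get("attributes", {})
    return (
        attrs.get("device_id"),
        attrs.get("published_at"),
        body.get("sensorOne"),
        body.get("sensorTwo"),
    )

# Inside the real Cloud Function you would then execute the statement,
# e.g. with a MySQL client such as pymysql:
#     with connection.cursor() as cur:
#         cur.execute(INSERT_SQL, message_to_params(event))
#     connection.commit()
```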

Hi,

I could not find Google Firebase as a data source for Data Studio, but maybe you can take inspiration from this Particle–Google integration and, from Pub/Sub, send the data on to a place/database where Data Studio can pick it up later?
That way you wouldn't need spreadsheets at all.

Hey, the solution suggested here works great for me. Right now I am able to send data to Cloud SQL from Pub/Sub using a Cloud Function.

I need suggestions on another topic, related only to Google Cloud and data processing. The data stream currently sends raw current readings to the cloud, like this:

{
  sensorOne (Amps),
  sensorTwo (Amps),
  timestamp,
  device_id
}

In Ubidots it is easy to calculate the energy used every hour by averaging power in kW, since it creates variables at the device level. But doing this in Google Cloud is proving difficult for me. Can someone please give me some suggestions on how to get this done?

I have had some ideas, but I'm not sure whether they are feasible, and I am open to suggestions:

  • Process the data in batches by querying it from SQL in a Cloud Function, then calculating kWh for each hour for every device.
  • Use Pub/Sub to send the data to Bigtable via Dataflow. I saw there are some transformation templates in it, but I'm not sure how to use them.
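For the batch idea, the core calculation is small enough to sketch: bucket readings by device and hour, convert amps to kW with an assumed line voltage (a value you'd have to set for your installation, and a purely resistive load is assumed here — a real setup may need a power factor), then average. The average power in kW held over one hour is numerically the kWh for that hour. A rough Python sketch:

```python
from collections import defaultdict
from datetime import datetime

VOLTAGE = 230.0  # assumed line voltage in volts; set this for your installation

def hourly_kwh(readings):
    """readings: iterable of (device_id, timestamp, amps) tuples.
    Returns {(device_id, hour_start): kWh} per device per hour."""
    buckets = defaultdict(list)
    for device_id, ts, amps in readings:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        buckets[(device_id, hour)].append(VOLTAGE * amps / 1000.0)  # power in kW
    # average kW over one hour == energy in kWh for that hour
    return {key: sum(kws) / len(kws) for key, kws in buckets.items()}

samples = [
    ("dev1", datetime(2020, 5, 1, 10, 5), 1.0),
    ("dev1", datetime(2020, 5, 1, 10, 35), 3.0),
]
result = hourly_kwh(samples)
print(result[("dev1", datetime(2020, 5, 1, 10, 0))])  # roughly 0.46 kWh
```

The same aggregation could also be pushed down into SQL with a GROUP BY on device_id and the hour-truncated timestamp, which would avoid pulling all the raw rows into the Cloud Function.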

I am completely new to Google Cloud and am learning it by doing, so if I have made some naive assumptions here, please forgive me. :sweat_smile:

Hannan