Particle + Google Cloud Integration: Using Dataflow


#1

The tutorial that demonstrates the Particle integration with the Google Cloud Platform uses a node.js script that takes ingested data from the Pub/Sub topic and stores it in Datastore. After reading through some of the Google documentation, it seems that you can do away with the node.js script and instead use Google Dataflow to write the ingested data to Datastore. I’m just getting started on trying to implement this, and was wondering if anyone else has already tried it.
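Here’s roughly what I have in mind, as a minimal sketch using the Apache Beam Python SDK (which is what Dataflow runs). The project id, topic path, entity kind, and the way Particle packs device metadata into Pub/Sub attributes are my assumptions, not tested code:

```python
import apache_beam as beam
from apache_beam.io.gcp.pubsub import ReadFromPubSub
from apache_beam.io.gcp.datastore.v1new.datastoreio import WriteToDatastore
from apache_beam.io.gcp.datastore.v1new.types import Entity, Key
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

PROJECT = 'my-gcp-project'  # placeholder project id
TOPIC = 'projects/my-gcp-project/topics/particle-events'  # placeholder topic


def to_entity(msg):
    # Assumption: Particle's integration publishes the event data as the
    # message body and device/event metadata as Pub/Sub attributes.
    attrs = msg.attributes
    # Build a complete key; Beam's Datastore sink won't auto-allocate ids.
    name = '%s-%s' % (attrs.get('device_id', 'unknown'),
                      attrs.get('published_at', ''))
    entity = Entity(Key(['ParticleEvent', name], project=PROJECT))
    entity.set_properties({
        'device_id': attrs.get('device_id', ''),
        'event': attrs.get('event', ''),
        'published_at': attrs.get('published_at', ''),
        'data': msg.data.decode('utf-8'),
    })
    return entity


options = PipelineOptions(project=PROJECT,
                          runner='DataflowRunner',  # 'DirectRunner' to test locally
                          temp_location='gs://my-bucket/tmp')  # placeholder bucket
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (p
     | 'ReadFromPubSub' >> ReadFromPubSub(topic=TOPIC, with_attributes=True)
     | 'ToEntity' >> beam.Map(to_entity)
     | 'WriteToDatastore' >> WriteToDatastore(PROJECT))
```

The appeal, as I understand it, is that Dataflow’s streaming runner keeps the pipeline subscribed to the topic, so there’s no node.js process to host and supervise yourself.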

Thanks in advance.


#2

I would love to know how to do this as well.


#3

Hi bjagpal, did you get anywhere with this?


#4

It’s also possible to use Google App Engine so everything is done within the Google Cloud. That’s how I did it in my tutorial.


#5

@rickkas7 thank you for the reply. Have you had a chance to look at Dataflow as another option?

Google recommends using Dataflow for real-time stream processing for IoT:

https://cloud.google.com/solutions/architecture/real-time-stream-processing-iot

For my project I would like to follow a similar architecture: Pub/Sub > Dataflow > Bigtable.
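Something along these lines is what I’m picturing for the Bigtable leg, again with the Beam Python SDK. This is a rough sketch only; the instance, table, and column family names and the row-key scheme are placeholders I made up:

```python
import apache_beam as beam
from apache_beam.io.gcp.pubsub import ReadFromPubSub
from apache_beam.io.gcp.bigtableio import WriteToBigTable
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
from google.cloud.bigtable.row import DirectRow

PROJECT = 'my-gcp-project'       # placeholder
INSTANCE = 'particle-instance'   # placeholder Bigtable instance
TABLE = 'particle-events'        # placeholder table with a 'telemetry' column family
TOPIC = 'projects/my-gcp-project/topics/particle-events'  # placeholder topic


def to_row(msg):
    # Row key: device id + publish time. A real schema needs more thought,
    # since the row key drives Bigtable read performance.
    attrs = msg.attributes
    key = '%s#%s' % (attrs.get('device_id', 'unknown'),
                     attrs.get('published_at', ''))
    row = DirectRow(row_key=key.encode('utf-8'))
    row.set_cell('telemetry', b'data', msg.data)
    row.set_cell('telemetry', b'event',
                 attrs.get('event', '').encode('utf-8'))
    return row


options = PipelineOptions(project=PROJECT, runner='DataflowRunner',
                          temp_location='gs://my-bucket/tmp')  # placeholder bucket
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (p
     | 'ReadFromPubSub' >> ReadFromPubSub(topic=TOPIC, with_attributes=True)
     | 'ToBigtableRow' >> beam.Map(to_row)
     | 'WriteToBigtable' >> WriteToBigTable(project_id=PROJECT,
                                            instance_id=INSTANCE,
                                            table_id=TABLE))
```

From what I’ve read, device-id-plus-timestamp is just the simplest illustration of a row key; a production schema would want to think about hotspotting and query patterns.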

Great work on the Firebase Tutorial BTW :+1:


#6

No, unfortunately I did not. There wasn’t a lot of support available, and being a firmware guy, not an IT person, I didn’t have the time or desire to learn all the ins and outs of the Google platform. I am working with Azure for the time being and it seems to work well.


#7

Did you ever figure this out? I’m looking for some sort of guide or documentation on this as well.