Google Cloud Platform integration that can scale (node.js)

Hello,

I was very excited about the Google Cloud Platform (GCP) integration. So much so that I immediately started using it for my projects.

I'm not sure how many people use it too, or have experience with GCP, but I wanted some advice on implementing this so it can scale.

My current implementation is like so:
https://docs.particle.io/tutorials/integrations/google-cloud-platform/#example-use-cases

… so I spin up a GCP “Compute Engine” instance just to run the node.js script that listens for the Pub/Sub events and saves them to Datastore, as I could not find a managed service that could do this.
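For context, the script I'm running is roughly along these lines. This is just a sketch assuming the `@google-cloud/pubsub` and `@google-cloud/datastore` client libraries; the subscription name (`sensor-data-sub`) and entity kind (`SensorEvent`) are placeholders, not names from the tutorial:

```javascript
// Pure transform: Pub/Sub message -> Datastore entity data.
// Kept free of cloud dependencies so it can be exercised offline.
function messageToEntityData(message) {
  // message.data is a Buffer holding the JSON payload from the webhook
  const payload = JSON.parse(message.data.toString());
  return {
    deviceId: message.attributes && message.attributes.device_id,
    data: payload,
    receivedAt: new Date().toISOString(),
  };
}

function main() {
  // Requires are local so the transform above stays testable without credentials
  const { PubSub } = require('@google-cloud/pubsub');
  const { Datastore } = require('@google-cloud/datastore');

  const datastore = new Datastore();
  const subscription = new PubSub().subscription('sensor-data-sub');

  subscription.on('message', async (message) => {
    await datastore.save({
      key: datastore.key(['SensorEvent']),
      data: messageToEntityData(message),
    });
    message.ack(); // ack only after the write succeeds
  });
}

// main(); // call from your process supervisor to start listening
```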

Now, because I want it to scale, this node.js script should ideally run on a managed service that responds to spikes automatically. But GCP does not seem to offer anything like that.

In AWS I could do this:
Sensor Data -> Particle WebHook -> AWS API Gateway Endpoint -> AWS Lambda -> AWS DynamoDB

All the AWS points are managed.

What’s the best way to have that node.js script always running in a managed way on GCP?

Thanks very much,
Mark

Hi,

You can use Google Cloud Functions (somewhat similar to AWS Lambda). It's currently in alpha but will soon go to beta and become available for everyone to use. I'm currently using it: I have a Cloud Function written in node.js listening to a Pub/Sub topic. Whenever my device publishes an event, the function is triggered; it reads the event data and saves it to the Firebase Realtime Database. So far it has been running flawlessly. It is fully managed and can auto-scale to handle any spikes.

Hope it solves your problem.

Thanks
