Hello,
I was very excited about the Google Cloud Platform (GCP) integration. So much so that I immediately started using it for my projects.
I'm not sure how many people use it too, or have experience with GCP, but I wanted some advice on implementing this so it can scale.
My current implementation is like so:
https://docs.particle.io/tutorials/integrations/google-cloud-platform/#example-use-cases
… so I spin up a GCP “Compute Engine” instance just to run the node.js script that listens for the Pub/Sub events and saves them to Datastore, as I could not find a managed service that could do this.
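For context, the data-shaping step in that script boils down to something like this (a minimal sketch; the kind name and fields are illustrative, and the real script uses the @google-cloud Pub/Sub and Datastore clients for the actual I/O):

```javascript
// Turn a Pub/Sub message from the Particle integration into a
// Datastore entity shape. Pub/Sub delivers the event payload
// base64-encoded in `data`, with metadata in `attributes`.
function messageToEntity(pubsubMessage) {
  const payload = Buffer.from(pubsubMessage.data, 'base64').toString('utf8');
  return {
    key: { kind: 'ParticleEvent' }, // Datastore kind (illustrative name)
    data: {
      device_id: pubsubMessage.attributes.device_id,
      event: pubsubMessage.attributes.event,
      published_at: pubsubMessage.attributes.published_at,
      data: payload, // the raw sensor reading published by the device
    },
  };
}
```

The Compute Engine VM only exists to keep a loop calling this kind of function alive, which is exactly the part I'd like a managed service to own.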
Now, because I want this to scale, ideally that node.js script should run on a managed service that responds to spikes automatically, but GCP does not seem to have anything like this.
In AWS I could do this:
Sensor Data -> Particle WebHook -> AWS API Gateway Endpoint -> AWS Lambda -> AWS DynamoDB
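For comparison, the Lambda step in that pipeline is basically a small handler that shapes the webhook body into a DynamoDB PutItem request (a rough sketch; the table and field names are illustrative, and the real handler would pass this to the AWS SDK to do the write):

```javascript
// Build DynamoDB PutItem params from an API Gateway proxy event.
// With proxy integration, the Particle webhook's JSON arrives as a
// raw string in event.body.
function buildPutItem(apiGatewayEvent) {
  const body = JSON.parse(apiGatewayEvent.body);
  return {
    TableName: 'SensorData', // illustrative table name
    Item: {
      device_id: { S: body.coreid },          // Particle device ID
      published_at: { S: body.published_at }, // event timestamp
      data: { S: body.data },                 // sensor reading
    },
  };
}
```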
All of those AWS pieces are managed.
What’s the best way to have that node.js script always running in a managed way on GCP?
Thanks very much,
Mark