Thank you Particle for adding Google Cloud Data services as an integrated data storage and processing solution!
I’m currently running Azure services which work reliably, but I’m anxious to see how much easier and cheaper Particle has made this backend by partnering with Google.
I’m setting up the Google account now to begin testing pricing at different update frequencies.
Microsoft Azure cost me $1.20 to receive and store 5,366 250-byte webhook messages over a 24-hour period. That feels expensive to me, so hopefully the Google solution reduces those costs. I'll report back on my personal testing of this.
Does anybody recommend any nice graphing / visualization services that could use the stored data we push to Google Cloud?
Every day in every way Particle keeps getting better and better
That command just creates a file called gcp_private_key.json. If you have already created it and put your private key inside (as you mentioned), then proceed with the other steps.
Does the error message help you pinpoint what exactly I am doing wrong?
Thanks for any and all help
Also, does this setup mean that I have to have Node running 24/7 on my laptop, connected to the web, for the received webhooks to be sent into the Google Cloud database?
Just a quick FYI... that command isn't available on Windows as standard (but is on Linux), hence the error. As others have indicated, it simply creates an empty file, so it isn't a hard problem to work around.
Something like `copy nul gcp_private_key.json` would probably be the simplest alternative on a Windows box.
You’re not the only one. I find a lot of the tutorials seem to make assumptions about the environment the user is in, without ever specifying what those assumptions are - leaving me scratching my head and wondering what to do with a command like “touch gcp_private_key.json”.
Perhaps tags could be added, to indicate where/how these commands are used?
Yup, that’s a good suggestion
Maybe keeping things abstract (like “create a file named … containing …”) would be OS independent and people should know how to do that on their own machines.
We face the same thing the other way round. “Errors” are reported but the poster doesn’t give us the context (e.g. what OS, what firmware version, …)
@ScruffR @jeiden Is this Node code that pushes data into the Google Datastore designed to run on my laptop, using my internet connection to push every received webhook into the database? Or does this code just push the settings into Google?
I’m just wondering how many requests this Node code could handle.
I’m wondering how this compares to Azure Event Hubs, which can handle up to 1 million events per second. I don't currently need 1 million events per second, but as my product sales volume grows I need an infrastructure that can handle high volumes if and when that happens.
I’m pretty sure you can host this Node code on a server so I don't need to run it on my laptop, but what volume can the code handle?
The tutorial is really just meant to be a proof of concept. In reality, you'd probably want to host that code on a server so you don't have to rely on your laptop. All matching events published by your Particle devices will be forwarded on to Google Pub/Sub, and then funneled into the Datastore DB by your Node script.
I don’t have exact numbers on rate/volume, but we integrated with Cloud Pub/Sub purposefully as this tool is meant to capture large numbers of events and store them reliably for up to 7 days for subscribers to ingest them. You shouldn’t run into issues as you scale, but if you do, please let us know.
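For anyone wondering what that Node script does conceptually: it subscribes to the Pub/Sub subscription and writes each event into Datastore. Below is a rough sketch of the core message-to-entity step only; the kind name `ParticleEvent` and the payload fields are assumptions based on the shape of Particle webhook events, not the tutorial's exact code, and the real script uses the `@google-cloud` client libraries for the actual wiring:

```javascript
// Sketch of the Pub/Sub -> Datastore forwarding step (assumed names).
// A Particle event arrives as a Pub/Sub message whose data field is
// base64-encoded JSON (the raw API shape). Decode it and reshape it
// into a Datastore-style entity.
function eventToEntity(message) {
  const payload = JSON.parse(Buffer.from(message.data, 'base64').toString());
  return {
    key: { kind: 'ParticleEvent' },        // Datastore kind (assumed name)
    data: {
      device_id: payload.device_id,
      event: payload.event,
      data: payload.data,
      published_at: payload.published_at,
    },
  };
}

// With the real client libraries, the surrounding wiring looks roughly like:
//   const subscription = pubsub.subscription('my-subscription');
//   subscription.on('message', (msg) => {
//     datastore.save(eventToEntity(msg));
//     msg.ack();  // acknowledge so Pub/Sub stops redelivering the message
//   });

module.exports = { eventToEntity };
```

The key point is that the script is a thin forwarder: Pub/Sub handles buffering and delivery, and the Node process only decodes, stores, and acknowledges.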
I went ahead and removed the reference to `touch` in the GitHub README; thanks for pointing this out.
Did you figure out the error you were receiving? It looks like the crux of the issue is `invalid_grant`, which seems to be related to your private key file. Can you double-check that everything looks good there? Did you ensure that you created a service account key with the role of Project Editor?
We can store data for longer than seven days, right? I'm sure you can, but I have to ask.
Have you guys done any testing of hosting the Node script on a remote server and seeing how it handles multiple webhooks received at the same time? Is there anything in place to handle large volumes of data coming in from multiple devices at once?
Or is this just something where you expect each Photon or Electron to have their own database?
I have not figured out the problem yet. I did follow your instructions exactly, though, so it's kinda frustrating.
I deleted the old private key and created a new key per the tutorial. I updated the key txt file also.
Hey guys! I tried to follow the tutorial for the git repo and ran into some errors too. When I try `npm install`, a bunch of warnings show up, and a different error shows when I try `node tutorial.js`.
I’m using a macbook with macOS Sierra currently, npm and node versions are on the screenshots I took.
So you did not get any errors after running `node tutorial.js`?
If you didn't, could you take screenshots of your var config file versus what you set up in the Google Cloud page? It would help me double-check that I'm doing everything right on my end, which I'm pretty sure I am.
7 days is just the buffer in the Pub/Sub subscription; the message queue is not meant to be long-term storage. Workers then pull messages off for longer-term storage in databases.
This is actually the reason why Pub/Sub exists. The Pub/Sub service can absorb huge loads of irregular traffic, millions of messages per second. How quickly you move them from the subscription to some other datastore will depend on how fast you need to drain the buffer, how fast it is refilling, and how soon you need access to the information elsewhere. The standard pattern is to scale out the number of copies of the puller on the same subscription, and the Pub/Sub service will distribute the work among them.
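That "scale out the pullers" pattern can be pictured with a toy in-process model. This is not real Pub/Sub code, just an illustration of how adding workers splits one backlog into smaller per-worker shares:

```javascript
// Toy model of the scale-out pattern: N workers pull from one shared
// queue, the way multiple subscriber processes share a single Pub/Sub
// subscription. Round-robin here stands in for Pub/Sub's own load
// distribution, which makes no ordering guarantees across workers.
function drainQueue(queue, workerCount) {
  const handled = Array.from({ length: workerCount }, () => []);
  let next = 0;
  while (queue.length > 0) {
    const worker = next % workerCount;   // pick the next worker in turn
    next += 1;
    handled[worker].push(queue.shift()); // worker "stores" the message
  }
  return handled;
}

const backlog = Array.from({ length: 10 }, (_, i) => `msg-${i}`);
const perWorker = drainQueue(backlog, 3);
// Each worker now holds roughly 10/3 of the backlog, so the buffer
// drains about three times as fast for the same per-message cost.
```

The same backlog with one worker drains serially; with three, the per-worker share (and wall-clock drain time) shrinks accordingly.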
I'm not exactly sure what might be happening there. Do you have the gcloud CLI tool installed? Can you try `gcloud init`, which will authenticate the CLI and provide another layer of auth context?
@ptone Thanks for the info. It makes more sense now. It sounds just like Azure Event Hubs.
On Azure, I have to use Stream Analytics to push data from the Event Hub into the database. This Node script is doing the same thing, hopefully at a lower cost. I'm about to find out if I can get this up and running.
I'm new to Node; the only time I use it is when following instructions to do things with the Particle CLI, or when trying to follow this tutorial.
No, I do not have gcloud installed.
If I do install it, what will it allow me to do?
Were you able to complete this tutorial successfully?