Google Cloud Data Storage + Particle Integration!

Thank you, Particle, for adding Google Cloud data services as an integrated data storage and processing solution!

I’m currently running Azure services, which work reliably, but I’m eager to see how much easier and cheaper Particle has made this backend by partnering with Google.

I’m setting up the Google account now to begin testing pricing at different update frequencies.

Microsoft Azure cost me $1.20 to receive and store 5,366 250-byte webhook messages over a 24-hour period. That feels expensive to me, so hopefully the Google solution reduces those costs. I’ll report back with my own test results.
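For scale, that works out to roughly $1.20 / 5,366 ≈ $0.00022 per message, or about $0.22 per thousand messages; a single device publishing once a minute (1,440 messages/day) would run about $0.32 per day at that rate.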

Can anybody recommend any nice graphing/visualization services that could consume the data we push to Google Cloud?

Every day in every way Particle keeps getting better and better :smiley:

6 Likes

@jeiden I have a quick question on setting up the Node code for pushing data into the Google Cloud Database.

I have completed all the steps in the process but I’m getting stuck here.

Now the next step is this:

I created a text file in the main root RWB directory called gcp_private_key.json.

I then copied the private key we created into this gcp_private_key.json file, which is now saved in the root RWB directory.

I then try to run this command, but it fails, so I must be doing something wrong:

Am I supposed to create the gcp_private_key.json text file with the private key and then run the

touch gcp_private_key.json

command in the Node.js command window?

Almost there :smiley:

Thanks!

The command

touch gcp_private_key.json

is just there to create a file called gcp_private_key.json. If you have already created it and put your private key inside (as you mentioned), then proceed with the other steps.
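For reference, the key file you download from Google Cloud should be a complete JSON document along these lines (all values here are made up; paste yours in exactly as downloaded):

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "0123abcd...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "my-account@your-project-id.iam.gserviceaccount.com",
  "client_id": "123456789012345678901"
}
```

If the file contains only the bare private key string rather than the whole JSON document, authentication will fail.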

1 Like

@Iv4n said it perfectly: the command is just to create the file! Go forth to the next step!

2 Likes

@jeiden

I was able to run the last two steps successfully :smiley:

It looks like I’m getting an error message after authenticating with Google servers. See below:

I assume this has to do with an error in the var config file; see my setup below if that helps.

Here is a screenshot showing what I have called things in Google Cloud:

Here is what I put in the var config file :smile:

Does the error message help you pinpoint what exactly I am doing wrong?

Thanks for any and all help :smiley:

Also, does this setup mean that I have to have Node running 24/7 on my laptop, connected to the web, for the received webhooks to be sent into the Google Cloud database?

Just a quick FYI... that command isn't available on Windows as standard (but is on Linux), hence the error. As others have indicated, it simply creates an empty file, so it isn't a hard problem to work around :wink:

Something like copy nul gcp_private_key.json would probably be the simplest alternative on a Windows box.

@pfeerick Thank you for pointing that out :smiley:

I was wondering why following fresh, simple instructions was causing errors like this.

2 Likes

You’re not the only one. I find a lot of the tutorials seem to make assumptions about the environment the user is in, without ever specifying what those assumptions are - leaving me scratching my head and wondering what to do with a command like “touch gcp_private_key.json”.
Perhaps tags could be added, to indicate where/how these commands are used?

2 Likes

Yup, that’s a good suggestion :+1:
Maybe keeping things abstract (like “create a file named … containing …”) would be OS-independent, and people should know how to do that on their own machines.

We face the same thing the other way round. “Errors” are reported but the poster doesn’t give us the context (e.g. what OS, what firmware version, …)

1 Like

@ScruffR @jeiden Is this Node code that pushes data into Google Datastore designed to run on my laptop, using my internet connection to push every received webhook into the database? Or does this code just push the settings to Google?

I’m just wondering how many requests this Node code could handle.

I’m wondering how this compares to Azure Event Hubs, which can handle up to 1 million events per second. I currently do not need 1 million events per second, but as my product sales volume grows I need an infrastructure that can handle high volumes if and when that happens.

I’m pretty sure you can host this Node code on a remote server so I do not need to run it on my laptop, but what volume can the code handle?

@RWB,

The tutorial is really just meant to be a proof of concept. In reality, you’d probably want to host that code on a server so you don’t have to rely on your laptop. All matching events published by your Particle devices will be forwarded on to Google Pub/Sub, and then funneled into the Datastore DB by your Node script.
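To make the flow concrete, here is a rough sketch of the subscriber pattern involved, using the current @google-cloud Node libraries. The subscription name, entity kind, and attribute names here are illustrative rather than the tutorial’s exact code:

```javascript
// Sketch: pull Particle events off a Pub/Sub subscription and write
// each one to Datastore. Names below are illustrative placeholders.
const {PubSub} = require('@google-cloud/pubsub');
const {Datastore} = require('@google-cloud/datastore');

const pubsub = new PubSub({keyFilename: './gcp_private_key.json'});
const datastore = new Datastore({keyFilename: './gcp_private_key.json'});

const subscription = pubsub.subscription('my-particle-subscription');

subscription.on('message', async (message) => {
  // One Pub/Sub message per Particle event forwarded by the integration.
  try {
    await datastore.save({
      key: datastore.key('ParticleEvent'), // Datastore assigns the ID
      data: {
        device_id: message.attributes.device_id,
        published_at: message.attributes.published_at,
        data: message.data.toString(), // event payload arrives as a Buffer
      },
    });
    message.ack();  // remove the message from the subscription buffer
  } catch (err) {
    console.error('Datastore write failed:', err);
    message.nack(); // leave it in the buffer to be retried
  }
});

subscription.on('error', (err) => console.error('Subscription error:', err));
```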

I don’t have exact numbers on rate/volume, but we integrated with Cloud Pub/Sub purposefully as this tool is meant to capture large numbers of events and store them reliably for up to 7 days for subscribers to ingest them. You shouldn’t run into issues as you scale, but if you do, please let us know.

I went ahead and removed the reference to touch in the GitHub README; thanks for pointing this out.

Did you figure out the error you were receiving? Looks like the crux of the issue is invalid_grant. Seems like it may be related to your private key file. Can you double check that everything looks good there? Did you ensure that you created a service account key with a role of Project editor?
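As a quick sanity check (just a suggestion, not a tutorial step), you can confirm from Node that the key file parses and names the service account and project you expect:

```javascript
// Loads the key file from the script's directory and prints the
// identity it will authenticate as. If this throws, the JSON is
// malformed; if the email or project look wrong, the key is the culprit.
const key = require('./gcp_private_key.json');
console.log('project:', key.project_id);
console.log('service account:', key.client_email);
```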

2 Likes

We can store data for longer than seven days, right? I'm sure you can, but I have to ask.

Have you guys done any testing of hosting the Node script on a remote server and seeing how it handles multiple webhooks received at the same time? Is there anything in place to handle large volumes of data coming in from multiple devices at once?

Or is this just something where you expect each Photon or Electron to have its own database?

I have not figured out the problem yet. I did follow your instructions exactly, though, so it's kinda frustrating.

I deleted the old private key and created a new one per the tutorial. I also updated the key file.

I get the same error message:

I have all the files in the same directory as shown in the screenshot below. Does that look right? :

Hey guys! I tried to follow the tutorial in the git repo and ran into some errors too. When I try npm install, a bunch of warnings show up, and a different error shows up when I try node tutorial.js.

I’m using a MacBook with macOS Sierra; the npm and node versions are in the screenshots I took.

I think I remember getting those same errors before I had all the files in the same folder, as shown in the screenshot below:

Did you run the npm install command before running node tutorial.js?

The npm install command downloads lots of files and takes a while to complete.

Yep, I’m having the same issues too. I’m stuck at the last part: node tutorial.js

Any help would be great, thanks!

@RWB that was it! I guess it didn’t download all the files on my first try. I ran it again and everything went smoothly. Thanks for the help!

Sweet! Glad I could help out :smile:

So you did not get any errors after running node tutorial.js?

If not, could you take screenshots of your var config file versus what you have set up on the Google Cloud page? It would help me double-check that I’m doing everything right on my end, which I’m pretty sure I am :smiley:

7 days is just the buffer in the Pub/Sub subscription - the message queue is not meant to be long-term storage. Workers then pull messages off for longer-term storage in databases.

This is actually the reason why Pub/Sub exists. The Pub/Sub service can absorb huge loads of irregular traffic - millions of messages per second. How quickly you move messages from the subscription to some other datastore will depend on how fast you need to drain the buffer, how fast it is refilling, and how soon you need access to the information elsewhere. The standard pattern is to scale out the number of copies of the puller on the same subscription, and the Pub/Sub service will distribute the work.
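As a rough sketch of that scale-out pattern with the current @google-cloud/pubsub library (the subscription name and flowControl numbers are illustrative only):

```javascript
// Every worker process attaching to the SAME subscription shares the
// message stream, so starting more copies of this process increases
// the total drain rate. flowControl caps how many unacked messages
// each worker holds at once.
const {PubSub} = require('@google-cloud/pubsub');

const pubsub = new PubSub({keyFilename: './gcp_private_key.json'});
const subscription = pubsub.subscription('my-particle-subscription', {
  flowControl: {maxMessages: 100},
});

subscription.on('message', (message) => {
  // ... store the event somewhere durable, then:
  message.ack();
});
```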

I'm not exactly sure what might be happening there. Do you have the gcloud CLI tool installed? Can you try gcloud init, which will authenticate the CLI and provide another layer of auth context?

@ptone Thanks for the info. It makes more sense now. It sounds just like Azure Event Hubs.

On Azure, I have to use Stream Analytics to push data from the Event Hub into the database. This Node script is doing the same thing but hopefully at a lower cost :smiley: I'm about to find out if I can get this up and running.

I'm new to Node and the only time I use it is when following instructions to do things on the Particle CLI or when trying to follow this tutorial :smile:

No, I do not have gcloud installed.

If I do install it, what will it allow me to do?

Were you able to complete this tutorial successfully?

The folks at Google Cloud are looking into the authentication error and will report back when they find something.

2 Likes