Integrating webhooks with Google Cloud Datastore

Hey all,

Right now I’m trying to set up a webhook that grabs data from Google Cloud’s Datastore. When I hit the “Test” button on the webhook integration, I get a 401 response. I’m trying to extract data with the runQuery method via an HTTP POST request. I can run this locally without any problem, but there is no clear place to insert my credentials. Where can I paste my private API key to get the webhook to run?

Request Type: POST
Request Format: Web Form
Form Fields:
- partitionId: {projectId: [projectId]}
- query: {kind: {name: recipes}}
Query Parameters:
- api_key: [private api key]
- field1: [event name]
Is there anything that I’m missing or misplacing? Thanks in advance for the help.

Edit: Fixed the URL; I’m no longer getting 404 errors, but now 401.
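For reference, the endpoint this configuration targets is the Datastore `projects.runQuery` REST method, which expects a JSON request body rather than web-form fields, and authenticates with an OAuth 2.0 bearer token in an `Authorization` header rather than an `api_key` query parameter. A minimal sketch of that body in Node.js, keeping the `[projectId]` placeholder and the `recipes` kind from the post (everything else is illustrative, not a drop-in fix):

```javascript
// Build the JSON body for a Datastore projects.runQuery request.
// The [projectId] placeholder is kept from the original post.
function buildRunQueryBody(projectId) {
  return {
    partitionId: { projectId: projectId },
    // Note: in the Datastore v1 REST API, "kind" is an array of
    // KindExpression objects, not a single object.
    query: { kind: [{ name: 'recipes' }] }
  };
}

const body = JSON.stringify(buildRunQueryBody('[projectId]'));
console.log(body);
// → {"partitionId":{"projectId":"[projectId]"},"query":{"kind":[{"name":"recipes"}]}}
```

Even with the body right, a plain webhook has no field for the bearer token, which is why the 401 persists.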

An HTTP 401 error usually indicates an authorization problem.

You’re right, I think that’s the problem. How should I include my private API key in my webhook? And if I shouldn’t be using the API key, what authentication method should I use instead?

Could you please make your webhook public?

Found this: Google Cloud Platform Integration Setup
Check whether it solves your problem.

I suspect you’ll find it difficult, if not impossible, to do this with a webhook. The correct way seems unnecessarily complicated, but it offers much more flexibility and lets you customize how the data is stored in Google Cloud Datastore.

You’ll want to use the Google Cloud integration, not a plain webhook. It works much like a webhook, but handles the authentication part for you. It also translates Particle events into Google Pub/Sub topics, which are the Google equivalent of events.

Then you use a Google Cloud Function that triggers on a Pub/Sub event and stores the data in Cloud Datastore.
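A rough sketch of such a function in Node.js. The kind name `ParticleEvent` and the entity shape are illustrative assumptions, and the actual Datastore write is shown only as a comment since it needs the `@google-cloud/datastore` client and credentials:

```javascript
// Cloud Function sketch: triggered by a Pub/Sub message from the
// Particle Google Cloud integration. The message "data" field is the
// base64-encoded JSON payload that the device published.
function decodeParticleEvent(pubsubMessage) {
  const json = Buffer.from(pubsubMessage.data, 'base64').toString('utf8');
  return JSON.parse(json);
}

// Shape the decoded payload into a Datastore entity object
// (kind/key handling is assumed, not from the original thread).
function toEntity(eventData, key) {
  return {
    key: key, // e.g. datastore.key(['ParticleEvent'])
    data: Object.assign({ serverTime: new Date().toISOString() }, eventData)
  };
}

// In the deployed function you would then persist it:
//   const { Datastore } = require('@google-cloud/datastore');
//   const datastore = new Datastore();
//   await datastore.save(toEntity(decoded, datastore.key(['ParticleEvent'])));

// Local illustration with a fake Pub/Sub message:
const fake = { data: Buffer.from(JSON.stringify({ temp: 21.5 })).toString('base64') };
const decoded = decodeParticleEvent(fake);
console.log(decoded.temp); // → 21.5
```

Because the integration handles authentication to Pub/Sub, the function never needs a Particle API key to receive the data.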

This tutorial may be helpful. Note that it shows two different ways of storing the data: Cloud Functions (the easier way) and App Engine (the harder way, which was necessary before Cloud Functions existed).


I agree. I was getting frustrated with the webhook approach and switched to the approach you’ve described, with some positive results. I’ll be looking through your link. Thanks for the tip!

Hi rickkas,

Thanks for putting together these tutorials - they’re quite helpful.

On the topic of retrieving data from Google Cloud, do you know of any tutorials that illustrate the workflow for a Particle device to request data from a source like Datastore (or through Pub/Sub plus an intermediate Cloud Function or App Engine script)?

The tutorial you linked to has been a great reference for publishing data from Particle -> Particle cloud -> Google Pub/Sub -> an app -> Datastore/Firebase/etc. I’m curious whether there’s any documentation out there that shows or explains the opposite path (data in Datastore retrieved by an individual Particle device). Thanks for your help.


I added another example to the Google Cloud Tutorial.

Datastore and Particle API using Cloud Functions

Like the previous examples, the device publishes some random JSON data every minute.
This is stored in the Google Cloud Datastore, along with a timestamp.

The difference is that after storing the data in the Datastore, the cloud function uses Particle.publish to publish the latest data to all devices in the account.
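That publish-back step can be sketched like this. The event name `latestData` is a made-up example, and the `particle-api-js` call itself is left as a comment since it needs a Particle access token:

```javascript
// Build the arguments for publishing a stored record back to all
// devices in the account via the Particle cloud API.
function buildPublishArgs(record) {
  return {
    name: 'latestData',           // hypothetical event name
    data: JSON.stringify(record), // Particle event data is a string
    isPrivate: true               // only devices in this account receive it
  };
}

// In the cloud function you would then call (requires particle-api-js):
//   const Particle = require('particle-api-js');
//   const particle = new Particle();
//   particle.publishEvent(Object.assign({ auth: ACCESS_TOKEN },
//                                       buildPublishArgs(record)));

const args = buildPublishArgs({ temp: 21.5 });
console.log(args.data); // → {"temp":21.5}
```

On the device, a Particle.subscribe handler for that event name would then receive the data.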

While this example just shows publishing, the same technique could be used to read from the Datastore and publish that data, or to read from the Datastore and call a device function with the data.