I am looking for some assistance with a Particle.io project involving the Photon 2.
I know that what I am trying to do is very specific (and yet seemed so simple at the beginning!), so I am reaching out for support.
Here is the gist:
I would like to collect a couple of streams of data locally from sensor inputs on my Photon 2 (it must be more than one stream, so Particle.variable() will not work for my use case), format them into a string, and POST that string via a webhook. My understanding is that I need to set up the webhook integration in the Particle cloud to allow this (but maybe I just need to generate the webhook in the CLI and post using Device OS?). Let's assume that I already know how to get the data in Particle Workbench.
I would like to then:
1. Inspect that the webhook is functioning by doing a GET in Postman to see that the data is getting updated. This should look like a JSON response that has the sensor keys and values.
2. Send the data to a server (I will be using Vercel, but I have also tried this in PHP and never got it working).
3. Do a POST request in Postman to verify that the server is getting the correct values.
4. Use the data in my web application by doing a POST request.
I think that the URL needs to have my device ID (and key?) in it, and the authorization type needs to be a bearer token. I am sure there are many options; it would be nice to understand which authorization should be used for a fleet vs. a single device, etc. The documentation shows lots of different ways to do this. The docs use ThingSpeak, which is quite different AFAICT. I don't think the Postman bit in the docs covers a how-to on these topics, but I may be wrong.
Would anyone be willing to assist with these topics? One of the important bits for me personally is understanding the different routes I could take to arrive at the same results, so that I truly understand the options and why certain approaches work better for specific use cases. To recap, I've reviewed a lot of the documentation on this and did not find what I was looking for. That said, if there are parts that anyone thinks I should reread in order to clarify my questions, I'm of course up for that!
If I'm understanding correctly, you might be able to use Logic and Ledger together in order to trigger a custom webhook integration based on a stream of data.
To do this you would set up a Device to Cloud Ledger. Then connect your Ledger instance to a new Logic block that triggers a predefined Particle event. This event can then be connected to your custom webhook integration, which is pointed at an endpoint served by your Vercel server.
Some things to note: I don't believe a custom webhook integration is exposed to the broader internet. Therefore, I don't think you would be able to test the custom webhook integration with Postman. A Particle webhook would only get triggered when data is published to the corresponding event.
You can use the Particle Cloud API on your Vercel server to get information about your device. There's a post that outlines how to set up your authorization in order to interact with the Cloud API.
I may be misunderstanding your question. Can you give a specific example of what you're trying to achieve?
You will typically read your sensor data, then periodically use Particle.publish() from your Photon 2 code. There are limits to the rate at which you can publish; however, if you're publishing on the order of minutes, you'll probably be fine. You will also typically encode multiple sensor values into a single publish using JSON to save on data operations.
You can see these events in the live event viewer on the Events tab of the console. It only shows live events, not historical ones, so you need to open it before you publish.
A webhook listens for a published event name prefix, and then contacts an external web service by HTTP. While not as important for Wi-Fi, the reason this is not done directly from the device is that HTTPS (TLS/SSL) encryption requires a lot of data to set up the encrypted session. Also, the TLS code is very large, which is not an issue for the Photon 2 but was for older devices. Thus offloading it from the device to the cloud-side webhook makes sense. You can monitor this activity in the console as well by viewing the integration history; the last 10 requests and responses can be viewed.
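As a sketch, a webhook can also be defined as a JSON file and created from the CLI with `particle webhook create hook.json` instead of using the console builder. The URL below is a placeholder, and the event name assumes the `two_sensorvals` event from this thread:

```json
{
    "event": "two_sensorvals",
    "url": "https://example.com/webhook",
    "requestType": "POST"
}
```

The `event` field is the prefix the webhook listens for; `url` is wherever your server (or Postman mock) accepts the POST.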
Since you want to use your own server, you can implement your server to respond to the HTTP POST requests from the webhook. Presumably your server will also handle requests from your web app and serve your web app, but at that point it's standard web stuff, not Particle-related.
@rickkas7 thank you for your detailed reply.
If it's alright with you, I'm going to take this slow and clarify one step at a time.
what I'm hearing you say is that it's more robust/efficient to have the Particle.io cloud service handle the details of the webhook (i.e., what is to be sent) because of encryption requirements, rather than the device itself. The device publishes some JSON via Particle.publish() to Particle cloud, and the webhook configuration (authored in the user's Particle.io profile) is specified to handle the expected JSON accordingly. I believe it is this last part that I am having a hard time with. There are many fields in the webhook area and I am not clear on whether my use case requires a "custom template" or a "webhook builder". Currently I have chosen the builder. I have created a mock server in Postman to inspect that the JSON is actually getting to a destination so that I can send it to another destination. Does this sound right?
What is supposed to go into the URL field? The postman mock api URL, right? Something like https://xxx.mock.pstmn.io/<mywebhookeventname> ?
Or should it be the device URL as in https://console.particle.io/devices/<mydeviceid>/<mywebhookeventname> ? This technique is what I have used when attempting to get a variable or call a function, etc.
Do I have the event name correct, based on what I have in my publish code?
Does the request format being JSON make sense?
What about the extra settings? Should I be modifying those?
Thank you for this information. I was not aware of the Logic/Ledger models and I will explore them. I am hoping that I don't have to learn these in order to accomplish what I'd like to, but will look into them in the future.
What is supposed to go into the URL field? The postman mock api URL, right?
Yes: whatever URL you've set up your server to respond to, or the Postman mock URL you're testing with.
Do I have the event name correct, based on what I have in my publish code?
Yes
Does the request format being JSON make sense?
Yes
What about the extra settings? Should I be modifying those?
Generally no, but if your server requires basic authentication or an authorization header, or uses a self-signed certificate, you might need to change some settings.
my use case requires a "custom template" or a "webhook builder"
The default format of the JSON POST body includes information about the device, including its Device ID, a time stamp, the event name, as well as the event data payload, as a string. If you want all of this information, you can use the default template.
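As an illustration (the values here are made up, not from an actual request), the default POST body looks roughly like this, with the event data delivered as a string in the `data` field:

```json
{
    "event": "two_sensorvals",
    "data": "{\"sensor1\":0.90,\"sensor2\":0.50}",
    "published_at": "2024-01-01T00:00:00.000Z",
    "coreid": "<your-device-id>"
}
```

Note that `data` arrives as a JSON-encoded string, so your server will need to parse it a second time if it wants the individual sensor values.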
If you want custom data, you can set up a custom JSON template so only the data you want is uploaded. A common use case is to use {{{PARTICLE_EVENT_VALUE}}}, which sends up only the payload published by the device.
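A minimal custom-template sketch, assuming the device publishes the two-sensor JSON object as its payload. The `readings` key is my own invented wrapper; since triple braces insert the value without escaping, the device's JSON payload lands as a nested object rather than a quoted string:

```json
{
    "readings": {{{PARTICLE_EVENT_VALUE}}},
    "device": "{{{PARTICLE_DEVICE_ID}}}"
}
```

With this template the server receives the sensor keys and values directly, without the extra string-parsing step the default body requires.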
@rickkas7 For the custom common use case bit you're describing, does {{{PARTICLE_EVENT_VALUE}}} cover both keys? Meaning, does it get replaced with the JSON (string)
{
"sensor1" : 0.9,
"sensor2" : 0.5
}
...or do I need to specify this with {{{two_sensorvals}}}, or something like {{{[sensor1, sensor2]}}} ?
Since you're getting a 404, it's not finding the endpoint.
In the Particle console, make sure you have the URL of your mock server, including the webhook endpoint (though you may already be doing this).
So in my example, I just created one with a target endpoint of https://3b5c28b0-1752-4a44-87e1-f5f79b181772.mock.pstmn.io/webhooks/W123456, which returns a default OK. I wasn't able to change any of the responses from the Postman webhooks though, not sure why.
Thanks for your reply!
Unfortunately, I tried this to no avail.
BTW, what is the idea behind using the trailing "fake" directories?
Is it just a convention to not use the mock server root? Since it is a "mock" I figured it makes no difference between using https://xxx.mock.pstmn.io/webhooks/W123456 and https://xxx.mock.pstmn.io/, but I am curious if there is something I am misunderstanding there.
I'm not quite sure I understand what you mean by "fake" directories. Do you mean the webhook endpoint (webhooks/W123456)? The end of the URL is important, because otherwise you're just asking the top-level mock endpoint to accept a webhook, and that isn't going to work. I believe that even though it is a "mock", it is an actual server hosted by Postman to handle mock data from users, making the URL important.
When I created the mock server it showed the default webhook name as W123456.
It may actually be easier to start implementing your real webhook backend and test against that; the Postman mock server is honestly kind of confusing to me.
That definitely illuminates a misunderstanding I had, regarding the question of what postman is doing behind the scenes. Thanks!
Could you clarify what a 'webhook name' is? Is it just an endpoint name? When I created my mock server in postman, no such default was created. I'm wondering what we did differently.
Even though I'm attempting to make a webhook with the name two_sensorvals, it ignores it and uses a default mock endpoint with ID W123456 with various predefined events in it. This is visible in the response.
Even if you change W123456 to anything else, it still works. I think this is just because the mock server ignores the content of webhook_id and only checks that it exists. If you delete the value for webhook_id, the call fails.
thank you so much for this thorough explanation. I think the pictures help a lot (for example, I didn't realize there was a collection of webhook examples that includes a tutorial that I can install if I search for it); sometimes it's difficult to explain these types of things with only text. I'll check into all details later tonight; I think this is probably the source of my issue.