Here’s the scenario.
I’m interested in using an outbound webhook with a service that requires first calling an auth endpoint to retrieve an access token (returned in an HTTP response header) before data can be uploaded to a different endpoint.
Is it possible to use particle.io’s cloud infrastructure directly for this sort of two-step process, or am I going to need to write something like a small AWS Lambda application to hide the complexity?
Unfortunately you’ll need to use AWS Lambda, a Google Cloud Function, or something similar to do that. A webhook cannot be triggered by the output of a previous webhook, so you can’t chain them together.
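To make the shape of that concrete, here is a minimal sketch of what such a Lambda-style function could do. The endpoint URLs, the `X-Access-Token` header name, and the payload layout are all assumptions, and the HTTP transport is injected as a callable so the two-step chaining is visible without committing to a particular HTTP library.

```python
import json

def forward_with_auth(payload, http_post,
                      auth_url="https://api.example.com/auth",      # hypothetical
                      upload_url="https://api.example.com/upload"): # hypothetical
    """Two-step flow: fetch an access token, then upload the payload.

    `http_post(url, headers, body)` is an injected transport returning a
    (status, headers, body) tuple; in a real Lambda it would wrap an HTTP
    client such as urllib.request.
    """
    # Step 1: hit the auth endpoint and pull the token from a response header.
    status, resp_headers, _ = http_post(auth_url, {}, b"")
    if status != 200:
        raise RuntimeError("auth request failed with status %d" % status)
    token = resp_headers["X-Access-Token"]  # header name is an assumption

    # Step 2: upload the device data with the token attached.
    status, _, body = http_post(
        upload_url,
        {"Authorization": "Bearer " + token,
         "Content-Type": "application/json"},
        json.dumps(payload).encode("utf-8"))
    return status, body
```

In a real deployment, the Particle webhook would POST the device data to the Lambda, which would call `forward_with_auth` with the event body as `payload`.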
Yeah, that’s what it looked like. I’d already figured I could do it that way; I just wanted to make sure I wasn’t missing something.
By the way, there are at least three ways you can trigger a cloud service:
- Your server accepts HTTP requests from a Particle webhook.
- For Google, using a Google Cloud integration allows you to securely map a Particle event into a Google Pub/Sub event, which can then trigger a cloud function.
- For Amazon EC2 and Google App Engine, you can have your server directly monitor the SSE event stream. This opens a secure connection outbound from your instance to the Particle cloud, and events are pushed down it as they occur. This is efficient and eliminates the need for a fixed IP address and SSL certificate.
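To sketch the last option: the Particle event stream uses standard server-sent events, where each event arrives as an `event:` line naming the published event followed by a `data:` line of JSON. A minimal parser for that framing, assuming the stream has already been opened with an HTTP client (connection and authentication details omitted):

```python
import json

def parse_sse(lines):
    """Yield (event_name, payload_dict) pairs from an iterator of SSE lines.

    Each record is an `event:` line, one or more `data:` lines of JSON,
    and a blank line terminating the record.
    """
    name, data = None, []
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("event:"):
            name = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif line == "" and name and data:
            yield name, json.loads("".join(data))
            name, data = None, []
```

Your server would loop over `parse_sse(stream)` and dispatch on the event name, reconnecting if the stream drops.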
Hi, here’s another approach I would try (if I did not want to get into Lambdas and the like):
- trigger the first webhook (with a publish), then parse the response and extract the access token from it on my Photon/Electron/you name it
- trigger the second webhook with that token included in this second publish
I have never done this before, so maybe there are limitations that I’m not aware of.
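As a sketch of that flow, modeled in Python rather than device firmware: on real hardware the first webhook’s response comes back asynchronously as a `hook-response/...` event received via Particle.subscribe, but here `publish` is injected as a synchronous callable so the chaining is easy to follow. The event names and token-extraction logic are assumptions.

```python
import json

def chain_webhooks(publish, first_event, second_event, extract_token, data):
    """Device-side two-publish flow, modeled with an injected `publish`.

    `publish(name, payload)` returns the webhook's response body; on real
    hardware that response arrives asynchronously as a `hook-response/...`
    event, so this synchronous model is a simplification.
    """
    # First publish: trigger the auth webhook and get its response back.
    response = publish(first_event, "")
    token = extract_token(response)
    # Second publish: carry the token so the second webhook can use it.
    return publish(second_event, json.dumps({"token": token, "data": data}))
```

The second webhook’s template would then pull the token out of the publish data and place it in the outgoing request’s Authorization header.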
That technique will work, but the most common problem is that if you have to put both the auth token and the data in the publish, you run into the publish length limit.
However, in 0.8.0 the limit is 622 bytes instead of 255 bytes, so it’s more practical to do it that way now, and it’s a good solution that does not require a server or cloud service.
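A quick way to see where the limit bites, assuming the token and the reading are packed into a single JSON publish payload (the encoding is an assumption; real tokens are often JWTs running several hundred characters):

```python
import json

PUBLISH_LIMIT_OLD = 255  # bytes, before Device OS 0.8.0
PUBLISH_LIMIT_NEW = 622  # bytes, 0.8.0 and later

def fits_in_publish(token, data, limit=PUBLISH_LIMIT_NEW):
    """Check whether a token plus a reading fit in one publish payload.

    Hypothetical encoding: both packed into a single JSON object.
    """
    payload = json.dumps({"token": token, "data": data})
    return len(payload.encode("utf-8")) <= limit

# A ~250-character bearer token plus a small sensor reading overflows the
# old 255-byte limit but fits comfortably within the new 622-byte limit.
```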