Issues with multiple instances of Spark.subscribe() for Webhooks

To save you some reading :wink:

and the following post of the OP.

But it would still be interesting if @Dave found another solution.


I’m still tracking this down, sorry about the lack of news! You can also watch the responses from your hooks with a quick `spark subscribe mine` from the CLI :slight_smile:

Thanks,
David


@dave When monitoring the webhook response through the dashboard/curl/node.js, it just sometimes does not come through.
I can see my webserver sending a response when the core publishes to the webhook. But somewhere, somehow, the response is just not received or handled by the Spark Cloud.

How can I monitor this better than through the dashboard/curl/node.js?

Regards,
Bart

Hi @bartjenniskens,

Sorry about the slow reply, so much going on! :slight_smile:

If your webserver’s HTTP response has an empty body, then the webhook won’t publish a response event. Can you make sure your server returns a non-empty response?
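To illustrate that requirement, here is a minimal sketch of the raw HTTP response a webserver could return; the header values and helper name are placeholders, the only point being that the body after the blank line must be non-empty for the cloud to publish a hook-response event.

```cpp
#include <string>

// Build a minimal raw HTTP response string. Per the reply above, the
// Spark Cloud only publishes a hook-response/<event> event when the
// body (everything after the blank line) is non-empty.
std::string buildResponse(const std::string& body) {
    return "HTTP/1.1 200 OK\r\n"
           "Content-Type: text/plain\r\n"
           "Content-Length: " + std::to_string(body.size()) + "\r\n"
           "\r\n" + body;
}
```

So even a one-character body such as `buildResponse("ok")` should be enough, whereas an empty body means no response event on the device side.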

Thanks,
David


Hi all,

It took me a good two hours (edit: just realised how much time has gone!) to trim my code down to the bare webhooks, then delete/re-add webhooks, test, test, google, test, test, google. You get the idea.

This is a definite bug: I can replicate it, and I know via the Particle CLI that the hook-responses of both webhooks contain valid data. When publishing to both, only the first webhook subscribed to gets called; the second does not. You guys have pretty good support, though can I suggest someone update the docs example to show the webhookHandler approach, with a note that there is a pending bug with multiple subscribe/publish pairs?

For the time being I’ll steal the webhookHandler idea.
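For anyone else landing here, a sketch of that single-handler workaround: register one subscription for the shared "hook-response" prefix and fan out by event name yourself. The hook names below ("temperature", "humidity") are hypothetical, and the Spark API calls are shown in comments; only the plain C++ dispatch helper is spelled out.

```cpp
#include <cstring>
#include <string>

// On the device you would register a SINGLE handler, e.g.:
//   Spark.subscribe("hook-response", webhookHandler, MY_DEVICES);
// and dispatch by event name, since (per this thread) only the first
// of several subscriptions reliably receives its hook-response.

// Returns which hook a hook-response event belongs to, or "" if unknown.
// Response events arrive as "hook-response/<event>/<index>", so a
// prefix comparison is enough.
std::string matchHook(const char* eventName) {
    if (std::strncmp(eventName, "hook-response/temperature", 25) == 0)
        return "temperature";
    if (std::strncmp(eventName, "hook-response/humidity", 22) == 0)
        return "humidity";
    return "";
}

// void webhookHandler(const char* event, const char* data) {
//     std::string hook = matchHook(event);
//     if (hook == "temperature") { /* handle temperature response */ }
//     else if (hook == "humidity") { /* handle humidity response */ }
// }
```

The single subscription sidesteps the bug because the cloud only has to deliver to one registered handler; the cost is that the dispatch logic lives in your own code.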


Hate to ask, but has the behaviour suddenly changed?

Subscribing to “hook-response” works, but “hook-response/” (with the trailing slash) doesn’t appear to work anymore.