Webhook to MongoDB?

So it's a Debbie Downer situation with proper SSL/TLS to Azure IoT, so the question is this: is there a way to send data to webhooks, have the webhook compile a JSON packet based on the data received, and then write that data to MongoDB without using Node-RED or some other web service? I am trying to limit the number of points of failure.


Good question.

If you do not get the direct MongoDB connection via webhooks figured out, I can help you get your data pushed into a Microsoft Azure database, but it will cost you some money for the Azure services.

Any reason you chose MongoDB over other services?

I was using IBM Bluemix to host a Node-RED service 24/7 to catch the Photon events and then push them to MongoDB, but for some reason it was not very reliable.

The problem with Node-RED flows is that they are Photon-specific: if I spin up another Photon, I need another flow. I would like something dynamic, so that when I add 1, 50, or 2,000 sensors, all I need to do is point them at the same webhook rather than creating separate streams in Node-RED.

Right now at the office we have XBee sensors connected to a Raspberry Pi 3 that is connected to Azure IoT, and that pushes to MongoDB.

The pricing model is horrendous for large numbers of sensors, though, so the only special sauce Particle truly provides is the ability to update firmware remotely. I'm still looking into the ESP8266 and CC3200.

You can do that with the Azure setup I’m using now.

You can set up an Azure Stream Analytics job to push all incoming webhooks from an Event Hub into a single Azure Table database. The only requirement is that all of your sensors send data to the database in the same format.


I have 1 Photon sending a webhook that is picked up by an Azure Event Hub > Stream Analytics > Table database.

I have 1 Electron emitting a webhook that is picked up by an Azure Event Hub > Stream Analytics > Table database.

Each webhook received is stored along with the Device ID, Group Name, etc., so I can then query the database for exactly the device and data I'm looking for.

I could add 1000 more Photons and still push that data into the same single database.
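As a rough illustration of why one table can serve any number of devices: each incoming webhook gets flattened into a row keyed by its device ID. This is a minimal Python sketch assuming Particle's default webhook fields (`event`, `data`, `published_at`, `coreid`); the entity field names and the example device ID are illustrative, not the exact schema Stream Analytics emits.

```python
import json

def to_table_entity(webhook_json: str, group_name: str) -> dict:
    """Flatten a Particle webhook payload into one flat row so events
    from any number of devices land in the same table."""
    evt = json.loads(webhook_json)
    return {
        # Keying every row by device ID lets you query one device's
        # history without scanning the whole table.
        "PartitionKey": evt["coreid"],
        "RowKey": evt["published_at"],  # timestamp keeps rows unique per device
        "GroupName": group_name,
        "EventName": evt["event"],
        "Data": evt["data"],
    }

# Example payload in the general shape Particle webhooks publish:
payload = json.dumps({
    "event": "temperature",
    "data": "72.4",
    "published_at": "2017-03-01T12:00:00.000Z",
    "coreid": "1f0030000447343337373738",
})
row = to_table_entity(payload, group_name="office-sensors")
```

A 1,001st Photon would go through the exact same function; nothing per-device needs to change.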

Is that what you’re looking for?

What kind of update frequency are you needing?

Ultimate goal:

-A generic webhook to read key/value pairs from each Photon


-The webhook parses this into a specific JSON string, adds a UTC timestamp, and inserts it into MongoDB collections hosted on mLab.

-Each Photon will post data every 15 to 30 seconds.

-Do this in a cost-effective manner to bypass the huge monthly recurring costs. $250 a month for 1,000 devices is a lot, especially if you sell the hardware without a service contract. Replicating all this on Azure is much cheaper, but again, I lose the firmware auto-update. I guess I could work around it by creating multiple accounts with 25 devices each… but that's a pain in the ass…
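The parse-and-timestamp step above can be sketched in Python. This is a hedged sketch, not a turnkey service: the `key=value` payload format and the field names are assumptions to adjust to your firmware, and the actual insert (shown only as a comment) would use pymongo against your mLab connection string.

```python
import json
from datetime import datetime, timezone

def build_document(device_id: str, kv_payload: str) -> dict:
    """Turn a Photon's 'key=value,key=value' payload into a JSON-ready
    document stamped with the current UTC time."""
    doc = {
        "deviceId": device_id,
        "timeStamp": datetime.now(timezone.utc).isoformat(),
    }
    for pair in kv_payload.split(","):
        key, _, value = pair.partition("=")
        doc[key.strip()] = value.strip()
    return doc

doc = build_document("photon-01", "temp=72.4,humidity=41")
print(json.dumps(doc))

# The insert itself would use pymongo against your mLab URI, e.g.:
#   from pymongo import MongoClient
#   MongoClient(MLAB_URI).mydb.readings.insert_one(doc)
```

Because the document is built from whatever keys arrive, pointing 1 or 2,000 Photons at the same webhook needs no per-device changes.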

If you do this with Azure like I am, you can still do the firmware updates via Particle.

Some guys on here have built scripts to auto-update firmware for a list of devices without having to do it manually or put them all under a product collection in Particle, which I think is where you're getting the $250 monthly price tag from.

Using Azure to database sensor readings four times a minute will end up costing you a decent amount of money once you scale beyond just a few devices. The most expensive part of this Azure pipeline is the Stream Analytics service.

To database 32,211 webhooks of 200 bytes each into an Azure Table database, it cost me:

$8.26 for Stream Analytics, which pushes the webhook data from the Event Hub into the database.
$5.00 (approx.) for the Event Hub service, which receives the webhooks sent from the Particle Cloud.
$0.05 for the Azure Table database.

This pricing is based on a free account; I'm not sure if it's cheaper at a different tier of paid service.

It all adds up rather quickly.
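A quick back-of-the-envelope calculation from the figures above shows how it adds up. The per-device monthly estimate assumes the costs scale linearly with webhook count, which the fixed-rate Stream Analytics charge may not, so treat it as a ballpark only.

```python
# Figures quoted above for one batch of databased webhooks.
stream_analytics = 8.26
event_hub = 5.00
table_storage = 0.05
webhooks = 32_211

total = stream_analytics + event_hub + table_storage
per_webhook = total / webhooks
print(f"total: ${total:.2f}, per webhook: ${per_webhook:.6f}")

# One device posting every 15 s sends this many webhooks in a 30-day month:
per_month = (60 // 15) * 60 * 24 * 30
print(f"webhooks per device per month: {per_month}")

# Assuming linear scaling, a rough per-device monthly cost at that rate:
print(f"~${per_webhook * per_month:.2f} per device per month")
```

Posting every 30 s instead of every 15 s would halve that last figure, which is why the update frequency question matters.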

What I like about Azure is that you can work with all this gathered data using Microsoft Power BI, which is something Amazon AWS does not have; or if it does, it's nowhere near as good.

Amazon AWS may be cheaper, I'm not sure, but others on the forum are pushing data the same way to AWS, so I'll assume the pricing is comparable to Azure.

If TLS were supported on the Photon, could we just push directly into an AWS database?