IoT backend suggestions

Anyone have suggestions for an IoT backend? I need to store events from my Photons. I tried AWS IoT, but it was a bust. I'm looking for a solution I can run on a DigitalOcean stack, but I haven't found any tutorials…

Depending on how large you need to scale there are two options that work well for me.

  1. The easiest to set up is Ubidots.

  2. The more expensive but more robust solution is Microsoft Azure. I've been testing it for three weeks now, publishing data into an Azure Table Database every 30 seconds from a Photon and an Electron without fail.

It costs me approx. $1.50 per day to send 3,300 full Particle Publish events to Azure Event Hubs, and then I use Azure Stream Analytics to take the received data and push it into an Azure Table Database for long-term storage.

The Azure Table Database now holds three weeks of data and has rung up a cost of 9 cents.


You could also use InfluxDB and Grafana to store and visualize your data. I have both running on a small Linux PC, but they should also work on a Raspberry Pi.


Thanks @RWB, I am trying it now; I don't have much data for now. I will test it out over the next few days and keep the Azure database in mind for when it is time to scale.

Thank you @hl68fx, this looks like a viable option! I will try Ubidots and see if I can get up and running as soon as possible and keep this in mind for when it is time to scale.

I use AWS API Gateway, Lambda, and DynamoDB for my sensor hub. I configure webhooks for events from my Photons and Electrons to send calls to my AWS API Gateway. The gateway triggers a Lambda function, and the Lambda function then writes the data to my DynamoDB table. It sounds complex, but it's pretty easy.

This solution is cost-effective (i.e. cheap, free for small amounts of data), very fast, and scalable. I have over 1.3 million records.

I then pull the data from my web app to display:

I would be glad to help with any code examples if you want them. It’s all written in Javascript/Node.


@brandongoode I’m using Microsoft Azure to do the same thing you are doing.

Azure Event Hubs > Stream Analytics > Table Database > visualize via Microsoft Power BI.

I can push approx. six full webhooks per minute, 8,500+ webhooks per 24 hours, for a cost of about $1.40 per day.

How does that compare with the pricing for how you are doing it? I know they have free tiers, but those are for personal-use situations and will not work for something you want to scale. Do you have any pricing examples you can share for pushing 8,500 200-byte webhooks to AWS that get stored in a database?

I keep testing daily at different publish rates to see how the pricing changes as the update frequency increases.


Thank you for the response. Three days ago I stopped by an Amazon Pop-Up Loft and they helped me set everything up! So I am now up and running, with one piece of data still not appearing. Hopefully I will have it resolved once I look into it further.

Ubidots didn’t work for me. There is a bug that they haven’t fixed yet.

Curious for your opinion: now that I have the backend set up, I would like to display the content using an Angular JS web app.

Do you have suggestions for a useful web app stack? My use of the data needs to be customized and it is not customer-facing, so I will just run the web app using lite-server (for example).

@dancingpearl The reason I chose to send data to Microsoft Azure backend is because of the ability to create custom graphs and data visualizations using Microsoft Power BI. I didn’t see anything nearly as flexible as Power BI on the AWS side.

Thank you for responding! I went to the Power BI site, and it looks like it could take the place of the Angular JS app and satisfy my needs.

I will wait to hear others' experiences with the cost of Amazon after the free tier is over to see how it compares with your solution.

Since I am using AWS, maintaining my own Angular app does not seem advantageous for what I am doing if Power BI is wrapped up in the $1.40-per-day cost.

Here is an article comparing AWS to Azure IoT.

Both solutions cost you money in the end if you're planning on scaling out your product.


This is great! Thank you.

Wow! 8,500 requests per day – that's a lot.

It would run about $0.20, assuming no discounts, to process the webhooks (this does not include the cost of the front end).

Here are my calculations (USD per request):

API GW request                   $0.000003500
API GW data                      $0.000018000
Lambda request                   $0.000000200
Lambda time (500 ms @ 128 MB)    $0.000001040
DynamoDB request                 $0.000000181
DynamoDB storage                 $0.000000050

One request                      $0.000022971
One day                          $0.195249722
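Summing those line items reproduces the totals as a quick sanity check (using the 8,500-requests-per-day rate from earlier in the thread):

```javascript
// Per-request AWS cost line items in USD, copied from the breakdown above.
const perRequest = [
  0.000003500, // API GW request
  0.000018000, // API GW data
  0.000000200, // Lambda request
  0.000001040, // Lambda time (500 ms @ 128 MB)
  0.000000181, // DynamoDB request
  0.000000050, // DynamoDB storage
];

const oneRequest = perRequest.reduce((sum, c) => sum + c, 0); // ~$0.000022971
const oneDay = oneRequest * 8500;                             // ~$0.195 per day

console.log(oneRequest.toFixed(9), oneDay.toFixed(6));
```

Note that storage is the smallest per-request line item; API Gateway data transfer dominates the per-webhook cost.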


@dancingpearl I use Ember for my front end. The charts are done with Chartist. I'm looking at changing it to Elm (because I like a good challenge).


The 8,500 requests per day is just for testing, to see what that costs.

Here is the cost of sending a webhook every 15 seconds, or 4 times per minute, using Azure.


The table database cost is next to nothing.

If your calculations are correct, it looks like AWS may be a better solution, but I’m not sure we’re comparing the two services correctly.

Here is the webhook that is being sent over. How does this compare to the amount of data in your cost breakdown for AWS?

{"data":"{ \"s\":\"wthr\", \"u\":\"%\",\"l\":\"Fishers, IN\",\"m\":\"Humidity\",\"o\":\"My_ORGANIZATION\",\"v\": 50.000000,\"d\":\"Particle Photon\" }","ttl":"60","published_at":"2016-09-25T00:48:48.914Z","coreid":"1a002d000347343339373536","name":"ConnectTheDots"} 
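Note that the `data` field in that payload is itself a JSON-encoded string, so any consumer has to parse twice. A minimal Node sketch:

```javascript
// A Particle webhook body: the outer envelope is JSON, and its "data"
// field is a second, nested JSON string that must be parsed separately.
const raw = '{"data":"{ \\"s\\":\\"wthr\\", \\"u\\":\\"%\\",\\"l\\":\\"Fishers, IN\\",\\"m\\":\\"Humidity\\",\\"o\\":\\"My_ORGANIZATION\\",\\"v\\": 50.000000,\\"d\\":\\"Particle Photon\\" }","ttl":"60","published_at":"2016-09-25T00:48:48.914Z","coreid":"1a002d000347343339373536","name":"ConnectTheDots"}';

const envelope = JSON.parse(raw);          // outer webhook fields
const reading = JSON.parse(envelope.data); // inner sensor payload

console.log(reading.m, reading.v); // Humidity 50
```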

The main reason I was drawn to Azure was the Power BI app and how easy it is to create graphs and charts from the data in their databases. I do see that Power BI can access data from Amazon Redshift now.

I wonder how hard it would be to push the incoming data to Amazon Redshift?

I also wonder what your actual cost would be if you sent the exact same webhook at the same interval to AWS. Do you have the desire or ability to run a test like this so we can compare the costs of the two services?


The trick with AWS is figuring out what the cost will be when you exit the free tier. This makes it a pain to run a test and figure out the cost. The main issue with testing and getting a price is the amount of data you have to push: 1M+ requests to clear the free tier for each service.

The main cost is the data storage. If I use your exact data (266 chars) and store 5,366 requests per day for 3 years, it would cost ~$0.45 per day. If I remove the storage cost and just look at the transaction cost, it's only 2.6 cents per day.

I've never used Redshift, but it looks like you could just replace DynamoDB in my flow with Redshift. It looks like the starting price is $6 per day for 160 GB of storage.


Yeah man, it's tough.

It seems like the more data I push to Azure, the cheaper it gets per webhook received.

So I just keep raising the number of webhooks pushed to see how it affects pricing.

I think I need to get on the phone with somebody at Microsoft and ask them how to optimize this setup.

Thanks for your feedback, though.

From what you have posted it looks like AWS is cheaper.


Thanks. I will take a look at Ember and chartist.

Hi @brandongoode,

Can you share how you got the webhook to call the AWS API Gateway -> Lambda -> DynamoDB flow?


@Mkoder - I put together a quick demo using Node.js and Serverless.
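For reference, a Serverless Framework service wiring an HTTP endpoint to a Lambda function and a DynamoDB table might look roughly like this (a sketch with placeholder service, handler, and table names; not the actual demo):

```yaml
# Illustrative serverless.yml sketch; all names are placeholders.
service: particle-events

provider:
  name: aws
  runtime: nodejs4.3

functions:
  storeEvent:
    handler: handler.store      # exports.store in handler.js
    events:
      - http:                   # creates the API Gateway endpoint
          path: events
          method: post

resources:
  Resources:
    EventsTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: SensorEvents
        AttributeDefinitions:
          - AttributeName: coreid
            AttributeType: S
          - AttributeName: published_at
            AttributeType: S
        KeySchema:
          - AttributeName: coreid
            KeyType: HASH
          - AttributeName: published_at
            KeyType: RANGE
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
```

Deploying a service like this prints the API Gateway URL, which is what you would paste into the Particle webhook configuration.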