Why Google Cloud and Azure but no AWS?

I noticed a brand new Azure webhook integration. Nice.

I heard that AWS is larger than its next 10 largest competitors combined. Is this true? If so, wouldn't it make a lot more sense to make a Particle webhook integration for AWS?

Let me speculate on the causes:
I just spent a very long time setting up Particle webhook > AWS API Gateway > AWS DynamoDB. It was kind of difficult. Further, I think DynamoDB is the wrong choice, because I am not reading back one data point at a time; I am reading back several. That means I have to buy a lot of read capacity, right? Or maybe I can write a bunch of Lambda functions to keep all the data under one key.

I think I am reinventing the wheel. I think AWS DynamoDB and AWS RDS are too general and do not support IoT streaming as easily as Google Cloud Pub/Sub or Azure IoT. Am I right?

Stay tuned :slight_smile:

Based on my research, Microsoft has a larger worldwide server network than the other companies, having been around for so much longer.

Azure IoT Hub > Azure Stream Analytics > Azure Table Storage > Power BI for visualizing your data.
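The Stream Analytics step in the middle is just a SQL-like query. A minimal sketch (the input/output aliases and field names here are made up; use whatever you configured in the portal):

```sql
-- Route incoming IoT Hub messages straight into the Table Storage output.
-- [iothub-input] and [table-output] are the alias names given to the
-- job's input and output in the Azure portal (examples only).
SELECT
    deviceId,
    data,
    EventEnqueuedUtcTime AS publishedAt
INTO
    [table-output]
FROM
    [iothub-input]
```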

Amazon is a big player too, but for me personally there weren't any key features that made me want to go in their direction.

@harrisonhjones It's nice to hear this Amazon integration is coming, since it will allow us to easily test all the different options available if we want.

Any news on this? :slight_smile:

I would be interested in an AWS tutorial as well. All I could find online were very old tutorials from when API Gateway and Lambda were much easier products to use.

I recently wrote about my experience getting Particle data into the big-3 cloud platforms. GCP didn’t work, AWS was close, Azure was super easy. Feel free to read it here: https://www.gordonstrodel.com/btbc/2018/1/29/cloudy-with-a-chance-of-iot-data

AWS is my preferred cloud vendor for compute and other resources. I’d like to say the same for IoT. However, Azure has made it much, much simpler!

And, Power BI is so…Microsoft. Throw that data into a SQL server and build a data viz in Tableau! Check out this example: https://public.tableau.com/profile/gordon.strodel#!/vizhome/IOTDatafromAzure/Dashboard

Enjoy!
Gordon


Thanks, I had just tried Azure's IoT Hub / Event Hub, and it was receiving the messages. But I couldn't get much further than that: I couldn't seem to get the data/message payload into a DB or into Power BI... I just gave up after a few hours of trying and finding endless out-of-date articles!

Any chance, @macyankee86, you have a little more info on how you got data out of Event Hub into SQL? That is the part that is tripping me up... even setting up routes/inputs/outputs was all really vague...

Would appreciate any pointers you may have :slight_smile:

Happy to hear! I gave AWS a shot and then ended up sticking with Azure because I was impressed with the potential of Power BI. I've found it very Microsoft, like you say, but I stayed with Azure nonetheless...
And thank you for the cloudy article.

Hey Cameron, can you describe your issue a bit more? I fought with Azure to some extent at the beginning, but I now have data going into a SQL DB in Azure, like @macyankee86 describes in his article:


(no Tableau for me for now)
Looking forward to helping you fix your issue...

Gustavo.


Hi @Cameron -
Yes, happy to provide more details. Give me a bit to write it up and I’ll post back here.
Stay tuned!
Gordon


@Cameron and @gusgonnet

Check this out for more details: https://www.gordonstrodel.com/btbc/2018/2/16/azure-saving-iot-data-into-sql-server

Let me know if you have any questions!
Gordon


Thank you very much @macyankee86 for this! It worked perfectly (after I removed the 'IoT Hub as raw data' bit - not sure what that was doing :smile:).
The not-quite-SQL language they are using was the problem.

Really good and thorough instructions - thank you.

I've now got the data going into the table, but it is appended as a new row each time. The trick now is to figure out how to replace the existing row rather than create a new one. I'm not having much luck there... I would normally use a REPLACE in SQL, but Stream Analytics doesn't seem to support that capability.

I was thinking table triggers, but that just complicates things.

Any chance you have had any luck with updating existing rows with the latest data vs. new rows each time?

We get updates every 15 seconds, so we'd very quickly end up with a very big data set across so many devices.
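For reference, this is roughly the one-row-per-device upsert I'd write by hand in T-SQL if Stream Analytics allowed it (the table and column names are just placeholders for whatever your job writes):

```sql
-- Hypothetical upsert: keep exactly one row per device, updated in place.
-- @DeviceId, @Payload and @PublishedAt stand in for the incoming message fields.
MERGE dbo.DeviceReadings AS target
USING (SELECT @DeviceId    AS DeviceId,
              @Payload     AS Payload,
              @PublishedAt AS PublishedAt) AS source
    ON target.DeviceId = source.DeviceId
WHEN MATCHED THEN
    UPDATE SET Payload = source.Payload,
               PublishedAt = source.PublishedAt
WHEN NOT MATCHED THEN
    INSERT (DeviceId, Payload, PublishedAt)
    VALUES (source.DeviceId, source.Payload, source.PublishedAt);
```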


Thank you - very kind also! :slight_smile:
Turns out the trouble I was having after looking at @macyankee86's reference was the SQL-like query code.

I managed to get it working, but I'm having a lot of trouble getting it to replace existing rows (we want one row per unique device ID, updated each time, instead of creating new ones). If you have any advice on how to do a replace/update with Stream Analytics' weird form of SQL, I'd greatly appreciate it!


Ohhhh yeah, I was hitting my head against the wall with that part of the integration for a few days. I dread it every time I have to modify that part of the code, like adding a new field.

I think @macyankee86 nailed it in his article - thanks, man, for a nice, working reference!

I needed to add a new row per hit in my application, but would it help in your case to delete the previous record and then add a new one?
I think this would require you to use an Azure Function as the output of the Stream Analytics job, instead of the DB output you might currently have in place.
The Azure Function (I imagine - I do not know for sure) would receive the data, delete the current record, and create a new one. You could write this function in C#, for instance.
I think that's what I would try...
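On the SQL side, I imagine the function would run something like this per message (just a sketch, all table/column names invented):

```sql
-- Delete-then-insert, wrapped in a transaction so a reader never sees
-- the device's row missing between the two statements.
BEGIN TRANSACTION;

DELETE FROM dbo.DeviceReadings
WHERE DeviceId = @DeviceId;

INSERT INTO dbo.DeviceReadings (DeviceId, Payload, PublishedAt)
VALUES (@DeviceId, @Payload, @PublishedAt);

COMMIT TRANSACTION;
```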
Good luck,
Gustavo.


Good thinking, I will give functions a shot :slight_smile:


@gusgonnet - You are welcome for the article. Glad it was helpful. To be fair, I followed a lot from this guy: http://gunnarpeipman.com/2016/02/beer-iot-using-stream-analytics-to-save-data-from-iot-hub-to-sql-database/

One other idea for your "most recent record" concept (and this is coming from my background in SQL databases):

  1. Create a DATETIME column for the published date of the value (my original query has it as VARCHAR, which is a string field and not sortable).
  2. Create a SQL VIEW that pulls the MAX(Published Date) GROUP BY ID or device (see the sketch after this list).
  3. Use the SQL view in any web app or reporting.
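A minimal sketch of that view, assuming a table with DeviceId, Payload, and PublishedDate columns (adjust the names to your schema):

```sql
-- "Latest reading per device" view: find each device's max published
-- date, then join back to the table to pick out the newest row.
CREATE VIEW dbo.LatestDeviceReadings AS
SELECT r.DeviceId, r.Payload, r.PublishedDate
FROM dbo.DeviceReadings AS r
INNER JOIN (
    SELECT DeviceId, MAX(PublishedDate) AS LatestDate
    FROM dbo.DeviceReadings
    GROUP BY DeviceId
) AS latest
    ON r.DeviceId = latest.DeviceId
   AND r.PublishedDate = latest.LatestDate;
```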

Benefits: You save all of the trending data in the database for time-series analysis and don't have the added complexity of an Azure Function.

Cons: It's more of a database-driven solution. If you don't know SQL, this is SQL 102-103 type stuff, so there will be a learning curve.

Last thought: You can always use SQL Server Management Studio to administer the database. It's local software that connects to the cloud and is better suited for database admin.

Good luck! Respond with any questions.
Gordon


I agree with this being a benefit.
Thanks!
Gustavo.


Thanks,

Having a look into this now.

Did you find any options for bypassing Stream Analytics? I just did the research on cost, and it appears to cost AU$129/month to use a streaming unit (I'm unsure how many of those I need).

The IoT Hub costs also add up, to AU$1-2 per device per month for a message per device every 15 seconds. It starts adding up quite quickly for low-cost devices!

That's a very good question. No, I did not find alternatives, but I'd like to find out more if anyone has any.
Thank you,
Gustavo.


I just found this article online: https://www.codeproject.com/Articles/1169531/Sending-events-from-Azure-Event-Hub-to-Azure-SQL-D

I've not had the chance to review the code or try it myself, but it looks really promising.

Based on my early knowledge of Azure, IoT Hub should be functionally similar to Event Hub.

I don’t know how the Azure Function prices out compared with Stream Analytics. That needs to be explored further.

If you have a chance to try it, please reply here with what you find!
Gordon

I had been successfully experimenting with Azure IoT Hub -> Stream Analytics -> Table Storage -> Power BI for about a year, and things were working great. Starting in January 2018, however, the cost of running a Stream Analytics job on Azure almost quadrupled ($0.03/hr to $0.11/hr CDN). I believe MS was trying to bring their pricing in line with their competitors.

https://social.msdn.microsoft.com/Forums/en-US/5132271d-b338-4a86-88fd-7b5a845c9f9a/new-standard-streaming-pricing-model?forum=AzureStreamAnalytics

I was actually visiting these forums in search of an alternative (I understand one can store data in a Google Sheets spreadsheet…).

The key for me would be getting my Particle data stored in a table that I could later easily access/download. Using Azure, I could store my data in Table Storage and then use another program, Azure Storage Explorer, to view, sort, and download the data.


The pricing is much more suitable!

However, I wasn't able to get it working with webhooks - I ran into issues with auth. :frowning:
I did end up doing some more testing with the IoT Hub and found it was good.

However, my SQL database storage (for one message per device every 15 seconds) resulted in some pretty big datasets, and when I looked into SQL Server storage pricing, it also wasn't very favourable.

I have been trying MQTT-TLS to get data into AWS. It's working now, but damn does it use up all the flash storage!