I noticed a brand new Azure webhook integration. Nice.
I heard that AWS is larger than its next 10 largest competitors combined. Is this true? If so, wouldn't it make a lot more sense to make a Particle webhook integration for AWS?
Let me speculate on some possible reasons:
I just spent a very long time setting up Particle webhook > AWS API Gateway > AWS DynamoDB. It was kind of difficult. Further, I think DynamoDB is the wrong choice because I am not reading back one datapoint at a time; I am reading back several. This means I have to buy a lot of read capacity, right? Or maybe I can write a bunch of Lambda functions to keep all the data under one key.
I think I am reinventing the wheel. I think AWS DynamoDB and AWS RDS are too general and do not support IoT streaming as easily as Google Cloud Pub/Sub or Azure IoT. Am I right?
Based on my research, Microsoft has a larger worldwide server network than the other companies, having been around for so much longer.
Azure IoT Hub > Azure Stream Analytics > Azure Table storage > Power BI for visualizing your data.
Amazon is a big player too, but for me personally there weren't any key features that made me want to go in their direction.
@harrisonhjones It's nice to hear this Amazon integration is coming, since it will allow us to easily test all the different options available if we want.
I would be interested in an AWS tutorial as well. All I could find online were very old tutorials from when API Gateway and Lambda were much simpler products.
Thanks, I had just tried Azure's IoT Event Hub; it was receiving the messages, but I couldn't get much further than that. I couldn't seem to get the data/message payload into a DB or into Power BI... I just gave up after a few hours of trying and finding endless out-of-date articles!
Any chance @macyankee86 you have a little more info on how to get data out of the Event Hub into SQL? That is the part that is messing with me... even setting up routes/inputs/outputs - it was all really vague...
Happy to hear it! I gave AWS a shot and then ended up sticking with Azure because I was impressed with the potential of Power BI - I've found it very "Microsoft", like you say, but I stayed with Azure nonetheless...
And thank you for the cloudy article.
Hey Cameron, can you describe your issue a bit more? I struggled a bit with Azure at the beginning, but I have data going into a SQL DB in Azure like @macyankee86 describes in his article:
Thank you very much @macyankee86 for this! It worked perfectly (after I removed the "IoT Hub as raw data" bit - not sure what that was doing).
The not-quite-SQL language they are using was the problem.
Really good and thorough instructions - thank you.
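For anyone else who gets stuck on that step: the Stream Analytics query itself can be surprisingly minimal. Something along these lines passes events straight from the IoT Hub input to the SQL/Table output (the bracketed aliases are made up - they have to match whatever names you gave the input and output in the portal):

```sql
-- Minimal pass-through Stream Analytics query (sketch).
-- [iothub-input] and [sql-output] are placeholder aliases; use the
-- names you defined under Inputs/Outputs for the job.
SELECT
    *
INTO
    [sql-output]
FROM
    [iothub-input]
```

Once you know what your webhook/event payload looks like, you can list individual fields (and CAST them) instead of SELECT *.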
I've now got the data going into the table, but it's appending a new row each time; the trick now is to figure out how to replace the existing row rather than create a new one. Not having much luck there... I would normally use REPLACE in SQL, but Stream Analytics doesn't seem to support that capability.
I was thinking about table triggers, but that just complicates things.
Any chance you've had any luck updating existing rows with the latest data vs. adding new rows each time?
We get updates every 15 seconds - we'd very quickly end up with a very big data set across so many devices.
Thank you - very kind also!
Turns out the trouble I was having after looking at @macyankee86's reference was with the SQL-type code.
I managed to get it working, but I'm having a lot of trouble getting it to replace existing rows (we want one row per unique device ID, updated each time, instead of creating new ones). If you have any advice on how to do a replace/update with Stream Analytics' weird form of SQL, I'd greatly appreciate it!
Ohhhh yeah, I was hitting my head against the wall with that part of the integration for a few days. I dread it every time I have to modify that part of the code, like adding a new field.
I think @macyankee86 nailed it in his article - thanks, man, for a nice, working reference!
I needed to add a new row per hit in my application, but would it help in your case to delete the previous record and then add a new one?
I think this would require you to use an Azure Function as the output of the Stream Analytics job, instead of the DB output that you might currently have in place.
The Azure Function (I imagine - I don't know for sure) would receive the data, delete the current record, and create a new one. You could write this function in C#, for instance.
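If you go that route, the statement the function runs could even be a single MERGE instead of a separate delete + insert - roughly like this (table and column names are just placeholders, and @DeviceId / @PublishedAt / @Payload would be parameters the function binds from the incoming event):

```sql
-- Hypothetical upsert: one row per device, updated in place.
MERGE dbo.DeviceLatest AS target
USING (SELECT @DeviceId    AS DeviceId,
              @PublishedAt AS PublishedAt,
              @Payload     AS Payload) AS source
    ON target.DeviceId = source.DeviceId
WHEN MATCHED THEN
    UPDATE SET PublishedAt = source.PublishedAt,
               Payload     = source.Payload
WHEN NOT MATCHED THEN
    INSERT (DeviceId, PublishedAt, Payload)
    VALUES (source.DeviceId, source.PublishedAt, source.Payload);
```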
I think that's what I would try...
Good luck,
Gustavo.
One other idea for your “most recent record” concept (and this is coming from my background in SQL databases):
Create a datetime column for the Published Date of the value (my original query has it as VARCHAR, which is a string field - not sortable).
Create a SQL VIEW that pulls the MAX(Published Date) GROUP BY ID or DEVICE (rough sketch after these steps).
Use the SQL view in any web app or reporting tool.
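Roughly, the view could look like this (table and column names here are just placeholders - adjust to your schema):

```sql
-- Hypothetical view: latest row per device, based on the datetime column.
CREATE VIEW dbo.LatestReadingPerDevice AS
SELECT r.*
FROM dbo.Readings AS r
INNER JOIN (
    SELECT DeviceId, MAX(PublishedAt) AS LatestPublishedAt
    FROM dbo.Readings
    GROUP BY DeviceId
) AS latest
    ON  r.DeviceId    = latest.DeviceId
    AND r.PublishedAt = latest.LatestPublishedAt;
```

Point Power BI or your web app at the view and you always see the most recent reading per device, while the base table keeps the full history.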
Benefits: You save all of the trending data in the database for time-series analysis and don’t have the added complexity of an Azure Function.
Cons: It’s more of a database-driven solution. If you don’t know SQL, this is SQL 102-103-type stuff, so there will be a learning curve.
Last thought: You can always use SQL Server Management Studio to administer the database. It’s local software that connects to the cloud and is better suited to database admin.
Did you find any options for bypassing Stream Analytics? I just did the research on cost and it appears to cost AUD $129/month for one streaming unit (unsure how many of those I need).
The IoT Hub cost also adds up to AUD $1-2 per device per month for a message every 15 seconds. It starts adding up quite quickly for low-cost devices!
I was successfully experimenting with << Azure IoT Hub -> Stream Analytics -> Table Storage -> Power BI >> for about a year - things were working great. Starting in January 2018, however, the cost of running a Stream Analytics job on Azure almost quadrupled ($0.03/hr to $0.11/hr CDN). I believe MS was trying to bring their pricing in line with their competitors.
I was actually visiting these forums in search of an alternative (I understand one can store data in a Google Sheets spreadsheet…).
The key for me would be getting my Particle data stored in a table that I could later easily access/download. Using Azure, I could store my data in Table storage and then use another program, Azure Storage Explorer, to view, sort, and download the data.
However, I wasn’t able to get it working with webhooks - I ran into issues with auth.
I did end up doing some more testing with the IoT Hub and found it worked well.
However, my SQL database storage (for one message per device every 15 seconds - roughly 5,760 rows per device per day, or over 2 million a year) resulted in some pretty big datasets. When I looked into SQL Server storage pricing, it also wasn’t very favourable.
I have been trying MQTT-TLS to get data into AWS. It's working now, but damn does it use up all the flash storage!