Microsoft Partnership?

@bjagpal I think we need to ask when they expect to have the new Particle + Azure IoT integration up and running.

@Dave @jeiden Do you have any more info on when you expect to have the new Azure integration ready to test?

Azure is a very viable and powerful solution for the Particle Platform.

Hi folks,

I'd like to ask – can you describe your use case and why using Azure Event Hubs is not sufficient? What about Azure IoT Hub makes it that much more compelling to you?

Thanks,

Jeff

@jeiden From what I have seen, Azure IoT Hub makes it easier to add/register new IoT devices (Photons & Electrons) via its web interface, compared to doing it manually. This would make it much easier to manage a fleet of devices publishing to Azure.

Can you tell us, from your point of view, what you see as the pros and cons of using Azure IoT Hub vs. setting this up via Azure Event Hubs?

What has to be done on your end to make it easy for us to work with Azure IOT Hub?

From what I have seen, Azure IoT Hub is easier for beginners to get up and running, plus it provides a way to visualize the incoming data live and over time.

We need a new, updated tutorial showing the path of least resistance for getting a Photon or Electron sending data to Azure Event Hubs at a minimum.

We're working with the Microsoft team currently to build out a first-class integration (like we recently launched with Google Cloud Platform, see www.particle.io/google). When we first announced our Microsoft partnership, IoT Suite didn't exist, which is why we worked with Event Hubs, and we also didn't have our integrations interface, which we now have. We're working now to figure out which Azure service(s) to integrate with, and feedback from the community would be super helpful.

The big thing that would help us is feedback on what you want to do with Azure. Data storage? Visualization? Machine learning?

I see three major options, with some notes below on the advantages and disadvantages of each approach:

Integrate with Microsoft Azure IoT Suite. Advantages: leverages some of the cool stuff that Microsoft is building into Azure IoT Suite. Sounds nice, because it's IoT. Direct mapping of Particle devices to Azure IoT devices. Disadvantages: more expensive.

Integrate with Microsoft Event Hubs. Advantages: gets your data into Azure; if you want to use Azure for their other tools (not the IoT Suite stuff specifically, but their data storage/visualization tools), it would do all the same things as the IoT Suite integration. Cheaper. Disadvantages: you don't get the features built into IoT Suite.

Integrate with other Azure services directly, like SQL Database. Advantages: less setup for you to do if we integrate directly with the Azure tool you want to use. Cheapest. Disadvantages: only works for you if we pick the Azure tool that you want to use. We'd likely have to build multiple integrations, which will take longer to roll out.

Feedback would be wonderful!

@zach @jeiden @Dave

Thanks for the update and confirming this is being actively worked on.

For people who don't need to work with more than 100 devices and want a simple way to visualize the data live and over time, the Azure IoT Suite looks best suited. It's free up to a certain point.

For people who want to create a product using the Electron or Photon and sell more than 100 units, the Azure Event Hubs service is a robust solution that can handle up to 1 million incoming events per second.

Here is how I want to use Azure + Particle's Photon & Electron products together to make a killer new product.

  1. My product will transmit system status info over Wi-Fi or cellular every 1–10 minutes.

  2. Azure Event Hubs receives the system status message and holds it in queue for an Azure Stream Analytics Job.

  3. I have created an Azure Stream Analytics job to take the incoming data in JSON format and push it into an Azure Table Database. I have one Azure Table Database that holds all the sensor data received, and each product has a unique serial number added to each table entry so it's easy to search.

  4. Personally I have no need or desire to use the Azure IoT Suite online web interface features because I prefer to use Microsoft Power BI to visualize all the data I have pushed into the Azure Table Database.

Microsoft Power BI has a free desktop application that allows you to create just about any custom data visualization you could dream of, using a familiar, Office-like application that PC users are comfortable with. Microsoft Power BI makes it easy to pull data from Azure databases, which makes things easier on you when it comes to creating visualizations to make sense of the data you collected.

Microsoft Power BI has teamed up with PubNub to allow their clients' incoming data streams to be viewed on a custom Power BI dashboard in real time as they come in. This real-time data view on the Power BI dashboards is a key feature we need to get working with Particle devices, in my opinion.

Here is the article explaining how to set up a PubNub stream to show up live on a Microsoft Power BI dashboard.

The Power BI team is working on adding a way to view incoming live data, just like you can with PubNub, by creating another Stream Analytics function that pushes the incoming data to a live, auto-updating dashboard widget that looks the same as the PubNub example above. Normally the quickest automatic update rate for data in the Power BI dashboards is every 15 minutes, or by refreshing the web page manually.

In this recent forum post @bjagpal was also looking for a way to show live data in a Power BI dashboard tile and created a post over on the Microsoft Power BI forum asking them about how to accomplish this.

See his post via this link: https://community.powerbi.com/t5/Integrations-with-Files-and/stream-analytics-as-streaming-dataset-source/m-p/68634/highlight/true#M5372

Here is the post on the forum where @bjagpal was talking about his same real-time dashboard tile issue. Microsoft Power BI + Particle = Awesome Custom Dashboards!

Once you create a custom dashboard, you can share it with the world in a slew of different ways provided by the Microsoft Power BI team. The Power BI team is large and very active, with constant improvements to the platform, which makes me feel good as somebody who wants to use this as a reliable backend for my business and the clients who will use our products.

I have run some tests to see what it costs to send data through Azure Event Hubs > Azure Stream Analytics > Azure Table Database, and it seems like the more data you send per hour, the less each message costs. I guess this has to do with how they bill you for cloud compute time per hour. I would love to understand better how the cloud compute time works and how to minimize the service cost.

Here are the key things that I think you guys can do for those of us who just want a reliable and scalable way to store data transmitted by Photons & Electrons into a database.

One - Make it as easy as possible to add/authenticate new Photons & Electrons to the Azure Event Hub network. From what I have seen, Azure IoT Hub provides the easiest way to add/delete/authorize/deauthorize the Particle devices that will be sending data to Azure Event Hubs. This is not as easy to do using the Azure Event Hub method.

Two - Work with Microsoft Power BI to allow data being sent to Azure Event Hubs to also show up in a real-time Power BI dashboard tile, the same as PubNub is able to do. This will allow us to create custom dashboards that show real-time data being sent from Particle devices. With the team Microsoft Power BI has, this is something they could get working within a week with ease, and it's already in the works anyway.

Three - Based on my testing, the costs to send data to the Azure services break down like this, from highest to lowest:

Highest Cost Service = Stream Analytics - Pushing Incoming Data into Azure Table Database.

Second Highest Cost Service = Azure Event Hubs - Receives all incoming Data without missing anything.

Third (much lower cost) service = Azure Table Database - Holds all incoming data for however long you set it to hold it. I set it to hold data for 90 days and delete any data that is older than that.

It would be really nice if there was some cheaper way to take the incoming data received by Azure Event Hubs and push it into some Azure database that is also easy to access via Microsoft Power BI, so we can create dashboards to visualize the collected data.

It seems that Azure Stream Analytics is just overly expensive for simply pushing the incoming data into a database, but maybe that's just the way it has to be if you want a service that can scale to handle up to 1 million incoming data streams per second. If you can eliminate or greatly reduce the cost that Azure Stream Analytics charges for pushing data into a database, that would be a really good thing for us small business owners and our clients who will have to cover these operating costs.

Sorry for the long-winded response, but you asked for some feedback, and I have some to give after playing around with this combo for a few weeks.

I have not found a better backend solution than Azure & Power BI for data visualization when it comes to providing a small business with an easy way to create a custom dashboard layout to fit perfectly with your custom product design and aesthetics.

I'm looking forward to seeing how this all progresses :smiley:

Let me know if I can provide any more info.


Thanks for the super comprehensive reply! :slight_smile:

Thanks,
David

I should not have said that I would like to move things to Azure's IoT Hub; to be honest, I am not familiar enough with the differences between IoT Hub and Event Hubs to make a determination as to which route would be most suitable. I appreciate that you guys have provided some details, and some pros and cons of each approach.

My specific application (as it pertains to the cloud) requires DATA STORAGE, and VISUALIZATION.

In the interest of furthering the discussion on which integration(s) would work best for the Particle community, here is a brief description of my application and how it's currently implemented:

  • I have Photons at different locations that periodically take sensor measurements.
  • These sensor measurements are sent to an Azure Event Hub.
  • I have two Streaming Analytics Services set up. One of them stores the data in Azure Table Storage. The other one forwards the data to Microsoft Power BI.
  • I have a dashboard set up in Power BI that provides visualization of the data via a website/mobile app. Alerts are also very easy to set up here.

As I mentioned before, this setup using Azure Event Hubs seems to be working fine now that I have everything in place. I don't want/need to detail all the steps required to get to this point (i.e. webhooks, ASA queries, datasets, etc.), but if an integration with Azure's IoT Hub would make it quicker and easier to manage devices and store/visualize incoming data, I would be interested. At a comparable cost, of course.

The other relatively minor(!) issue I was dealing with was moving my setup from Azure's classic portal to their new portal. This is not urgent and I probably won't bother until I hear more on the Particle + MS integration. I'm looking forward to it.

@bjagpal Did you end up getting the Real Time data tiles working in Power BI yet using a Stream Analytics function?

Well, I'm not sure if you would consider it real time, but my dashboard on Power BI does update every couple of minutes. I have it set to display the last reading, as well as a 24-hour graph. This uses a Stream Analytics service in Azure (the link to the procedure is included in a couple of my previous posts).

There is another way to get data into Power BI using an integration they call a "streaming dataset". They are working on getting this to work with Azure, but it will still require a Stream Analytics service to be running. The streaming dataset method makes initialization and the visualizations easier to set up in BI (I'm sure there are other benefits as well).

@zach @jeiden @Dave

If you guys can use your connections at Microsoft to push forward the release of the "Streaming Dataset" feature to support real-time data streams from Particle devices, that would be great.

We basically want the same setup they provided for PubNub streams, so it's easy to add real-time data tiles to your custom Power BI dashboards.

Hi @RWB,

That's interesting. I've also noticed that the stream processors on Azure are pretty expensive, but so useful for connecting bits together inside Azure. We'll chat internally about the possibility of an integration that looks like a streaming dataset to Power BI, thanks!

Thanks,
David

@Dave The thing is, with the PubNub setup there is no need for Azure Stream Analytics, which makes it FREE. Having to add a Stream Analytics function to push data to a live Power BI dashboard would easily cost you $1 a day per data stream, which isn't really cost effective. I was paying about $1 a day just to have a Stream Analytics function push 5,000 received messages into an Azure Table Database, which adds up really quick.

If there were a way to push the incoming data to a database without needing Stream Analytics, that would cut the cost of the Azure service in half, since Azure Stream Analytics is usually 50% or more of the overall cost.


Thanks for the awesome insights @RWB and @bjagpal. This helps give us a better sense of your use cases, so we can be sure to design the right integration. My follow-up question is around devices in Azure.

@RWB you said:

Make it as easy as possible to add/authenticate new Photons & Electrons to the Azure Event Hub network. From what I have seen, Azure IoT Hub provides the easiest way to add/delete/authorize/deauthorize the Particle devices that will be sending data to Azure Event Hubs. This is not as easy to do using the Azure Event Hub method.

@bjagpal you said:

if an integration with Azure's IoT Hub would make it quicker and easier to manage devices and store/visualize incoming data, I would be interested.

My question is this: is a device identity in Azure something of value to you? One main difference between Azure IoT Hub and Azure Event Hubs is the presence of devices in Azure. From their documentation:

Every IoT hub has a device identity registry that stores information about the devices that are permitted to connect to the hub. Before a device can connect to a hub, there must be an entry for that device in the hub's device identity registry.

At a high level, the device identity registry is a REST-capable collection of device identity resources. When you add an entry to the registry, IoT Hub creates a set of per-device resources in the service such as the queue that contains in-flight cloud-to-device messages.

This would theoretically allow us to mirror Particle devices in Azure IoT Hub and publish event data to each individual device's stream, instead of to a general event stream. With a general event stream, correlation between an event and a device is still possible at a later stage, as each event is recorded with the device_id of the Particle device that was responsible for the event.

Thanks for helping us dig in to the details here!

P.S. for more details on the differences between Event Hubs and IoT Hub, you can check out: https://azure.microsoft.com/en-us/documentation/articles/iot-hub-compare-event-hubs/


Can you help me understand this better, please?

You have to register each Photon or Electron that will be sending data to Azure Event Hubs before it will work properly. You also have to register Photon or Electron devices when you're using Azure IoT Hub. Azure IoT Hub just has a nicer web-based GUI for adding the Photon or Electron devices to the device identity registry.

If you guys can make registering new Photon & Electron devices with Event Hubs easier, that would be a good move.

I now see that Azure IoT Hub allows you to activate or deny access down to a specific device. Right now with Event Hubs, I guess they are all using the same security token to allow them to send to Event Hubs.

I figured I can just use Particle Cloud functions to send messages to the Particle Photon or Electron devices, since I do not plan on using the Azure IoT Hub service but Event Hubs instead. Can't Particle Cloud functions be used to communicate the same way you can within Azure IoT Hub?

Hey @RWB,

My understanding is that Event Hubs is more of a general event ingestor, whereas IoT Hub allows you to define individual devices and publish to device-specific event streams. That's the "per-device identity" piece you referenced above. With Event Hubs, there would be no concept of a device in Azure, or need to register a Photon or Electron.

If we were to go with an IoT Hub integration, Particle would proactively register devices and publish to device-specific streams in Azure IoT for you, without the need to manually create them. The bi-directional communication (device-to-cloud and cloud-to-device) you mentioned at the end of your post is an interesting one. Realistically, we'd likely launch the integration with just device-to-cloud messaging (getting events from Particle devices into Azure IoT Hub). In the future, however, it could be interesting to expose Particle Cloud functions, variables, etc., to easily allow you to interact with Particle devices directly from Azure IoT Hub (cloud-to-device).

Does this jive with your understanding, or is there something I'm missing?

That makes sense. I can see the benefit of individual device identification and having the ability to reject devices if you desire for some reason.

That sounds great.

I think that is a big step forward and to keep things simple that should probably be your first step.

The only other thing I would want you guys to work on is the ability to view live data streams coming from the Particle devices in Microsoft Power BI, the same as they have done for PubNub, since that gives us flexibility when it comes to visualizing the data we are bringing into Azure.

I agree 100% on that.

No that all makes sense to me.

Azure provides all the back-end that is needed to create a reliable and scalable solution for an IoT-connected product. It will be nice to have this integrated into Particle so it's much easier to get set up and working for those of us with less experience.

"Is a device identity in Azure something of value to you?"

Based on what you have described, this is not terribly important. As you mentioned, correlation between an event and a device can always be done in other ways, using the device_id parameter. However, if IoT Hub can set up individual event streams in a simple, almost automated fashion, then that would be a convenience and would spare us from having to do more processing/filtering downstream.

Also, just to be clear, I am not looking for a direct connection between Particle and Power BI. It would be a nice feature, but I am OK with streaming data from my Photon -> Particle Cloud -> Azure Cloud -> Power BI. It does cost approximately $1.00/day for the necessary Azure Stream Analytics service, but depending on your application and data requirements, you can route data from many devices (hundreds?) and still only pay the same $1.00/day (perhaps marginally more).

Finally, sending information from the cloud -> device would be cool, but one can do that from outside of Azure as well. I think you guys are on the right track by focusing on a device -> cloud integration first.

Good luck.

It seems like you understand their billing for Event Hubs and Stream Analytics better than I do.

I have seen that increasing the transmit frequency has little impact on the cost, to the point where it looks like the more data you send, the cheaper each message gets.

Can you sum up, in an easy-to-understand way, how their compute unit billing for the Event Hubs & Stream Analytics services works? I have a hard time wrapping my head around this.

Hi there, unfortunately I don't know all the ins and outs of Azure's billing policy; I can only really comment on what I have experienced so far. Also note that I am in Canada, so the numbers I am using are in CDN funds (you can knock approximately 25% off my numbers to get an approximate USD cost).

It seems that each active event hub costs approximately $0.036/hour, or $0.86/day. The hourly cost will remain the same regardless of how much data is ingested. It's almost like a flat rate. I've found that the cost remains the same whether I have 1 device online or 5. Now, I believe this 'flat rate' policy is valid as long as the number of messages being sent is below a certain monthly limit. I believe this limit is in the hundreds of thousands, if not millions. I will have to confirm this if/when I scale up.

It seems that each Stream Analytics service costs approximately $0.037/hour, or about $0.88/day. The billing is based on the hours that the Stream Analytics service is running, so even if you are not sending any data to Azure, you will still be charged $0.037/hour. If you have two Stream Analytics services running, the cost will be $0.037 x 2 = $0.074/hour.

Here is an example:
I had one event hub active to accept incoming data. I had one stream service set up to send incoming data to table storage, and another Stream Analytics service to forward the incoming data directly to Power BI. My hourly cost was therefore: $0.036 + $0.037 + $0.037 = $0.11/hour, or $2.64/day.

This is pricey if you only have 1 device connected to Azure; however, if you have multiple devices, the daily cost remains the same - to a certain extent. I haven't scaled up enough to see if/when the pricing starts to increase. Keep in mind that if you are storing data in table storage, there is a cost associated with that as well, but it has been negligible so far. As a final note, I am currently only sending 1000 messages a day into this system, from a total of 5 devices.

@bjagpal Thanks for the info.

The base rates are there whether you use the service or not, while you have the Event Hubs and Stream Analytics services running and online. My bill this month, while not using it, is $20, because the services are up and running and ready to accept incoming data.

I have also done some pricing tests, at different send rates, in the USA.

The base Event Hub and Stream Analytics rates do increase the more you use them, from what I can tell.

Unfortunately, I just upgraded my laptop and forgot to back up the notes I had saved in OneNote that contained the results and costs of sending data at different rates to Azure. I was sending 8000 events per day, and the cost does go up vs. sending only a couple thousand events per day, but the price per message received goes down at the higher send rates, so it's hard to figure out exactly what is going on.