Microsoft Partnership?

“The biggest question I have is whether this new beta streaming tile setup requires you to set up and constantly run a separate Stream Analytics function for it to work? As you know, the Stream Analytics costs add up quickly.”

  • Yes, the new streaming tile integration I am talking about is designed specifically for use with an Azure Stream Analytics (ASA) job. This is similar to how there is a streaming tile integration designed specifically for PubNub.

“What do you mean when you say almost-realtime? Like it shows up within 1 min of the data being sent, or a few mins, or 15 mins?”

  • In terms of “almost-realtime”, my dashboard seems to update every 10 minutes, which is in line with how often my Photons are sending information to Azure. Unfortunately, I’m not sure what (if any) lag there would be if you are sending data at a much higher frequency.

“So all the data you have coming into Power BI is being sent in via an Azure Stream Analytics function? I’m trying to keep the Stream Analytics functions to a minimum to keep cost down since I plan on having hundreds of products pushing data in at 1 min intervals. If I can push the live data into the dashboard using the new beta live dataset function, which does not require a separate Stream Analytics function and the cost that comes with running it 24/7, then monthly expenses to keep this alive will be lower by a good chunk.”

  • Yes, I am feeding all the data to Power BI through an ASA. The beta function I am talking about (“streaming dataset”) is just a different way to use Azure to send data to Power BI, and with it comes the streaming tile function.
    I believe you are looking for a way to send data to Power BI directly from the Particle Cloud (bypassing Azure altogether); that is very different, and I think that’s where the confusion is.

“You can see the Stream Analytics costs more than double what the regular Event Hub service costs, and that is why I’m trying to keep them to a minimum”

  • I don’t want to get into the nitty gritty in terms of pricing, but generally from what I have found it’s about $1.00/day per Event Hub and $1.00/day per ASA. So if you are using an Event Hub, an ASA for storing data, and another ASA for sending data to Power BI, your daily cost will be approximately $3.00/day. Depending on how much data you send, you can have many, many devices hooked up and still maintain this same cost. You’ll have to refer to the Azure pricing calculator to see the details of how it all works.
    As you mentioned, the IoT Hub is free if you plan on sending less than 8,000 messages a day, so you can save the $1.00/day associated with the Event Hub. I have yet to set up the IoT Hub, so we’ll see…

The PubNub integration does not require you to run or pay for a Stream Analytics function, and that is why I'm interested in a similar setup for Particle Publish data that is pushed into the Azure IoT Hub.

Thanks for your other replies :thumbsup:

Let us know what you think about the Azure IoT Hub integration when you get a chance to set it up; it's much easier to work with.

@jeiden @zach @bjagpal I am talking with Sirui Sun who is with the Microsoft Power BI team.

I asked her about getting data from a Particle Publish to Azure IoT Hub to be usable as a Streaming Dataset in Power BI so the data updates in real time as soon as Azure IoT Hub receives the Particle Publish data. This is something they have already done with PubNub.

Here is the reply I received from Microsoft regarding doing this with Particle Publish events. Can you look it over and reply, please? Without this, data updates only happen every 10-15 minutes.

The easiest way to get data into a Power BI streaming dataset right now is to call our public REST APIs. That is actually what Azure Stream Analytics does today.

Roughly speaking, the steps would be to use the REST API to create a new streaming dataset (defaultMode: pushStreaming flag), and then use the REST APIs to start pushing rows of data into that dataset.

Could Particle make those REST calls to Power BI?
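For reference, the first of those REST calls (creating the streaming dataset) would look roughly like this. The dataset name, table name, and columns below are placeholders I made up for illustration, both calls need an Azure AD bearer token, and the exact casing of the defaultMode and dataType values may differ, so check the Power BI REST API docs before relying on this:

POST https://api.powerbi.com/v1.0/myorg/datasets
Authorization: Bearer <ACCESS_TOKEN>

{
  "name": "ParticleTelemetry",
  "defaultMode": "pushStreaming",
  "tables": [
    {
      "name": "RealTimeData",
      "columns": [
        { "name": "Outlet", "dataType": "Double" },
        { "name": "Inlet", "dataType": "Double" },
        { "name": "timecreated", "dataType": "DateTime" }
      ]
    }
  ]
}

Rows would then be pushed to the /datasets/<DATASET_ID>/tables/RealTimeData/rows endpoint of that same API.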

Hey @RWB,

You can try to work with Power BI’s REST API (https://msdn.microsoft.com/en-us/library/mt147898.aspx) and create a Particle webhook to bypass Azure and go straight to your desired destination. Realistically, we are not planning on building a 1st class integration with Power BI anytime in the near future.
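As a rough, untested sketch of that idea (the dataset ID, table name, bearer token, and field names below are all placeholders, and it assumes the published payload is JSON so its keys can be referenced as template variables), a webhook that pushes rows straight into a Power BI push dataset might look something like this:

{
  "event": "SendLoonIoT",
  "url": "https://api.powerbi.com/v1.0/myorg/datasets/<DATASET_ID>/tables/RealTimeData/rows",
  "requestType": "POST",
  "headers": {
    "Authorization": "Bearer <ACCESS_TOKEN>"
  },
  "json": {
    "rows": [
      {
        "Outlet": "{{Outlet}}",
        "Inlet": "{{Inlet}}",
        "timecreated": "{{PARTICLE_PUBLISHED_AT}}"
      }
    ]
  },
  "mydevices": true
}

The catch is the Authorization header: the bearer token expires, so you would need some way to refresh it from time to time.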

Thanks,

Jeff

2 Likes

@jeiden I now have the Azure IoT Hub incoming data being stored in an Azure Table Database after tweaking some of the settings I used to do the same thing with regular Azure Event Hubs.

Here is a picture of the Azure Storage Explorer PC application showing the data created for every Particle Publish event we sent to Azure IoT Hub. You can see all the data we are passing in is under the “data” column.
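For anyone trying to reproduce this, the Stream Analytics piece in between can be as simple as a pass-through query; the input and output names below are just whatever aliases you give them when you set up the ASA job, not anything special:

SELECT
    *
INTO
    tablestorageoutput
FROM
    iothubinput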

The Azure IoT Hub integration’s webhook is set up so the data is sent formatted like this:

If I wanted to add separate “data1”:, “data2”:, “data3”: fields to the Particle Publish event, the same way we can create custom JSON response templates when creating webhooks, is that possible with the Azure IoT Hub integration as well?

If I added these extra “data1”: lines under the current “data”: “70”, line, then the info we send under data1 would be stored in a separate column in the Azure Table Database.

I’m not sure if I’m clear enough but let me know if not and I’ll say it another way.
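To make the request concrete, the kind of template I have in mind would look something like this, borrowing the mustache-style variables from the webhook docs (the data1/data2/data3 keys are just placeholders for whatever values the firmware publishes, and it assumes the published payload is JSON so its keys can be referenced by name):

{
  "deviceid": "{{PARTICLE_DEVICE_ID}}",
  "published_at": "{{PARTICLE_PUBLISHED_AT}}",
  "data1": "{{data1}}",
  "data2": "{{data2}}",
  "data3": "{{data3}}"
}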

I believe I have the same question. How can we send the data to the IoT Hub so that each data element gets its own column when pushed into an Azure storage table? (This isn’t just necessary for table storage, but for any Stream Analytics query we implement.)

Here is my firmware:

float f = Temp1;
float g = Temp3;
char payload[255];

// Escape the inner quotes so the payload is valid JSON
snprintf(payload, sizeof(payload), "{ \"Outlet\": %f, \"Inlet\": %f }", g, f);
Particle.publish("SendLoonIoT", payload, PRIVATE);

And here is the resulting storage table:

@bjagpal @jeiden

We just need the ability to add our own custom JSON formatting to the data that is sent over to Azure. Right now the Azure IoT Hub integration just gives us the “data”: field to send data over to the Azure IoT Hub, and that whole field gets entered into the database as one column. We need to be able to customize the formatting of the data being sent to Azure IoT Hub so we can have separate columns for each separate data variable we send over.

Here is how they formatted the webhook JSON data using regular Event Hubs in the Azure Weather Shield Example. This created separate columns for Subject, Unit of Measure, Measure Name, Value, Organization, Display Name, Location, Timecreated, and guid.

{
  "event": "ConnectTheDots",
  "url": "https://connectthedotsex-ns.servicebus.windows.net/ehdevices/messages",
  "requestType": "POST",
  "json": {
    "subject": "{{s}}",
    "unitofmeasure": "{{u}}",
    "measurename": "{{m}}",
    "value": "{{v}}",
    "organization": "{{o}}",
    "displayname": "{{d}}",
    "location": "{{l}}",
    "timecreated": "{{SPARK_PUBLISHED_AT}}",
    "guid": "{{SPARK_CORE_ID}}"
  },
  "azure_sas_token": {
    "key_name": "D1",
    "key": "mBLQGWxSkRHg7f2eRCLonHUpNS+DY0iPHclxjf7Cmvk="
  },
  "mydevices": true
}

Here is the link to the Particle Weather Shield to Azure Event Hub tutorial: https://www.hackster.io/pjdecarlo/hands-on-lab-particle-photon-weather-station-in-azure-d89a03

When I look at the columns being sent over to the Azure Table Database, I see the following data in the database. You can see the table columns are named based on the JSON formatting of the data we’re sending to Azure IoT Hub.

I think we just need the ability to customize the JSON Template just like we can with webhooks.

@jeiden Does this make sense?

1 Like

Makes total sense; this is good feedback. Exposing the ability to manipulate the JSON sent using custom templating like webhooks would certainly be useful. Will add it to the list of improvements to consider.

@jeiden @Zach Not trying to come across as pushy or ungrateful, but this customizable JSON feature is critical to getting data into Azure in a format that is most usable by the other services like databases & dashboards.

I currently can’t find a way to separate all the different data in the single Data column into separate fields so it can be used properly.

I’m totally at your mercy when it comes to moving forward, so let me know if you guys think this is something that could be added soon or not. If there is some sort of inelegant workaround for now, I’m down for that also.

We’re so close to having a perfect Microsoft Azure IoT ingestion solution! This is what I’m going to build my product’s professional backend on.

Once the customizable JSON is ready, I will put together a video showing how to set up the Stream Analytics and database storage so you can archive all incoming data as desired.

Let me know if there is anything I can help with.

@jeiden @bjagpal @BenVollmer

I had an idea last night that may allow us to split the single Data payload into separate columns of data and then push that data to Azure databases or Power BI.

I’m wondering if we can just add code to the Azure Stream Analytics function to take the incoming Data payload sent from the Particle Publish and parse the values into their own columns so they can be pushed into Azure databases & Power BI as needed.

I have no idea how to code this but I’m pretty sure we can do it.

Here is info on the Azure Stream Analytics Query Language: https://msdn.microsoft.com/en-us/library/azure/dn834998.aspx

If we could get a working template to do this then we would not need to make any changes to the current Particle Azure IoT Hub integration which I’m sure @jeiden wouldn’t mind :smiley:
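To give a sense of what I mean, something along these lines might do it, assuming the payload arrives at the Stream Analytics input as nested JSON rather than one flat string. GetRecordPropertyValue is part of the Stream Analytics query language, but the input/output aliases and field names below are placeholders I made up:

SELECT
    deviceid,
    GetRecordPropertyValue(data, 'Outlet') AS Outlet,
    GetRecordPropertyValue(data, 'Inlet') AS Inlet
INTO
    tablestorageoutput
FROM
    iothubinput

If the data field comes through as one plain string instead of a nested record, I don’t think the query language alone can split it apart, which is why the custom JSON template would still be the cleaner fix.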

1 Like

This is an interesting idea. But I’m wondering if the data payload doesn’t need to be ‘broken down’ prior to its arrival at Azure. Isn’t this what the webhook from the weather shield example did? I have written some Stream Analytics to do parsing, but I’m far from being an expert on writing queries. Maybe it can be done.

Based on what I read about Stream Analytics, the more complicated the query, the greater the ‘compute’ power it uses, which ultimately can affect how much that ASA will cost you. I have no idea, however, whether simple parsing like this would even count towards this compute metric. So far all of my queries have cost me nothing except the flat hourly rate associated with the ASA.

Hey @RWB,

That’s great to hear. I do think your custom JSON template idea is on my short list to add to the beta. It would be very valuable to provide this flexibility. Stay tuned for updates, but you should definitely try to unblock yourself without it.

Thanks,

Jeff

1 Like

@bjagpal The more I look at the Stream Analytics query examples, the more confident I get that we should be able to do this easily once we have a template/example to work with.

Do you mind sharing your parsing example code?

If I can’t figure it out after looking at all the examples Azure provides, I’m just going to have to hit the search engines until I find somebody else who has done something similar.

For now see if this article provides what we need to strip out the data into separate data fields. https://blogs.msdn.microsoft.com/streamanalytics/2015/08/27/working-with-complex-data-types-in-azure-stream-analytics/

I also worry about increasing the Stream Analytics cost with this extra processing, but we will not know if it raises the cost until we try it.

I’m not sure it will help, but here is my ASA query for taking certain pieces of data from a specific device (“Boiler_955”), casting the values as integers, and then sending them to a Power BI output:

SELECT
    CAST(timecreated AS datetime) AS timecreated,
    CAST(Inlet AS bigint) AS Inlet,
    CAST(Out AS bigint) AS Out,
    CAST(Ex AS bigint) AS Ex,
    CAST(Rm AS bigint) AS Rm,
    Name
INTO
    toPowerBI
FROM
    devicesinput TIMESTAMP BY timecreated
WHERE
    Name = 'Boiler_955'

This query works with the weather shield example, where the data payload was separated into individual elements using a webhook, prior to being pushed to Azure.

1 Like

Hey @RWB,

I’m working on adding the ability to configure custom JSON when enabling the Azure IoT Hub integration. One thing that I’m realizing is that our documentation on custom templating when using custom JSON with integrations is very thin. Currently, the only docs that exist are here: https://docs.particle.io/guide/tools-and-features/webhooks/#webhook-variables

It includes references to {{PARTICLE_DEVICE_ID}}, {{PARTICLE_EVENT_NAME}}, {{PARTICLE_EVENT_VALUE}}, and {{PARTICLE_PUBLISHED_AT}}, but does not instruct users on how they can actually use dynamic keys/values sent in the firmware’s published event payload, as you mentioned in your example.

If you have some extra cycles, I think your expertise on this would be very useful for the community. The link to edit these docs is here.

1 Like

@jeiden I’m just now slowly learning how to work with webhooks and format them myself, so I’m not an expert by any means.

Once you have the custom JSON added to the Azure integration, what I will do is make a tutorial on how to use the custom JSON template to format the data that gets sent out when you trigger the Azure integration, and I think this also applies to regular webhooks.

Then you or we can add that to the Particle docs you linked to where it applies, and you can clean it up as needed.

It sounds like you’re not too far off from having the customizable JSON feature added to Azure, so I’ll hold off on spending more time trying to figure this out via Azure Stream Analytics query functions, which I’m sure would be the most expensive solution to this problem.

Hey @RWB,

Wanted to update you that today we released the ability to customize the JSON that gets sent to IoT Hub. When creating the integration, click on “Advanced Settings,” then put your desired JSON payload in the code editor. You can use all the normal templating variables discussed here. Happy building!
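As a quick illustration (not an official recipe), a template that splits the Outlet/Inlet payload from earlier in the thread into separate fields might look roughly like this, assuming the published payload is JSON so its keys are available as template variables:

{
  "deviceid": "{{PARTICLE_DEVICE_ID}}",
  "timecreated": "{{PARTICLE_PUBLISHED_AT}}",
  "Outlet": "{{Outlet}}",
  "Inlet": "{{Inlet}}"
}

With each value in its own key, Table storage or a Stream Analytics query can give each reading its own column without any string parsing.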

4 Likes

@jeiden Thank you and @particle2 very much for making this happen quickly!

Once I have this working with Azure > Stream Analytics > Table Database Storage I’ll create a guide to show others how to do the same thing as promised.

I’m excited this is all functional now :thumbsup:

That’s awesome @jeiden. This makes things WAY better.

1 Like

Just wondering if the Particle products will be added to Azure’s “Certified-for-IoT Devices” list…
https://catalog.azureiotsuite.com/

Not sure if the list is even up-to-date.