Microsoft Partnership?

That's fine for what I'm trying to do. The biggest question I have is whether this new beta streaming tile setup requires you to set up and constantly run a separate Stream Analytics function for it to work. As you know, the Stream Analytics costs add up quickly.

I prefer to use Particle functions to send alerts, so not having those functions available via the live tiles is not a problem.

What do you mean when you say almost-realtime? Like it shows up within 1 min of the data being sent, or a few mins, or 15 mins?

So all the data you have coming into Power BI is being sent in via an Azure Stream Analytics function? I'm trying to keep the Stream Analytics functions to a minimum to keep costs down, since I plan on having hundreds of products pushing data in at 1 min intervals.

If I can push the live data into the dashboard using the new beta live dataset feature, which does not require a separate Stream Analytics function (and the cost that comes with running one 24/7), then the monthly expenses to keep this alive will be a good chunk lower.

That's good to know, I didn't know that. :smiley:

From what I have seen, the Azure IoT Hub is FREE if you do not need to connect more than 250 different Photons or Electrons and you do not need to send more than 8,000 publishes per day. Every day you can send data via Particle.publish up to 8,000 times for FREE.

For $50 per month you can register unlimited Particle Photons or Electrons and send up to 400,000 publishes per day / 24-hour period.
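To put that in perspective for my planned setup (rough math, assuming each device publishes once per minute): one device sends about 1,440 publishes per day, so only about 5 devices fit under the free 8,000/day cap, while roughly 275 devices at that rate (around 396,000 publishes/day) would still fit inside the $50/month 400,000/day tier.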

Per my test of publishing regularly to a regular Event Hub and using Stream Analytics to push that data into an Azure Table Database, here are my expenses.

I used a Photon to publish 250 bytes of data to a regular Event Hub every 10 seconds for 48 hours straight, and here is what it cost me:

Basic Event Hubs = $1.08 for 16,431 Particle Publish events.

Stream Analytics = $2.23 for 16,431 Particle Publish events.

Total = $3.31

You can see the Stream Analytics costs more than double what the regular Event Hub service costs, and that is why I'm trying to keep them to a minimum :wink:

Either way, I'm glad to have you around @bjagpal, since it's nice to be able to talk with somebody who is also using this service :smiley:

"The biggest question I have is whether this new beta streaming tile setup requires you to set up and constantly run a separate Stream Analytics function for it to work. As you know, the Stream Analytics costs add up quickly."

  • Yes, the new streaming tile integration that I am talking about is designed specifically for use with Azure Stream Analytics (ASA). This is similar to how there is a streaming tile integration designed specifically for PubNub.

"What do you mean when you say almost-realtime? Like it shows up within 1 min of the data being sent, or a few mins, or 15 mins?"

  • In terms of "almost-realtime", my dashboard seems to update every 10 minutes, which is in line with how often my Photons are sending information to Azure. Unfortunately, I'm not sure what lag (if any) there would be if you are sending data at a much higher frequency.

"So all the data you have coming into Power BI is being sent in via an Azure Stream Analytics function? I'm trying to keep the Stream Analytics functions to a minimum to keep costs down, since I plan on having hundreds of products pushing data in at 1 min intervals. If I can push the live data into the dashboard using the new beta live dataset feature, which does not require a separate Stream Analytics function (and the cost that comes with running one 24/7), then the monthly expenses to keep this alive will be a good chunk lower."

  • Yes, I am feeding all the data to Power BI through an ASA. The beta feature I am talking about ("streaming dataset") is just a different way to use Azure to send data to Power BI, and with it comes the streaming tile function.
    I believe you are looking for a way to send data to Power BI directly from the Particle Cloud (bypassing Azure altogether); that is very different, and I think that's where things are getting confused.

"You can see the Stream Analytics costs more than double what the regular Event Hub service costs, and that is why I'm trying to keep them to a minimum"

  • Don't want to get into the nitty-gritty in terms of pricing, but generally, from what I have found, it's about $1.00/day per event hub and $1.00/day per ASA. So if you are using an event hub, one ASA for storing data, and another ASA for sending data to Power BI, your cost will be approximately $3.00/day. Depending on how much data you send, you can have many, many devices hooked up and still maintain this same cost. You'll have to refer to the Azure pricing calculator to see the details of how it all works.
    As you mentioned, the IoT Hub is free if you plan on sending fewer than 8,000 messages a day, so you can save the $1.00/day associated with the event hub. I have yet to set up the IoT Hub, so we'll see…
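    (Rough arithmetic: $1.00/day for the event hub plus $1.00/day for each of the two ASAs comes to about $3.00/day, or roughly $90/month; swapping the event hub for the free IoT Hub tier would bring that down to about $2.00/day, or roughly $60/month.)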

The PubNub integration does not require you to run or pay for a Stream Analytics function, and that is why I'm interested in a similar setup for Particle Publish data that is pushed into an Azure IoT Hub.

Thanks for your other replies :thumbsup:

Let us know what you think about the Azure IoT Hub integration when you get a chance to set it up; it's much easier to work with.

@jeiden @zach @bjagpal I am talking with Sirui Sun, who is with the Microsoft Power BI team.

I asked her about getting data from a Particle Publish to Azure IoT Hub to be usable as a Streaming Dataset in Power BI so the data updates in real time as soon as Azure IoT Hub receives the Particle Publish data. This is something they have already done with PubNub.

Here is the reply I received from Microsoft regarding doing this with Particle Publish events. Can you look it over and reply, please? Without this, data updates only happen every 10-15 mins.

The easiest way to get data into a Power BI streaming dataset right now is to call our public REST APIs. That is actually what Azure Stream Analytics does today.

Roughly speaking, the steps would be to use the REST API to create a new streaming dataset (with the defaultMode: pushStreaming flag), and then use the REST APIs to start pushing rows of data into that dataset.

Could Particle make those REST calls to Power BI?

Hey @RWB,

You can try to work with Power BI's REST API (https://msdn.microsoft.com/en-us/library/mt147898.aspx) and create a Particle webhook to bypass Azure and go straight to your desired destination. Realistically, we are not planning on building a first-class integration with Power BI anytime in the near future.
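As a rough sketch only (I have not tested this; the event name, the Power BI push URL, and the field names below are placeholders you would replace with the values from your own streaming dataset's API info), a direct webhook might look something like:

{
  "event": "SendPowerBI",
  "url": "https://api.powerbi.com/beta/<groupId>/datasets/<datasetId>/rows?key=<push-key>",
  "requestType": "POST",
  "json": {
    "rows": [
      {
        "value": "{{PARTICLE_EVENT_VALUE}}",
        "timecreated": "{{PARTICLE_PUBLISHED_AT}}",
        "deviceid": "{{PARTICLE_DEVICE_ID}}"
      }
    ]
  },
  "mydevices": true
}

You would need to check whether your dataset's push endpoint expects the {"rows": [...]} wrapper or a bare array of row objects, and whether numeric fields have to be sent unquoted.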

Thanks,

Jeff

2 Likes

@jeiden I now have the incoming Azure IoT Hub data being stored in an Azure Table Database after tweaking some settings that I used to do the same thing with regular Azure Event Hubs.

Here is a picture of the Azure Storage Explorer PC application showing the data created for every Particle Publish event we sent to Azure IoT Hub. You can see all the data we are passing in is under the "data" column.

The Azure IoT Hub integration's webhook is set up so the data is sent formatted like this:

If I wanted to add separate "data1":, "data2":, "data3": fields to the Particle Publish event, the same way we can create custom JSON response templates when creating webhooks, is that possible with the Azure IoT Hub integration also?

If I added these extra "data1": lines under the current "data": "70" line, then the info we send under data1 would be stored in a separate column in the Azure Table Database.

I'm not sure if I'm being clear enough, but let me know if not and I'll say it another way.
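Just to illustrate what I mean (the data1/data2/data3 names and their values here are made up for the example), the payload arriving at Azure IoT Hub would look something like this, so each key gets its own column in the Azure Table Database:

{
  "data": "70",
  "data1": "68",
  "data2": "72",
  "data3": "75"
}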

I believe I have the same question. How can we send the data to the IoT Hub so that each data element gets its own column when pushed into an Azure storage table? (This isn't just necessary for table storage, but for any Stream Analytics job we implement.)

Here is my firmware

float f = Temp1;
float g = Temp3;
char payload[255];

// Build a JSON payload with named fields; the inner quotes must be escaped
snprintf(payload, sizeof(payload), "{ \"Outlet\": %f, \"Inlet\": %f }", g, f);
Particle.publish("SendLoonIoT", payload, PRIVATE);

And here is the resulting storage table:

@bjagpal @jeiden

We just need the ability to add our own custom JSON formatting to the data that is sent over to Azure. Right now the Azure IoT Hub integration just gives us the "data": field to send data over to the Azure IoT Hub, and that whole field gets entered into the database as one column. We need to be able to customize the formatting of the data being sent to Azure IoT Hub so we can have separate columns for each separate data variable we send over.

Here is how they formatted the webhook JSON data using regular Event Hubs in the Azure Weather Shield Example. This created separate columns for Subject, Unit of Measure, Measure Name, Value, Organization, Display Name, Location, Timecreated, and guid.

{
  "event": "ConnectTheDots",
  "url": "https://connectthedotsex-ns.servicebus.windows.net/ehdevices/messages",
  "requestType": "POST",
  "json": {
    "subject": "{{s}}",
    "unitofmeasure": "{{u}}",
    "measurename": "{{m}}",
    "value": "{{v}}",
    "organization": "{{o}}",
    "displayname": "{{d}}",
    "location": "{{l}}",
    "timecreated": "{{SPARK_PUBLISHED_AT}}",
    "guid": "{{SPARK_CORE_ID}}"
  },
  "azure_sas_token": {
    "key_name": "D1",
    "key": "mBLQGWxSkRHg7f2eRCLonHUpNS+DY0iPHclxjf7Cmvk="
  },
  "mydevices": true
}

Here is the link to the Particle Weather Shield to Azure Event Hub tutorial: https://www.hackster.io/pjdecarlo/hands-on-lab-particle-photon-weather-station-in-azure-d89a03

When I look at the columns being sent over to the Azure Table Database, they include the following data. You can see the table columns are named based on the JSON formatting of the data we're sending to Azure IoT Hub.

I think we just need the ability to customize the JSON Template just like we can with webhooks.

@jeiden Does this make sense?

1 Like

Makes total sense, and this is good feedback. Exposing the ability to manipulate the JSON sent, using custom templating like webhooks, would certainly be useful. I will add it to the list of improvements to consider.

@jeiden @Zach Not trying to come across as pushy or ungrateful, but this customizable JSON feature is critical to getting data into Azure in a format that is most usable by the other services like databases & dashboards.

I currently can't find a way to separate all the different data in the single Data column into separate fields so it can be used properly.

I'm totally at your mercy when it comes to moving forward, so let me know if you guys think this is something that could be added soon or not. If there is some sort of inelegant workaround for now, I'm down for that also.

We're so close to having a perfect Microsoft Azure IoT ingestion solution! This is what I'm going to build my product's professional backend on.

Once the customizable JSON is ready, I will put together a video showing how to set up the Stream Analytics and database storage so you can archive all incoming data as desired.

Let me know if there is anything I can help with.

@jeiden @bjagpal @BenVollmer

I had an idea last night that may allow us to separate the data in the single Data payload into separate columns of data and then push that data to Azure Databases or Power BI.

I'm wondering if we can just add code to the Azure Stream Analytics function to separate the incoming data payload sent from the Particle Publish and parse the values into their own columns so they can be pushed into Azure databases & Power BI as needed.

I have no idea how to code this, but I'm pretty sure we can do it.

Here is info on the Azure Stream Analytics Query Language: https://msdn.microsoft.com/en-us/library/azure/dn834998.aspx

If we could get a working template to do this, then we would not need to make any changes to the current Particle Azure IoT Hub integration, which I'm sure @jeiden wouldn't mind :smiley:

1 Like

This is an interesting idea. But I'm wondering if the data payload doesn't need to be 'broken down' prior to its arrival at Azure. Isn't this what the webhook from the weather shield example did? I have written some Stream Analytics to do parsing, but I'm far from being an expert at writing queries. Maybe it can be done.

Based on what I have read about Stream Analytics, the more complicated the query, the greater the 'compute' power it uses, which ultimately can affect how much that ASA will cost you. I have no idea, however, whether simple parsing like this would even count towards this compute metric. So far all of my queries have cost me nothing except the flat hourly rate associated with the ASA.

Hey @RWB,

That's great to hear. I do think your custom JSON template idea is on my short list to add to the beta. It would be very valuable to provide this flexibility. Stay tuned for updates, but you should definitely try to unblock yourself without it.

Thanks,

Jeff

1 Like

@bjagpal The more I look at the Stream Analytics query examples, the more confident I get that we should be able to do this easily once we have a template/example to work with.

Do you mind sharing your parsing example code?

I'm just going to have to hit the search engines until I find somebody else who has done something similar if I can't figure it out after looking at all the examples Azure provides.

For now see if this article provides what we need to strip out the data into separate data fields. https://blogs.msdn.microsoft.com/streamanalytics/2015/08/27/working-with-complex-data-types-in-azure-stream-analytics/

I also worry about increasing the Stream Analytics cost with this extra processing, but we will not know if it raises the cost until we try it.

I'm not sure it will help, but here is my ASA query for taking certain pieces of data from a specific device ("Boiler_955"), casting the values as integers, and then sending them to a Power BI output:

SELECT
CAST (timecreated AS datetime) as timecreated,
CAST (Inlet AS bigint) as Inlet,
CAST (Out AS bigint) as Out,
CAST (Ex AS bigint) as Ex,
CAST (Rm AS bigint) as Rm,
Name

INTO
toPowerBI

FROM
devicesinput TIMESTAMP by timecreated

WHERE
Name = 'Boiler_955'

This query works with the weather shield example, where the data payload was separated into individual elements using a webhook, prior to being pushed to Azure.

1 Like

Hey @RWB,

I'm working on adding the ability to configure custom JSON when enabling the Azure IoT Hub integration. One thing that I'm realizing is that our documentation on custom templating when using custom JSON with integrations is very thin. Currently, the only docs that exist are here: https://docs.particle.io/guide/tools-and-features/webhooks/#webhook-variables

It includes references to {{PARTICLE_DEVICE_ID}}, {{PARTICLE_EVENT_NAME}}, {{PARTICLE_EVENT_VALUE}}, and {{PARTICLE_PUBLISHED_AT}}, but it does not instruct users on how they can actually use dynamic key/values from the JSON data published by firmware, as you mentioned in your example.
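To make the dynamic key/value idea concrete (an untested sketch; it assumes the event data is published as valid JSON, like the Outlet/Inlet payload shared earlier in this thread), a publish of { "Outlet": 71.5, "Inlet": 68.2 } should let a custom template reference those keys directly, something like:

{
  "Outlet": "{{Outlet}}",
  "Inlet": "{{Inlet}}",
  "deviceid": "{{PARTICLE_DEVICE_ID}}",
  "timecreated": "{{PARTICLE_PUBLISHED_AT}}"
}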

If you have some extra cycles, I think your expertise on this would be very useful for the community. The link to edit these docs is here.

1 Like

@jeiden I'm just now slowly learning how to work with webhooks and format them myself, so I'm not an expert by any means.

Once you have the custom JSON added to the Azure integration, what I will do is make a tutorial on how to use the custom JSON template to format the data that gets sent out when you trigger the Azure integration, and I think this also applies to regular webhooks.

Then you or we can add that to the Particle docs you linked to where it applies, and you can clean it up as needed.

It sounds like you're not too far off from having the customizable JSON feature added to Azure, so I'll hold off on spending more time trying to figure this out via Azure Stream Analytics query functions, which I'm sure would be the most expensive solution to this problem.

Hey @RWB,

Wanted to update you that today we released the ability to customize the JSON that gets sent to IoT Hub. When creating the integration, click on "Advanced Settings," then put your desired JSON payload in the code editor. You can use all the normal templating variables discussed here. Happy building!

4 Likes

@jeiden Thank you and @particle2 very much for making this happen quickly!

Once I have this working with Azure > Stream Analytics > Table Database Storage, I'll create a guide to show others how to do the same thing, as promised.

I'm excited this is all functional now :thumbsup:

That's awesome @jeiden. This makes things WAY better.

1 Like