Microsoft Power BI + Particle = Awesome Custom Dashboards!

@bjagpal Hi, I also completed the tutorial and have got my data passing through Stream Analytics, confirmed by checking the ‘output’ blob storage.

However, I can’t get the data into Power BI. Did you succeed with this?

In Power BI I can see the streaming dataset, but when I create a tile or report there is nothing there… I have four fields to work with (displayname, measurename, timecreated, value) but can’t figure out how to use them.

Any help would be awesome. cheers

@Grillo That service is still in preview at Microsoft, so you will have to ask to have the function added to your Power BI account to get it working.

Email this guy: Sirui Sun

If you push your data into an Azure Table storage database, then you will be able to access that database from Power BI right now without problems.

@Grillo, as I mentioned a few posts above, there are two different ways to get data from Azure Stream Analytics (ASA) into Power BI (PBI). The first method is to simply set up PBI as an output on your ASA job and just pass the data through. This is the method outlined at the following links:

1 -
2 -

The second method is to set up a ‘streaming dataset’. However, this function is in private preview right now, so unless you signed up for that private preview you shouldn’t actually see any ‘streaming datasets’; you will just see a dataset displayed below the “Datasets” heading.

I’m going to assume that you have set up PBI according to method 1 (although my proposed solution should work in both cases).

Now, when you click on the name of your dataset in PBI, it will display your four fields (displayname, measurename, timecreated, value) under the “Fields” heading. Beside your “value” field, you should see a sigma sign. This indicates that the field is numeric. If you do not see the sigma sign, then PBI interprets the data as characters and can’t really do anything with it.

To correct this, you need to explicitly type-cast your data from within your ASA query. Here is an example:

SELECT
    displayname,
    measurename,
    CAST(timecreated AS datetime) AS timecreated,
    CAST(value AS bigint) AS value
INTO
    toPowerBI
FROM
    IoTinput TIMESTAMP BY timecreated

So what I’m doing here is casting my “timecreated” field as a datetime type, casting my “value” field as an integer, and passing through the “displayname” and “measurename” fields without any casting since they are of the character type and can stay that way.
“toPowerBI” is the name of my ASA output, and “IoTinput” is the name of my ASA input.
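For intuition only, here is a rough Python sketch of what those casts amount to for a single event. The sample event and its timestamp format are invented; this is just meant to show the string-to-type conversions that make PBI treat “value” as numeric.

```python
from datetime import datetime

def cast_event(event):
    """Mimic the ASA casts: leave the character fields alone,
    parse timecreated as a datetime and value as an integer."""
    return {
        "displayname": event["displayname"],   # passed through as-is
        "measurename": event["measurename"],   # passed through as-is
        "timecreated": datetime.strptime(event["timecreated"],
                                         "%Y-%m-%dT%H:%M:%S.%fZ"),
        "value": int(event["value"]),          # numeric now, so PBI shows the sigma
    }

# Invented sample event, shaped like the webhook JSON later in this thread
sample = {
    "displayname": "photon-1",
    "measurename": "temperature",
    "timecreated": "2017-01-15T12:34:56.789Z",
    "value": "23",
}
print(cast_event(sample)["value"] + 1)  # arithmetic works once value is an int
```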

So in summary, I think you just need to adjust your ASA query. Let me know if this helps.

@bjagpal FYI - Dave announced that they are working on changing the way we can format webhooks soon, which will allow us to send data via a webhook directly to a webhook tile, without the need to pay for or rely upon Azure Stream Analytics.

This can save people some money, since Stream Analytics costs can get high depending on what you’re doing.

Interesting. I presume this has to do with sending data directly to PBI using its API? I remember you were looking into that…cool.

Yea, direct from the webhook to Power BI using the API endpoint, so no Stream Analytics needed. :smiley:
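For anyone curious what that direct push looks like, here is a minimal Python sketch against the Power BI REST “rows” endpoint. The push URL below is a placeholder (you get the real one when you create a push/streaming dataset in PBI), and the field names simply mirror the ones used earlier in this thread.

```python
import json
from urllib import request

# Placeholder: copy the real push URL from your Power BI dataset's API info
PUSH_URL = "https://api.powerbi.com/beta/<workspace>/datasets/<id>/rows?key=<key>"

def build_rows(displayname, measurename, value, timecreated):
    """Shape one sensor reading into the row list the REST endpoint expects."""
    return [{
        "displayname": displayname,
        "measurename": measurename,
        "value": value,
        "timecreated": timecreated,
    }]

def push(rows):
    """POST the rows as JSON; will only succeed once PUSH_URL is real."""
    req = request.Request(
        PUSH_URL,
        data=json.dumps(rows).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return request.urlopen(req)

rows = build_rows("photon-1", "temperature", 23, "2017-01-15T12:34:56.789Z")
# push(rows)  # uncomment once PUSH_URL points at your own dataset
```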

Thanks @RWB !

Actually, I did sign up for the preview, so I can access the streaming dataset in PBI.

I think the problem is I am not proficient enough to do the query. I am following @bjagpal’s instructions below. Will post results.


And it works! Thanks very much! I got the data plotting out perfectly with your instructions.

Sweet! Post a screenshot of your custom dashboard when you’re finished creating it :smiley:

Also post your query if you can just so we have 2 examples to work with.

Here is a screenshot of my PBI phone app:

In the report I can easily create different tiles by using measurename as a filter, but in the dashboard I don’t have this option. This is why I am grouping the 3 sensors on one tile. I’d welcome any suggestions!

For the query I just used, word for word, the query from @bjagpal:
SELECT
    displayname,
    measurename,
    CAST(timecreated AS datetime) AS timecreated,
    CAST(value AS bigint) AS value
INTO
    toPowerBI
FROM
    IoTinput TIMESTAMP BY timecreated

Since I followed the ConnectTheDots tutorial, it works perfectly. The only change I had to make was to add a custom JSON template to the new IoT Hub webhook in the Particle cloud:

{
  "subject": "{{s}}",
  "unitofmeasure": "{{u}}",
  "measurename": "{{m}}",
  "value": "{{v}}",
  "organization": "{{o}}",
  "displayname": "{{d}}",
  "location": "{{l}}",
  "timecreated": "{{SPARK_PUBLISHED_AT}}",
  "guid": "{{SPARK_CORE_ID}}"
}
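For anyone wondering what Particle does with that template: each {{x}} placeholder is replaced with the matching field from the published event before the webhook fires. Here is a rough Python sketch of that substitution; the sample event data is invented, and this is only an illustration, not how the Particle cloud is actually implemented.

```python
import re

# A subset of the webhook template above
TEMPLATE = {
    "measurename": "{{m}}",
    "value": "{{v}}",
    "displayname": "{{d}}",
    "timecreated": "{{SPARK_PUBLISHED_AT}}",
}

def render(template, event):
    """Replace every {{key}} placeholder with the matching event field."""
    def fill(text):
        return re.sub(r"\{\{(\w+)\}\}",
                      lambda m: str(event.get(m.group(1), "")), text)
    return {k: fill(v) for k, v in template.items()}

# Invented sample event, as if published from a Photon
event = {"m": "temperature", "v": "23", "d": "photon-1",
         "SPARK_PUBLISHED_AT": "2017-01-15T12:34:56.789Z"}
print(render(TEMPLATE, event)["value"])  # -> 23
```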


I would take a look at all the custom visualizations that are available for download and use with Power BI desktop.

They offer plenty of video tutorials on their blog and forum so be sure to take some time to search through it all if you really want to get fancy.

Hi @RWB - please can you share the code you used to get the Photon to send data to PubNub?

I can get the data into Power BI through IoT Hub/Stream Analytics, but the query I used (from @bjagpal) doesn’t allow me to have individual tiles (‘value’ is shared between different sensors).


Just for some background info: I tried using the new “streaming dataset” function that is currently in private preview, and after playing around with it a bit I got rid of it and went back to the original non-streaming dataset. There were several reasons for this: the streaming tiles are only able to show the last hour of trending, and you are also not able to set up alerts for the values shown on streaming tiles.

Anyways, the query that I gave you should be fine. Rather than creating streaming tiles to show your data, you may want to consider creating reports and then pinning them to your dashboard. I’m not sure if you are familiar with making reports in Power BI, but when you do, you can filter the incoming data packets using another field, e.g. “displayname” or “measurename”. So even though the value field is common to all of your data packets, you can distinguish between them by using another field and route them to their own reports.
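The idea of routing packets by a second field can be sketched in a few lines of Python; this is only an illustration of the filtering logic, not anything Power BI actually runs, and the packets below are invented.

```python
from collections import defaultdict

def route_by(packets, field):
    """Group data packets by one field, the way a PBI report filter would."""
    groups = defaultdict(list)
    for p in packets:
        groups[p[field]].append(p["value"])
    return dict(groups)

# Invented packets: multiple sensors all sharing the common "value" field
packets = [
    {"measurename": "temperature", "value": 23},
    {"measurename": "humidity",    "value": 40},
    {"measurename": "temperature", "value": 24},
]
print(route_by(packets, "measurename"))
# -> {'temperature': [23, 24], 'humidity': [40]}
```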

There is a tutorial out there somewhere that shows how to set up simple reports. If you haven’t already experimented with reports, I think you will find that they are much more powerful and more flexible than the simple streaming tiles.


Thanks @bjagpal !

Actually, I followed the recent posts in How to pass unescaped JSON to Azure IoTHub? and modified the custom JSON in the webhook integration. Now I can pass each value individually to SA. All working!

Thanks for the help.


@bjagpal @grillo

I followed the tutorial

and then

to get the data into Power BI. I have sensor readings, schedules, and notifications that I want to display on the Power BI dashboard. I figured out how to add custom fields and show them in Power BI by customizing the JSON in the integration setup. I have 3 different event names at present, each with its own custom JSON. I can see the published events in the iothub-explorer utility, but only one of those events posts into the Power BI dataset. Any ideas on what I am missing?


It’s been a while since I attempted to use the Azure IoT integration. It was quite hard work as I recall, since the platform is immense, with multiple ways to skin every cat. Do you need to create a Stream Analytics job for each event or something?

I know it’s quite tedious, but it’s for a client who explicitly wants to work with Power BI. It sounds odd to create a new job for every event, since I did not put in the name of the event that does show correctly.

I quit using Power BI after finding an alternative, due to how much easier it was for me to work with. Power BI was difficult to work with, but very flexible.

It is pretty complicated, but I have no choice.
