Microsoft Power BI + Particle = Awesome Custom Dashboards!

Sweet! Post a screenshot of your custom dashboard when you're finished creating it :smiley:

Also post your query if you can, just so we have two examples to work with.

Here is a screenshot of my PBI phone app:
<img src="//discourse-cloud-file-uploads.s3.dualstack.us-west-2.amazonaws.com/business7/uploads/particle/original/2X/c/c4e11953525386ac01ad12d87b5c09a8b302996d.png" width="281" height="500">

In the report I can easily create different tiles by using measurename as a filter, but in the dashboard I don't have this option, which is why I am grouping the three sensors on one tile. I'd welcome any suggestions!

For the query I just used, word for word, the query from @bjagpal:
SELECT
    CAST(timecreated AS datetime) AS timecreated,
    CAST(value AS bigint) AS value,
    displayname,
    measurename
INTO
    toPowerBI
FROM
    IoTinput TIMESTAMP BY timecreated

Since I followed the ConnectTheDots tutorial, it works perfectly. The only change I had to make was to add a custom JSON template to the new IoT Hub webhook in the Particle cloud:

{
  "subject": "{{s}}",
  "unitofmeasure": "{{u}}",
  "measurename": "{{m}}",
  "value": "{{v}}",
  "organization": "{{o}}",
  "displayname": "{{d}}",
  "location": "{{l}}",
  "timecreated": "{{SPARK_PUBLISHED_AT}}",
  "guid": "{{SPARK_CORE_ID}}"
}

Sweet!

I would take a look at all the custom visualizations that are available for download and use with Power BI desktop.

https://app.powerbi.com/visuals/

They offer plenty of video tutorials on their blog and forum so be sure to take some time to search through it all if you really want to get fancy.

https://powerbi.microsoft.com/en-us/custom-visuals/

Hi @RWB, can you please share the code you used to get the Photon to send data to PubNub?

I can get the data into Power BI through IoT Hub / Stream Analytics, but the query I used (from @bjagpal) doesn't let me have individual tiles ('value' is shared between the different sensors).

Cheers

@grillo
Just for some background: I tried the new "streaming dataset" function that is currently in private preview, and after playing around with it a bit I got rid of it and went back to the original non-streaming dataset. There were a couple of reasons for this: streaming tiles can only show the last hour of trending, and you cannot set up alerts on the values shown in streaming tiles.

Anyway, the query that I gave you should be fine. Rather than creating streaming tiles to show your data, you may want to consider creating reports and then pinning them to your dashboard. I'm not sure if you are familiar with making reports in Power BI, but when you build one you can filter the incoming data packets on another field, e.g. "displayname" or "measurename". So even though the value field is common to all of your data packets, you can distinguish between them using another field and route them to their own reports.

There is a tutorial out there somewhere that shows how to set up simple reports. If you haven't already experimented with reports, I think you will find that they are much more powerful and more flexible than the simple streaming tiles.


Thanks @bjagpal !

Actually, I followed the recent posts in How to pass unescaped JSON to Azure IoTHub? and modified the custom JSON in the webhook integration. Now I can pass each value individually to Stream Analytics. All working!

Thanks for the help!


@bjagpal @grillo

I followed the tutorial
https://docs.particle.io/tutorials/integrations/azure-iot-hub/#example-use-cases

and then

to get the data into Power BI. I have sensor readings, schedules, and notifications that I want to display on the Power BI dashboard. I figured out how to add custom fields and show them in Power BI by customizing the JSON in the integration setup. I currently have three different event names, each with its own custom JSON. I can see the publish events in the IoT Hub explorer utility, but only one of those events posts into the Power BI dataset. Any ideas on what I am missing?


It's been a while since I attempted to use the Azure IoT integration. It was quite hard work, as I recall, since the platform is immense, with multiple ways to skin every cat. Do you need to create a Stream Analytics job for each event, or something like that?

I know it's quite tedious, but it's for a client who explicitly wants to work with Power BI. It sounds odd to create a new job for every event, since I never specified the name of the one event that does show up correctly.

I quit using Power BI after finding Losant.com, due to how much easier it was for me to work with. Power BI was difficult to work with, but very flexible.

It is pretty complicated, but I have no choice.
