Microsoft Power BI + Particle = Awesome Custom Dashboards!

In my search for a highly customizable dashboard for displaying data harvested from connected Particle Photon & Electron devices, I have come across many different solutions, but none of them provide the flexibility Microsoft Power BI does :smile:

Thanks to @gusgonnet for bringing this service to my attention :smiley:

You can pull in data from tons of different sources and use that data in custom dashboards tailored to your particular needs. There are over 60 different platforms you can currently pull data from, and the list is growing quickly since Microsoft has a team of over 500 employees working on Power BI.

Here is a list of the services you can currently pull data from:

Another huge plus from my point of view is that you or your clients can quickly see the custom dashboards you create on any of the mobile platforms by just downloading the iOS, Android, or Windows Mobile app and logging in to their account. You do not need to create a custom iOS, Android, or Windows Mobile app or deal with keeping it up to date, which is a major time sink.

Once you create a dashboard, you can easily share it online via a web page or blog post using their simple share button, which generates iframe code you can embed. That is very nice! You can then track how that link is being used via their built-in analytics.

Today Gustavo posted a link to PubNub's latest blog post, which announces that they are working with Microsoft to make it easy to get data from a PubNub account to show up in a Power BI dashboard so you can use and view it.

That blog post is here: https://www.pubnub.com/blog/2016-08-11-microsoft-power-bi-integrates-pubnub-for-realtime-data-visualization/

In that article, there is a link to a tutorial from Microsoft showing how to add a PubNub data stream to Microsoft Power BI so you can display it however you want within a Power BI dashboard.

Here is the link to the tutorial that Microsoft BI created:

In this tutorial I learned that you can use the Power BI REST API to capture and log streaming data sent by an IoT device. Below is a snippet directly from the tutorial about getting data into Power BI.

So I think this is great, since it sounds like we should be able to send data directly from the Particle Photon to Microsoft Power BI using their API functions. Power BI would then display that data live and also store the sensor data in a database so you can see historical data as well.

The Microsoft Power BI REST API documentation is here: http://docs.powerbi.apiary.io/#reference/datasets
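If the push-dataset "add rows" endpoint in those docs does what it sounds like, the device side could stay very simple: publish a JSON payload and let a webhook do the HTTPS POST for you. Here is a rough, untested sketch of that idea (the event name, dataset, table, and fields are all made up by me):

```cpp
// Sketch only: publish a reading as JSON; a Particle webhook subscribed to
// "powerbi-reading" would then POST it to Power BI's push-dataset endpoint,
// e.g. POST https://api.powerbi.com/v1.0/myorg/datasets/{datasetId}/tables/{tableName}/rows
// with body {"rows":[{"timecreated":"...","value":72.4}]} and an Azure AD
// bearer token in the Authorization header.

double readTemperature() {
    return 72.4;  // stand-in for real sensor code
}

void setup() {
}

void loop() {
    char payload[128];
    snprintf(payload, sizeof(payload),
             "{\"timecreated\":\"%s\",\"value\":%.2f}",
             Time.format(Time.now(), TIME_FORMAT_ISO8601).c_str(),
             readTemperature());

    // The webhook does the actual HTTPS call to Power BI.
    Particle.publish("powerbi-reading", payload, 60, PRIVATE);

    delay(30000);  // one reading every 30 seconds stays under publish rate limits
}
```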

I'm clueless about working with REST APIs, but I'm posting all this in case somebody smarter than me is interested in exploring it and creating some code that works :smiley:

I know that we can send data from the Photon to the PubNub service and get this working, but if I understand everything correctly, Power BI can accept data directly from a Photon, which would probably be the most efficient way to get data over to Power BI. Let me know if I'm off base here.

I shared this because I'm excited about being able to create beautiful custom dashboards for my products exactly as I have been envisioning them.

I don't know if pushing data directly into an Azure database is better or cheaper, but I would like to explore pushing data directly into Power BI using the REST API if others are willing to help figure out how to do that.

I would also love to hear what @Dave @zach @BDub @will @zachary think about the Power BI platform and whether it's something Particle would be interested in supporting via a plugin or something.

Thanks for your time!

Hey, I took a look at that API, and I think you cannot post data using it. You can manage your dashboards, lists, and other things, but not push data.
Is that what you found as well?

Otherwise, the way to go here might just be Particle -> PubNub -> Power BI dashboard.

Gustavo.

Thanks for taking a look at it.

I now have a custom Power BI dashboard set up displaying data from the PubNub demo account they provide in the Microsoft PubNub setup tutorial, and that was super simple to do after signing up for the free Power BI account.

Then I used the Particle + PubNub tutorial I linked to in the AWS thread to get a Photon sending data to PubNub successfully, so that works too and is enough to get you up and running, which is nice.
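In case it helps anyone follow along, here is a minimal sketch of the firmware side of that pipeline (my own simplification, not the tutorial's exact code; the keys, channel, event name, and URL format in the comments are assumptions on my part):

```cpp
// One publish; a webhook subscribed to "pubnub-data" forwards the payload to
// PubNub. If I have the format right, PubNub's HTTP publish endpoint looks
// roughly like:
//   http://pubsub.pubnub.com/publish/<pub-key>/<sub-key>/0/<channel>/0/<json>
// Replace the keys and channel with your own from the PubNub dashboard.

int counter = 0;

void setup() {
}

void loop() {
    char data[64];
    snprintf(data, sizeof(data), "{\"reading\":%d}", counter++);

    Particle.publish("pubnub-data", data, 60, PRIVATE);

    delay(15000);  // publish a sample every 15 seconds
}
```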

It seems that Power BI is just subscribing to publish events from PubNub and displaying and/or saving them as they come in. I see no reason why they can't do the same for Particle devices.

I'm going to try to get somebody from Microsoft Power BI on here and see if they would be interested in providing a solution for the Particle ecosystem.

I'm impressed with the dashboard and what I have been able to do with it and its features. It looks great on my phone and desktop. I like that in landscape mode on your phone the dashboard looks exactly like it does on the laptop, with no differences.

It feels like PubNub's publish and subscribe methods are almost identical to what the Particle cloud does, so we really should not need PubNub in the middle, but I'm not 100% sure about that.

PubNub does have some extra services that Particle does not, like data logging and other action services, so maybe it is a good service for an IoT product.

Any thoughts on this?

Here is the tutorial I used to get a Photon to send data to PubNub:

Here is the dashboard I created so far after following the Microsoft tutorial that is linked here:

Here is a screen shot of a simple dashboard:

There is a free Power BI Desktop application for Windows that you can use to create the custom visualization templates they offer online, but I have not figured out how to access the PubNub stream from the desktop application yet.

The tiles available in the web-based version are limited to 5 different options; the rest are accessible via the desktop Power BI application, which pushes them to the web-based dashboard after you create them and pin them to whatever dashboard you choose.


@gusgonnet I found the documentation that shows how to set up an Azure Event Hub to capture data sent by a Photon or Electron webhook and then place that data into an Azure database.

It's part of this weather station tutorial, which does a lot of different things that are not required if all you want to do is push data into an Azure database.

They also use a service called Stream Analytics to read the incoming data and then trigger actions based on your needs. Stream Analytics costs money, so if you do not want to use it, you can just skip setting it up as described in the tutorial.

The main loop code they use for the Photon weather shield project is at the link below, so you can look at that code and see exactly how they are sending the webhook data.
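In broad strokes (this is my paraphrase, not the project's actual code), the loop reads the sensors on a timer, packs the values into one JSON string, and publishes a single event that the webhook hands off to the Event Hub:

```cpp
// Paraphrased structure of the weather shield loop: placeholder sensor reads,
// a millis()-based timer, and one published event that the Azure webhook
// forwards to the Event Hub. Event and field names here are made up.

const unsigned long PUBLISH_INTERVAL_MS = 60000;
unsigned long lastPublish = 0;

float readTempF()    { return 72.4; }  // stand-ins for the weather shield sensors
float readHumidity() { return 41.0; }

void setup() {
}

void loop() {
    if (millis() - lastPublish >= PUBLISH_INTERVAL_MS) {
        lastPublish = millis();

        char json[96];
        snprintf(json, sizeof(json),
                 "{\"tempF\":%.1f,\"humidity\":%.1f}",
                 readTempF(), readHumidity());

        Particle.publish("azure-weather", json, 60, PRIVATE);
    }
}
```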

There is a video that shows how to set up the Event Hub + Azure database + Stream Analytics services, which should be helpful.

The video looks like this, and it's halfway down the webpage at the link above.

And here is the step-by-step setup guide they follow in the video. Personally, I'm just focusing on getting the data into a database:

Once the data is being dropped into an Azure database, you will be able to pull it into Power BI and create beautiful dashboards using the Power BI desktop design software and then push them to your online dashboard.

I'm going to give it a shot and see how it goes :smile: Hopefully all goes well.

If anybody else tries this out, please do let us know how it works out :wink:


Great compilation of information!
Good luck on making this work :smile:
Gustavo.


This looks really interesting, and would be a great candidate for an "official" Particle integration. Please post your progress, @RWB!


@gusgonnet I figured out how to import Google Sheets into the Power BI platform using Power BI Desktop.

I just tried this, and it works.

I also looked up the Google Sheets size limitations; they are not that big, but could work fine for some applications. See below:

Here are instructions on how to add a Google Sheet to Power BI:

And then:

With the free Power BI account, the data is updated from the Google Sheet every day. With the Power BI Pro account, you can set it to update the Google Sheet data every hour. So this is a good way to add another data source to include in your overall dashboard of relevant info.

I'll keep adding more info here as I learn more.


Hi there, I believe I have a similar setup to yours, and I am now also experimenting with the Power BI software. Specifically, I am sending data from my Photon -> Azure Event Hub -> Azure Table Storage.

Using the Power BI Desktop application, it is easy to pull data from my Azure Table Storage account and create nice dashboards. These dashboards can then be published to the Power BI web application where they are displayed (within your own account). Unfortunately these dashboards are "frozen in time", or in other words they do not refresh automatically, so they don't provide real-time info about your device.

I am interested in having access to real-time data, and there is a way to use Azure Stream Analytics to send data from the Event Hub directly to a Power BI account. The instructions are found at this link (which I originally found within that Hackster weather shield project):

I was able to complete the tutorial above, and it does provide auto-refreshing real-time data. However, although I was able to show the data in a regular chart format, I couldn't get the data into a graph format as shown in the tutorial. I think it's because the temperature data is being interpreted as a string rather than a number.

Using the Power BI Desktop application, you can easily change the data type from a string to a number. But the Power BI web application (the one required to take the streaming data from Azure) doesn't seem to allow for this data type conversion.

Can anyone shed some more light on using Power BI in this manner?

@bjagpal You have gotten further than me at this point, which is good, and I would also like to know how to accomplish the live data graphing you're looking for.

It looks like you set up a separate Stream Analytics job to push the incoming data from a Photon > Stream Analytics > Power BI live dashboard tile.

PubNub has this live data feature built in, and I'm hoping these links about that setup may help with your search for a solution for showing live data in a graph format.

See if anything useful is in these links:

https://github.com/Microsoft/PowerBI-CSharp/tree/master/samples/consoleapp/getting-started-for-dotnet-real-time

Yes, I had to set up a separate Stream Analytics job as outlined in that tutorial. I also came across these two links, which basically outline the same process:

1 - https://azure.microsoft.com/en-us/documentation/articles/stream-analytics-power-bi-dashboard/
2 - http://radacad.com/stream-analytics-and-power-bi-join-forces-to-real-time-dashboard

I found these links on a Power BI community forum that was discussing this very topic. Essentially there are two ways that Power BI can ingest streaming data from Azure. The first way is as I've done it, using a Stream Analytics service. The second (better) way is to set up a "streaming dataset" from within Power BI. This is basically an integration on par with the PubNub integration already present. It isn't quite ready yet, but apparently they are working on it.

Here's a link to that discussion:


@bjagpal I'm glad you posted that on their forum, so hopefully we can get some help or a time frame for when this feature will be added.

The only way around it right now that I can think of is to push the data to PubNub as well and display that live data with a max window of 15 minutes, and also send the same data to Azure Table storage, where the stored data gets updated every 15 minutes. This way you have one tile with the live data stream going back 15 minutes, and all the historical data in another tile that Power BI can be set to update/refresh every 15 minutes.
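On the firmware side this shouldn't require any extra code, since more than one webhook can subscribe to the same published event. Something like the sketch below, with one hypothetical webhook pointed at PubNub for the live window and another pointed at the Event Hub / Table storage for the history:

```cpp
// One event, two consumers: a "to PubNub" webhook for the live 15-minute
// window and a "to Event Hub" webhook for the historical Azure Table data.
// Both webhooks would be configured to trigger on "sensor-data".
void setup() {
}

void loop() {
    Particle.publish("sensor-data", "{\"value\":42}", 60, PRIVATE);
    delay(60000);
}
```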

Make sense? If you're using the Electron, then sending to PubNub would cost extra since you would also have to push that data to their service to get the live updates to happen. Sucks, but hopefully they get it worked out SOON :smiley:

@bjagpal Hi, I also completed the tutorial and have got my data passing through Stream Analytics, confirmed by checking the 'output' blob storage.

However, I can't get the data to show up in Power BI. Did you succeed in this?

In Power BI I can see the streaming dataset, but when I create a tile or report there is nothing there… I have four fields to use but cannot work out how to use them: displayname, measurename, timecreated, value.

Any help would be awesome. Cheers.

@Grillo That is a service that is still in preview at Microsoft, so you will have to ask to have this function added to your Power BI account to get it working.

Email this guy: Sirui Sun sirsu@microsoft.com

If you push your data into an Azure Table database, then you will be able to access that database from Power BI now without problems.

@Grillo, as I mentioned a few posts above, there are two different ways to get data from Azure Stream Analytics (ASA) into Power BI (PBI). The first method is to simply set up PBI as an output on your ASA and just pass the data through. This is the method outlined at the following links:

1 - https://azure.microsoft.com/en-us/documentation/articles/stream-analytics-power-bi-dashboard/
2 - http://radacad.com/stream-analytics-and-power-bi-join-forces-to-real-time-dashboard

The second method is to set up a 'streaming dataset'; however, this function is in private preview right now, so unless you signed up for that private preview you shouldn't actually see any 'streaming datasets', but you will see a dataset displayed below the "Datasets" heading.

I'm going to assume that you have set up PBI according to method 1 (although my proposed solution should work in both cases).

Now, when you click on the name of your dataset in PBI, it will display your four fields (displayname, measurename, timecreated, value) under the "Fields" heading. Beside your "value" field, you should see a sigma sign. This indicates that the field is numeric. If you do not see the sigma sign, then PBI interprets the data as text and can't really do anything with it.

To correct this, you need to explicitly type-cast your data from within your ASA query. Here is an example:

SELECT
    CAST(timecreated AS datetime) AS timecreated,
    CAST(value AS bigint) AS value,
    displayname,
    measurename
INTO
    toPowerBI
FROM
    IoTinput TIMESTAMP BY timecreated

So what I'm doing here is casting my "timecreated" field as a datetime type, casting my "value" field as an integer, and passing through the "displayname" and "measurename" fields without any casting since they are of the character type and can stay that way.
"toPowerBI" is the name of my ASA output, and "IoTinput" is the name of my ASA input.

So in summary, I think you just need to adjust your ASA query. Let me know if this helps.

@bjagpal FYI - Dave announced that they are shortly changing the way we can format webhooks, which will allow us to send data via a webhook directly to a Power BI tile without the need to pay for or rely on Azure Stream Analytics.

This can save people some money, since Stream Analytics costs can get high depending on what you're doing.

Interesting. I presume this has to do with sending data directly to PBI using its API? I remember you were looking into that… cool.

Yea, direct from webhook to Power BI using the API endpoint so no Stream Analytics needed. :smiley:

Thanks @RWB !

Actually I did sign up for the preview, so I can access the streaming dataset in PBI.

I think the problem is I am not proficient enough to do the query. I am following @bjagpal's instructions below. Will post results.

cheers

And it works…! Thanks very much! Got the data plotting out perfectly with your instructions.