Azure Integration - Getting started

Hi Guys 'n Gals,
This is all pretty new to me, but I'm currently working on an interesting little project using the Photon to control a device and also feed back multiple sensor readings to the cloud for remote monitoring, diagnostics, perhaps some control, and other as-yet-undetermined uses which will no doubt come to us once we start to build up a better picture from that data.
I'm pretty happy with what I can achieve on the Photon itself; the cloud side, however, is a little more mysterious. I initially experimented with ThingSpeak, as that was a fairly quick and easy way to get a unit talking to the cloud and showing me and the client what the unit was up to. The endgame, though, is to get this connected to Azure, and with the release of the new beta Azure Integration this seems an excellent time to get stuck in.
With ThingSpeak I could publish up to 8 values all at once against a single "event". How do you do something similar with the Azure Integration? I'm assuming it's done in the JSON section under Advanced, but I'm not entirely sure how to proceed or what I might need to set up on the Azure end. Pointers to documentation that explains this, snippets, or any other help gratefully received.
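For reference, here's roughly the firmware pattern I have in mind for bundling several readings into a single event. It's only a sketch: the field names, event name, and stub values are placeholders, not anything the Azure Integration requires.

```cpp
// Stand-ins for real sensor reads; names are illustrative only.
double temp  = 21.5;
double hum   = 48.2;
double press = 1013.2;

void setup() {
}

void loop() {
    char payload[128];
    // Particle.publish() carries a single string, so pack all the
    // readings into one JSON object.
    snprintf(payload, sizeof(payload),
             "{\"temp\":%.1f,\"hum\":%.1f,\"press\":%.1f}",
             temp, hum, press);
    Particle.publish("readings", payload, PRIVATE);
    delay(60000);  // once a minute keeps well inside publish rate limits
}
```

My understanding is that the integration's custom JSON (under Advanced) can then pick out individual fields with mustache variables such as {{{temp}}} when the event data is valid JSON, though I'd double-check that against the webhook docs.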

How many of these threads have you been skimming over so far :wink:
https://community.particle.io/search?q=azure


OK, OK, over-skimming, sorry! This one, "How to pass unescaped JSON to Azure IoTHub?", looks like it might address my question.

This is all pretty recent, so quite a lot of Azure-related questions reference other solutions like ConnectTheDots and PubNub, whatever that is.

If we look at a scenario where you have a finished product, you've made, say, 1000 of them, and these 1000 units are divided between, say, 5 clients: how would you best approach linking those clients with their devices and their data streamed to Azure? What authentication would people recommend, where would the user best be defined, and how would you go about giving them views of their data and possibly control of the device? I'm aware this is a pretty wide-ranging question, and I do envisage employing commercial support from Particle/MS/3rd parties when we get to it, but if there are some worked examples out there I'm missing that demonstrate all this put together as a finished thing, that would at least help me understand the ecosystem a little better. I've read https://docs.particle.io/guide/how-to-build-a-product/, for instance, but I'm unclear how to extend that to Power BI/Azure/other tools, or indeed whether those tools exist yet.

Cheers!

OK, let's put this another way: can anyone recommend a suitable partner who really digs Azure & Particle to advise and maybe even lay the groundwork? Ideally they would be UK based or, better yet, Welsh.

Check out this overview:

Sorry, me again. I worked out I was doing something pretty silly when trying to work all that out (JSON formatting & definition etc.). Is there a nicer way of building those strings without having to wade through all the escape characters, which I personally find really hard to read and debug if I mess things up?

However I have a more specific question.
At the most basic level you create an integration, connect it to your Azure IoT Hub, and from there do as you please. Simple enough, depending on what you are up to.
If you have multiple customers, I assume you would create a 'Resource Group' for each customer on Azure and clone your developed dashboard(s) to that resource group (in my specific use case each customer owns hundreds of units, so this makes sense). However, to keep each customer's data and running costs properly separate, this means I also need to define the Azure Integration at the customer level rather than the Product level. I think this is/was something that Integrations were going to offer, but does anyone know if they are at that stage currently, or would I instead be creating a Product per customer?
Also, is there a quick way of creating a test customer via the console or command line, just so I can unlock whatever other menus appear at that point and understand it all a bit better?

You can use the SparkJSON library, but that may be somewhat memory heavy.
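Something like this is the shape of it; a rough sketch using the ArduinoJson 5-style API that SparkJson ports (exact names from memory, so check the library header):

```cpp
#include "SparkJson.h"

void publishReadings(double temp, double hum) {
    StaticJsonBuffer<200> jsonBuffer;             // fixed-size, stack-allocated
    JsonObject& root = jsonBuffer.createObject();
    root["temp"] = temp;                          // no manual escaping needed
    root["hum"]  = hum;

    char payload[128];
    root.printTo(payload, sizeof(payload));       // serialize into the buffer
    Particle.publish("readings", payload, PRIVATE);
}
```

Using a StaticJsonBuffer at least keeps the working memory on the stack and bounded at compile time.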

Man, that is huge! Possibly begging for a lighter, Azure-specific lib.
I had a random play with Ubidots the other day. I liked the way you just added key/value pairs to the Ubidots instance and then sent them; without reading the libraries involved, I'm going to assume it turned that into JSON behind the scenes, but it's nice either way.
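The pattern I mean looked roughly like this; a sketch only, with the add()/send() names following the Ubidots Particle library as I remember it, and a placeholder token:

```cpp
#include "Ubidots.h"

Ubidots ubidots("YOUR_UBIDOTS_TOKEN");  // placeholder token

void setup() {
}

void loop() {
    ubidots.add("temp", 21.5);  // queue key/value pairs...
    ubidots.add("hum", 48.2);
    ubidots.send();             // ...then serialize (presumably to JSON) and transmit
    delay(60000);
}
```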

Bump to an old thread.

Here is a tutorial I wrote up about saving Photon data into Azure's SQL Server database for visualization in Tableau.

Part 1 - Overview & Architecture

Part 2 - Technical Details

Hopefully this helps you @Viscacha.
Gordon