How to pass unescaped JSON to Azure IoT Hub?

I’m actually trying to pass the SparkFun Weather Shield data into Azure, but I removed some fields for clarity.

I am aware of the SQL-like query language of Stream Analytics; the * is more of a ‘catch-all’ solution.

The fact that the JSON is not well-formed would make the Photon a bad choice for more mature solutions, both in Stream Analytics (e.g. grouping on certain fields) and in the IoT Hub (e.g. routing, currently in preview).

Do you get separate table columns for all the fields in your message, or a single column named Data (filled with the escaped JSON string)?

@Dave can help clarify the JSON string issues.

Yes, I get separate table columns in Azure Table storage. Here is what my code and JSON template look like to do that:

Here is the code I’m testing on a Photon:


char Org[] = "My_ORGANIZATION";
char Disp[] = "Particle Photon";
char Locn[] = "Fishers, IN";

const char* cityLocation = "Fishers"; // City for my Photon
const char* stateLocation = "IN";     // State for my Photon

int SOC = 95;
float Voltage = 12.80;
float Current = 5.25;
float TTG = 35.4;
float Temperature = 81.3;
float SolarVin = 24.2;
float SolarVmp = 19.2;
float SolarW = 56.2;

// The on-board LED
int led = D7;

void setup() {
  pinMode(led, OUTPUT);
}

void loop() {
  char payload[255];

  // Turn the LED on
  digitalWrite(led, HIGH);

  // Build the JSON payload. Each key must be unique: the original used "s"
  // twice (for SOC and SolarVin), so SolarVin is keyed "sv" here.
  snprintf(payload, sizeof(payload),
           "{ \"s\": %d, \"v\": %.2f, \"i\": %.2f, \"t\": %.2f, \"f\": %.2f, \"sv\": %.2f, \"m\": %.2f, \"w\": %.2f, \"l\": \"%s\" }",
           SOC, Voltage, Current, TTG, Temperature, SolarVin, SolarVmp, SolarW, Locn);

  //Serial.println(payload);
  //Spark.publish("AzureElectron", payload, NO_ACK); // NO_ACK currently only works on 0.6.0-rc.1 firmware; this was just a test for the Electron.
  Particle.publish("Azure_IOT_Almost", payload, PRIVATE);

  // Turn the LED off
  digitalWrite(led, LOW);
  delay(10000);
}

And here is how I have the Azure IoT Hub integration set up, with custom JSON formatting, to get the data fields into separate columns in Azure Table Storage:
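The screenshot that followed is gone; a custom JSON template roughly along these lines conveys the idea (an illustrative reconstruction, assuming the field names from the firmware sketch above; the deviceid property is an assumed addition, and every value is quoted because the integration required quoted values at the time):

{
  "deviceid": "{{{PARTICLE_DEVICE_ID}}}",
  "s": "{{{s}}}",
  "v": "{{{v}}}",
  "i": "{{{i}}}",
  "t": "{{{t}}}",
  "f": "{{{f}}}",
  "sv": "{{{sv}}}",
  "m": "{{{m}}}",
  "w": "{{{w}}}",
  "l": "{{{l}}}"
}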

Then using this Azure Stream Analytics Query:
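That screenshot is gone as well; a minimal query of roughly this shape would put each field into its own column (the [iothub-input] and [table-output] aliases are placeholders for the job’s actual input and output names):

select deviceid, s, v, i, t, f, sv, m, w, l
into [table-output]
from [iothub-input]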

I get the data in separate columns in the Azure Table storage table:

Does that help you any?


It works now, thanks.
The additional JSON data did the trick. I was under the impression that the JSON I send with

Particle.publish("azureiothub ", payload);

was the same JSON that would arrive at the IoT Hub. Your example above showed me how to transform the data first.

I have three observations:

1. It seems I cannot change the additional JSON data once it is saved; I have to drop the integration first and build a new one.
2. The values in the additional JSON data must be quoted too. Why is this needed?
3. Because of those quotes, I now have to cast my data in my Stream Analytics query:

select (Cast(tempf as float)-32)/1.8 as temperature into S from I

Thanks again for your help.

@svelde

Yes, that is a pain in the ass, isn’t it? :smile:

That’s a good question for @Dave

What do you mean by that? Why can’t you just format the temp to F on the Photon or Electron before sending over to Azure IoT Hub? That’s what I do.

What do you mean by that? Why can’t you just format the temp to F on the Photon or Electron before sending over to Azure IoT Hub? That’s what I do.

Well, this is in fact the Fahrenheit -> Celsius calculation :smile:

Doing the calculation on every Photon/device or in the (Stream Analytics) backend is a matter of taste. I choose to do as much as possible in the backend so that when something changes, I do not have to redeploy every device. But in this case the degree conversion could be done on the device too; it will not change that often.
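For illustration, a minimal on-device version could look like this (reusing the Temperature and payload variables from the firmware sketch earlier in this thread):

// Hypothetical on-device conversion: publish Celsius instead of Fahrenheit.
float tempC = (Temperature - 32.0) / 1.8;
snprintf(payload, sizeof(payload), "{ \"t\": %.2f }", tempC);
Particle.publish("Azure_IOT_Almost", payload, PRIVATE);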

But the real issue is the (unnecessary) string-to-float cast forced by the quotes around the JSON values.

I prefer to do it on the device since it’s free, versus doing the conversion via a Stream Analytics query, which costs money.

From my testing, the Stream Analytics service cost adds up quicker than anything else when sending data frequently.

I prefer to do it on the device since it’s free, versus doing the conversion via a Stream Analytics query, which costs money.

Yes, I understand. For testing purposes and personal usage (just a few devices) I want the cheapest solution too. But in production this is hard to manage.

From my testing, the Stream Analytics service cost adds up quicker than anything else when sending data frequently.

A free-tier Stream Analytics job, just like the free IoT Hub tier, would be more than welcome :wink:

Certainly!

I’m planning on scaling to thousands of devices, so I have to keep an eye on all the small costs; they really add up when you have that many devices running at the same time.


Hi @svelde,

You can send any kind of data to any kind of service using the Photon and the Particle Cloud. Webhooks are a convenience to make that easier, but you certainly don’t need to use them if you need to send something special.

I’ve been thinking about the JSON template issue that prevents sending native types without quotes, and I have a good solution to it. My hope is that the team can address this in the coming months when they have a chance.

Thanks,
David


Thanks, all of this was really helpful. @Dave, the non-editable integrations are super silly; please fix this! Similarly, as @svelde pointed out, proper JSON support so you do not need to quote numeric values would be nice.


Hi @jrowe88,

Glad it was helpful! I agree, we’re working on fixing both of these, hopefully sometime in the coming sprints.

Thanks!
David


Hi @Dave.

Do you know if the Particle team could fix the problem of passing numeric values in the JSON template?

Thanks to @svelde for starting the thread and to @RWB for posting the example.

Regards from Chile, South America.


Hi @rmunoz,

Thanks for asking! I know this has been on our roadmap for a long time, and several fixes have been proposed internally. I don’t know when it’ll get priority yet, but the team is aware of it.

Thanks,
David

Hello, sorry for bringing this up again, but the image for the Azure Stream Analytics query seems to be gone. If possible, can you repost it?

I no longer have that image or remember what it was, unfortunately.

I ditched Azure for Losant.

I fixed it. The setup was already in place in Azure, but thanks for the reply :slight_smile:


Can this screenshot be reposted?

I am still having this issue. Has there been a solution?

So with ordinary webhooks you can enter data like this. The line is flagged as an error, but you can still save the webhook, and it works.
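(The screenshot is gone; presumably it showed a template line like the one below, where the Mustache variable for the event data is not wrapped in quotes and therefore is not valid JSON until it is expanded:)

"data": {{{PARTICLE_EVENT_VALUE}}},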

However, with the Azure integration it appears that you cannot save a webhook whose JSON does not currently parse but would parse correctly after the Mustache template expansion.

A bug report was created for this to be investigated by engineering, as it seems like it should work the same as regular webhooks.

We removed the check that rejected Azure webhook JSON that does not parse at the time you enter it, so if you click the Custom JSON tab you can now enter this and save it:

{
  "event": "{{{PARTICLE_EVENT_NAME}}}",
  "data": {{{PARTICLE_EVENT_VALUE}}},
  "device_id": "{{{PARTICLE_DEVICE_ID}}}",
  "published_at": "{{{PARTICLE_PUBLISHED_AT}}}"
}

Note that the value for data is no longer surrounded by double quotes, so it’s flagged as an error. However, when the Mustache template is expanded and the event data contains valid JSON, it will send the data up as a JSON object instead of a string. Also make sure you use triple curly brackets {{{PARTICLE_EVENT_VALUE}}} to prevent HTML escaping in Mustache.

This can also be used to send up true numbers, instead of requiring numbers encoded as strings.
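For example, if a device publishes the payload built by the firmware sketch earlier in this thread, the expanded template would look roughly like this (the device ID and timestamp values are placeholders):

{
  "event": "Azure_IOT_Almost",
  "data": { "s": 95, "v": 12.80, "i": 5.25, "t": 35.40, "f": 81.30, "sv": 24.20, "m": 19.20, "w": 56.20, "l": "Fishers, IN" },
  "device_id": "<24-character device ID>",
  "published_at": "<ISO 8601 timestamp>"
}

The data property arrives as a nested JSON object with true numbers rather than an escaped string.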
