How to pass unescaped JSON to Azure IoTHub?

I'm trying to pass JSON to my Azure IoTHub. The initial setup is quite easy, but it sends a string and I want to pass plain JSON.

I tried using SparkJson:

// Construct object
StaticJsonBuffer<200> jsonBuffer;
JsonObject& root = jsonBuffer.createObject();
root["tempF"] = tempf;
root["humidity"] = humidity;
root["baroTempF"] = baroTemp;
root["pascals"] = pascals;
root["altf"] = altf;

// Serialize to JSON
char buffer[256];
root.printTo(buffer, sizeof(buffer));

// Trigger the integration
Particle.publish("temperature", buffer);

But all I got for :christmas_tree: was this escaped JSON:

{"data":"{"tempF":69.14,"humidity":49.51,"baroTempF":70.36,"pascals":102221.75,"altf":83843.30}","device_id":"3e0025000447343138333038","event":"temperature","published_at":"2016-12-26T00:38:23.9830000Z","EventProcessedUtcTime":"2016-12-26T00:38:22.8995203Z","PartitionId":1,"EventEnqueuedUtcTime":"2016-12-26T00:38:23.3160000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"3e0025000447343138333038","ConnectionDeviceGenerationId":"636181129351436517","EnqueuedTime":"0001-01-01T00:00:00.0000000","StreamId":null}}

(This is the JSON output from Stream Analytics (select * into blobsink from hubinput), which should contain unescaped JSON.)

I also tried it using a simple string:

Particle.publish("temperature", "{\"value\": 42}");

I have to escape the double quotes to get the code to verify successfully. So it seems that the publish method is messing up my JSON? How can I pass unescaped JSON?
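For reference, the backslashes are only C source escaping; the string in memory is plain JSON. A minimal sketch of what I mean (the event name is just an example):

char payload[] = "{\"value\": 42}";  // in memory this is {"value": 42}
Particle.publish("temperature", payload);

So the extra escaping visible in the output above is added later, when the cloud wraps the whole payload as the quoted "data" string.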

Check out this thread and see if we're having the same issue.

Hello RWB.

There are similarities regarding encoding strings for JSON. It seems that the nesting is the issue in my case: the outer JSON is OK, but the inner data JSON is messed up.

12/26/2016 11:38:59 PM> Device: [3e0025000447343138333038], Data:[{"data":"{ "t":68.562981, "h":42.400879 }","device_id":"3e0025000447343138333038","event":"temperature","published_at":"2016-12-26T22:38:59.707Z"}]

This is what the IoTHub Device Explorer shows when I try:

char payload[256];
snprintf(payload, sizeof(payload), "{ "t":%f, "h":%f }", tempf, humidity);
Particle.publish("temperature", payload);

By the way, I checked the logging and it shows sound JSON... This confirms that the encoding towards the IoTHub needs some love.

@svelde So you’re just trying to get the Temp and Humidity sent to Azure IoT Hub so the numbers are entered into Azure storage?

I’m sending this data over to Azure IoT Hub > Stream Analytics > Azure Table Storage using the Azure IoT Integration Webhook.

@RWB Well, I used a Stream Analytics job, but the JSON seems to be rejected by it (and for good reason: the JSON is not correct, in my humble opinion).

So I checked the Device Explorer (I now see that I called it the IoTHub explorer in my previous post, sorry), a great tool to listen in on the data sent to the IoT Hub. This is the first stop in Azure where I can check the incoming messages.

I am following the ‘invitation’ at http://blog.jongallant.com/2016/12/azureiotparticlebeta/index.html. Sending a string works fine, but that is a poor design if you want to build a true IoT solution.

@svelde I can send numbers, floats, and strings to Azure IoT Hub > Stream Analytics > Table Database with no problems.

My Stream Analytics job code to pass the data directly into the Azure Table Database is in the screenshot below:
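In essence it just passes everything through, along these lines (a sketch; the input and output aliases are placeholders for my job's actual names):

SELECT
    *
INTO
    tablestorageoutput  -- placeholder alias: Azure Table Storage output
FROM
    iothubinput         -- placeholder alias: IoT Hub input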

Actually, I'm trying to pass the SparkFun Weather Shield data into Azure, but I removed some fields for clarity.

I am aware of the SQL-like language of Stream Analytics. The * is more of a ‘catch-all’ solution.

The fact that the JSON is not well-formed would make the Photon a bad choice for more mature solutions, both in Stream Analytics (e.g. grouping on certain fields) and the IoTHub (e.g. routing, currently in preview).

Do you get separate table columns for all fields in your message or a single column named Data (filled with the escaped JSON string)?

@Dave can help clarify the JSON string issues.

Yes, I get separate table columns in Azure Table Storage, and here is what my code and JSON template look like to do that:

Here is the code I'm testing on a Photon:


char Org[] = "My_ORGANIZATION";
char Disp[] = "Particle Photon";
char Locn[] = "Fishers, IN";

const char* cityLocation = "Fishers"; //City for my Photon
const char* stateLocation = "IN"; // State for my Photon

int SOC = 95;
float Voltage = 12.80;
float Current = 5.25;
float TTG = 35.4;
float Temperature = 81.3;
float SolarVin = 24.2;
float SolarVmp = 19.2;
float SolarW = 56.2;


// The on-board LED
int led = D7;

void setup() {
  pinMode(led, OUTPUT);
}
void loop() {
  char payload[255];

  // Turn the LED on
  digitalWrite(led, HIGH);

  // Build the JSON payload. Each key must be unique, so SolarVin is sent as "sv".
  snprintf(payload, sizeof(payload),
           "{ \"s\": %d, \"v\": %.2f, \"i\": %.2f, \"t\": %.2f, \"f\": %.2f, \"sv\": %.2f, \"m\": %.2f, \"w\": %.2f, \"l\": \"%s\" }",
           SOC, Voltage, Current, TTG, Temperature, SolarVin, SolarVmp, SolarW, Locn);

  //Serial.println(payload);
  //Spark.publish("AzureElectron", payload, NO_ACK); // NO_ACK currently only works on 0.6.0-rc.1 firmware; test for the Electron.
  Particle.publish("Azure_IOT_Almost", payload, PRIVATE);

  // Turn the LED off
  digitalWrite(led, LOW);
  delay(10000);
}

And here is how I have the Azure IoT Hub integration set up with custom JSON formatting to get the data fields into separate columns in Azure Table Storage:
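The custom JSON in the integration maps each field of the published payload to its own property, roughly like this (a sketch based on the payload keys in the code above; the {{...}} variables are filled in from the event data, and note that every value ends up quoted):

{
    "s": "{{s}}",
    "v": "{{v}}",
    "i": "{{i}}",
    "t": "{{t}}",
    "f": "{{f}}",
    "sv": "{{sv}}",
    "m": "{{m}}",
    "w": "{{w}}",
    "l": "{{l}}"
}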

Then using this Azure Stream Analytics Query:
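It selects the payload fields straight into the table, roughly like this (again a sketch; the aliases are placeholders):

SELECT
    s, v, i, t, f, sv, m, w, l
INTO
    tablestorageoutput
FROM
    iothubinput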

I get data in separate columns in the Azure Table Database:

Does that help you any?


It works now, thanks.
The additional JSON data did the trick. I was under the impression that the JSON I send with

Particle.publish("azureiothub ", payload);

was the same JSON that would arrive at the IoTHub. Your example above showed me how to transform the data first.

I have three observations:

-1 It seems I cannot change the additional JSON data once it is saved. I have to drop the integration first and build a new one.
-2 The values in the additional JSON data must be quoted too. Why is this needed?
-3 Because of that quoting, I now have to cast my data in my Stream Analytics query:

select (Cast(tempf as float)-32)/1.8 as temperature into S from I

Thanks again for your help.

@svelde

Yes, that is a pain in the ass, isn't it :smile:

That's a good question for @Dave

What do you mean by that? Why can't you just format the temp to F on the Photon or Electron before sending over to Azure IoT Hub? That's what I do.


Well, this is in fact the Fahrenheit -> Celsius calculation :smile:

Doing the calculation on every Photon/device or in the (Stream Analytics) backend is a matter of taste. I choose to do as much as possible in the backend so that when something changes, I do not have to redeploy every device. But in this case, the degree conversion could be done on the device too; it will not change that often.

But the real issue is the (unnecessary) string-to-float casting due to the quoting of the JSON values.
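For example (using the tempf field from my earlier payload), the template currently delivers

{ "tempf": "69.14" }

to Stream Analytics, whereas no cast would be needed at all if the value arrived as a real number:

{ "tempf": 69.14 }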

I prefer to do it on the device since it's free, versus doing the conversion via a Stream Analytics query, which costs money.

From my testing, the Stream Analytics service costs add up quicker than anything else when sending data frequently.


Yes, I understand. For testing purposes and personal usage (just a few devices) I want the cheapest solution too. But in production, this is hard to manage.


A free-tier Stream Analytics job, just like the free IoT Hub, would be more than welcome :wink:

Certainly!

I’m planning on scaling to thousands of devices, so I have to keep an eye on all the small costs, because they really add up when you have that many devices running at the same time.


Hi @svelde,

You can send any kind of data to any kind of service using the Photon and Particle Cloud. Webhooks are a convenience to help make that easier, but you certainly don’t need to use that if you need to send something special.

I’ve been thinking about the JSON template issue that prevents sending native types without quotes, and I have a good solution to it. My hope is that the team can address this in the coming months when they have a chance.

Thanks,
David


Thanks, all, this was really helpful. @Dave, the non-editable integrations are super silly. Please fix this! Similarly, as @svelde pointed out, proper JSON support so you do not need to quote numeric values would be nice.


Hi @jrowe88,

Glad it was helpful! I agree, we’re working on fixing both of these, hopefully sometime in the coming sprints.

Thanks!
David


Hi @Dave.

Do you know if the Particle team could correct the problem of passing numerical values in the JSON template?

Thanks to @svelde for starting the thread and thanks to @RWB for posting the example.

Regards from Chile, South America.


Hi @rmunoz,

Thanks for asking! I know this has been on our roadmap for a long time, and several fixes have been proposed internally. I don’t know when it’ll get priority yet, but the team is aware of it.

Thanks,
David

Hello, sorry for bringing this up again, but the image for the Azure Stream Analytics query seems to be gone. If possible, can you repost it?