Unexpected sprintf() result from GET request

I’ve had success to date making GET and POST requests with JavaScript on a web page. I’m now trying to combine several variables into one data string and parse the result in JavaScript, but I’m getting an unexpected result from sprintf() on my Core.

In the Spark Core I have:

Spark.variable("getCoreData", &strCore, STRING);

sprintf(strCore,"{\"TempAct\": %.1f, \"TempSet\": %i, \"TimerVal\": %i}", temperature, iTempTarg, TimeLeft);
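In context, the two calls fit together roughly like this; the buffer size and the placeholder values below are illustrative assumptions, only the variable name and format string come from the code above:

char strCore[64];            // buffer exposed through the cloud variable (size assumed)
double temperature = 19.1;   // placeholder values for illustration
int iTempTarg = 550;
int TimeLeft = 3317;

void setup() {
    // with a char array, strCore and &strCore resolve to the same address
    Spark.variable("getCoreData", strCore, STRING);
}

void loop() {
    // rebuild the JSON string whenever the underlying values change
    sprintf(strCore, "{\"TempAct\": %.1f, \"TempSet\": %i, \"TimerVal\": %i}",
            temperature, iTempTarg, TimeLeft);
    delay(1000);
}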

When I send the get request I get the following result:

{
  "cmd": "VarReturn",
  "name": "getCoreData",
  "result": "{\"TempAct\": 19.1, \"TempSet\": 550, \"TimerVal\": 3317}",
  "coreInfo": {
    "last_app": "",
    "last_heard": "2014-12-03T19:18:53.943Z",
    "connected": true,
    "deviceID": "[My device ID]"
  }
}

I was expecting to get a return like {"TempAct": 19.1 … }, not "{\"TempAct\": 19.1 …}" with the inner quotes escaped.

Can anyone point me in the right direction?
Is there a good tutorial on sprintf() and JSON as it applies to Spark? I’ve found some good short tutorials and some complex references, but not a lot of examples that bridge the two.

I think the Spark API escapes the quotes in the string; otherwise the response would lead to JSON parse errors. If the string were used as-is:

"result": "{"TempAct": 19.1, "TempSet": 550, "TimerVal": 3317}",
------------^ unescaped quote

It might be understood as:

"result": "{"       <-- string value ends here; missing comma before the next token
"TempAct": 19.1,
"TempSet": 550,
"TimerVal": 3317}", <-- malformed value

Since the value of result is a string and not an object, the API has to make sure it will be parsed correctly as a string. You could test this by sending the value of strCore to the serial console, as in the sketch below.
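For example, something along these lines; it assumes strCore and the other variables from the first post, and that Serial.begin() has been called in setup():

// Debug check: print the raw buffer right after formatting it, so the local
// string can be compared with the escaped "result" value in the API response.
sprintf(strCore, "{\"TempAct\": %.1f, \"TempSet\": %i, \"TimerVal\": %i}",
        temperature, iTempTarg, TimeLeft);
Serial.println(strCore);   // prints unescaped: {"TempAct": 19.1, "TempSet": 550, "TimerVal": 3317}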

Hi @arniew

You are doing it right. Just call JSON.parse() on that string and it will be fine. The quotes get escaped when the string is returned from the cloud so that the outer response stays valid JSON.

Thanks, yes it is working fine.

For the sake of completeness, the following code snippet performs the parsing that was needed.

// json is the already-parsed response object from the GET request;
// json.result holds the escaped string built by sprintf() on the Core.
var obj = JSON.parse(json.result);
document.getElementById("curTemp").innerHTML = (obj.TempAct).toFixed(1) + "&deg;C";