Noob help: Parsing json with SparkJson

I have set up a webhook that hits an API and returns a small snippet of JSON data in the response. That all seems to be working just fine.

Now I am trying to grab those values out of the JSON. I've set up SparkJson to parse the response, but I'm very new to C and I can't figure out how const char vs. char, and pointers vs. strings, are supposed to work here.

Here is my app:

#include "SparkFunMicroOLED/SparkFunMicroOLED.h"  // Include MicroOLED library
#include "math.h"
#include "SparkJson/SparkJson.h"

MicroOLED oled;
StaticJsonBuffer<200> jsonBuffer;
void setup() {
  Particle.subscribe("hook-response/getForecast", myHandler, MY_DEVICES);
  oled.begin();    // Initialize the OLED
  oled.clear(ALL); // Clear the display's internal memory
}

void loop() {
  Particle.publish("getForecast");
  delay(900000);
}

void myHandler(const char *event, char *data) {
  Particle.publish("DEBUG", "Received data");
  JsonObject& root = jsonBuffer.parseObject(data);
  if (!root.success())
  {
    Particle.publish("DEBUG", "parseObject() failed");
    return;
  }
  const char* icon = root["icon"];
  const char* temp = root["temp"];
  printTemp(temp);
}

I get lots of errors like:

/src/weatherstation.cpp:16:72: error: invalid conversion from 'void (*)(const char*, char*)' to 'EventHandler {aka void (*)(const char*, const char*)}' [-fpermissive]
void myHandler(const char *event, char *data);

The docs are pretty hand wavy about how pointers are supposed to work and the like. Can someone help get me back on track?

The error is telling you that char *data should be const char *data.

If you fix that, do you still have other errors?

That function signature needs to be exactly as shown in the docs.
So instead of

void myHandler(const char *event, char *data) { ...

you have to have it as

void myHandler(const char *event, const char *data) { ...

nothing hand wavy about that :wink:

And the Particle docs are not meant to educate on C/C++ - there are other resources for that.
Particle docs just build on these basics.
For C/C++ problems this forum can help, but we also need some basics to build upon.
Having said this, a specific question is easier to answer than an open one, which may well cover the topic of a whole C/C++ chapter.


Ric beat me to the actual point - that's what I get for waffling :pensive:

I think there's one more thing to change: data needs to be cast to char*, because that's what parseObject() takes as its argument.

JsonObject& root = jsonBuffer.parseObject((char*)data);

@ScruffR I'm never quite sure when it's legit to just cast in C. Is what I'm showing here ok, or should data be copied to a new char*?

As long as you are absolutely sure that this cast will work, it's fine to do that.
Being sure is the problem :wink:
For objects, you'd need to make sure there is an explicit cast operator implemented, or that an implicit one will pick the correct conversion.

This was the piece I was missing. Why do I have to cast this to char*? Isn’t it already char?

@ScruffR I had changed the method signature because I was getting an error using data, and I am almost a total noob at C and pointers.

After you fix the missing const in the function signature of myHandler, data is now a const char *. Since the parseObject function takes a char * instead, you need to cast it to that.

You should not cast away the const modifier!
The const states that this function will not alter the contents of the string; hence passing a const char* into a function that doesn't promise to keep that const char* unaltered causes an error and not just a warning.
So the correct way to get around the const char* vs. char* mismatch is neither to alter the function signature nor to cast away the "promised" const, but to copy the const char* string into a char[] and use only that "private" copy for whatever you want.

@Ric, when I said it's fine to cast as long as you're absolutely sure it will work, that was exactly the point: what does parseObject() do with the string you pass?
Will it alter the contents or not? Since the function doesn't promise to keep the string unaltered, we'd need to know the exact implementation to tell. But since we don't (usually) know that, it's best not to cast away a const - and if you know one of your own functions won't change its input, state that fact in the function signature.

@ScruffR, I guess the intent of my question to you was not clear, since the point I was trying to get at was that exact thing: is it OK to cast away the const? Because I knew what I was getting at, I misunderstood your answer. I did start to look through the SparkJson library to see what that function does with that argument, but couldn't actually find it in all the files in that library (before it was bed time). I looked again today, but I couldn't really figure out what they were doing with that value.

This brings up a point I don’t understand. Why is data typed as a const char* in the first place (in the subscribe callback)? I don’t see the reason why you should not be able to modify that data. In fact, I do that a lot using strtok to parse the string. Modifying that string in the subscribe handler obviously doesn’t affect the content of that string in the publishing device, so I don’t understand the implied prohibition.

The data field is even more problematic than just the const char* vs. char* issue.

Since the underlying buffer is shared between Particle.publish() and Particle.subscribe() handlers and may also be reused before one handler is finished, you should (almost) always pull a copy of the string ASAP to prevent any “parallel” action from tampering with your data.
In turn any manipulation on the buffer from your side inside the handler may corrupt the data of another “parallel” action that gained access to that buffer already (e.g. via Particle.process() call).

But since these conditions are rather rare and “unpredictable” it’s best to keep the original const and get an independent copy to work with.


To check the shared nature of the buffer, try to call Particle.publish(data, event) (swapped order) from within a Particle.subscribe() handler and manipulate the data parameter immediately after the publish call.
Then print out event & data from within the handler and see how the published event looks in the console logs.
(I hope Particle hasn't already fixed this "problem" tho' - I'll have to re-try that to make sure the outcome is still as "surprising" as it used to be)

Is this shared buffer in the cloud? Since things in C are usually passed around by value, I assumed that data would be a copy of what was sent in the publish; so I guess you're saying that's not true.

Why would Particle.process() access this buffer? That isn't something I would have thought was part of its duties.

Anyway, I did try your test, though since it's such a crazy thing to do (swapping name and data, as well as publishing from within the handler), I don't know how relevant it is to real world situations. I guess it does show the shared nature of the buffer, as you pointed out.

So, I had this code on the Photon,

void subscribeHandler(const char* name, const char* data) {
    Serial.printlnf("Before:  name = %s  data = %s", name, data);
    Particle.publish(data, name);
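    // strtok() writes '\0' separators into the buffer, i.e. it modifies data in place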
    char* firstPart = strtok((char*)data, ",");
    Serial.printlnf("After:  name = %s  data = %s", name, data);
}

The printout from the Serial monitor was,

Before: name = rdmSwap data = first,second (This is what I sent)
After: name = �a�d��Ya�h��h� data = ��h�

The after printout was different, but always gibberish, each time I published (which I did from the CLI).

The console always showed the same values, so I don't quite understand why that would be.

event name: first,second
data: efirst,second�efirst

Sorry if this seems super basic, can you give an example of what you’re talking about?

No, it's not in the cloud, and it is also not passed around by value: you are passing the string as a pointer to the one buffer in memory (which is passing by reference).

It's not Particle.process() itself that accesses the buffer, but calling it may cause a pending event to be processed/handled, and that will access the buffer.
One point I had forgotten to mention about why const is required:
Since you can Particle.subscribe() to one event multiple times (e.g. once fully qualified and once only via prefix), you may have cascading handlers that all want to see the same data. But if the first handler called messes with the data, the second would not see the original anymore.
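
For illustration, a minimal sketch of such cascading subscriptions (the event and handler names are made up):

void tempHandler(const char *event, const char *data);       // fires for "sensor/temp"
void allSensorsHandler(const char *event, const char *data); // fires for any "sensor..." event

void setup() {
  Particle.subscribe("sensor/temp", tempHandler, MY_DEVICES);  // fully qualified
  Particle.subscribe("sensor", allSensorsHandler, MY_DEVICES); // prefix only
}

// A "sensor/temp" event triggers both handlers with the same data pointer;
// if tempHandler wrote into that buffer, allSensorsHandler would no longer
// see the original string.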

Ok, thanks, I think I understand it now. I should have realized that data was passed in by reference since it’s a char*, but I got hung up on thinking that the cloud had to pass in the actual data to that method. So, I guess that the cloud passes the data into a buffer somewhere in the device’s memory, and a pointer to that is what gets passed to the handler.

Sorry if I got a little off topic here, but what @ScruffR is saying is that you shouldn’t just cast data to a char*, but should instead make a copy of it, and use that in your call to parseObject().

void myHandler(const char *event, const char *data) {
  int length = strlen(data) + 1;
  char dataCopy[length];
  strcpy(dataCopy, data); 
  Particle.publish("DEBUG", "Received data");
  JsonObject& root = jsonBuffer.parseObject(dataCopy);
  if (!root.success())
  {
    Particle.publish("DEBUG", "parseObject() failed");
    return;
  }
  const char* icon = root["icon"];
  const char* temp = root["temp"];
  printTemp(temp);
}

Ric (and ScruffR), thanks for the help.

I actually ended up figuring out how to do this using methods provided by the JSON parsing lib to cast its data types to Strings.

void myHandler(const char *event, const char *data) {
  Particle.publish("DEBUG", "Received data");
  JsonObject& root = jsonBuffer.parseObject((char*)data);
  if (!root.success()) {
    Particle.publish("DEBUG", "parseObject() failed");
    return;
  }
  String icon = root["icon"].asString();
  String temp = root["temp"].asString();

  Particle.publish("Temp", temp);
  printIcon();
  printTemp(temp.remove(2));
}

I’m not sure if this is “right”, but it does work! I’ve moved on to a whole other set of problems. :slight_smile:

You are still at risk tho'

You shall not use Particle.publish() before securing the data you received, since Particle.publish() may destroy your data. Even if it works in some cases (system versions), it may break in others.

And as also said, with that (char*)data cast you are violating a C/C++ rule, since you are tampering with data although your function signature promises you will not alter it. That is bad practice.
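
Putting those two points together, a handler along the lines Ric already showed might look like this - just a sketch (the 256-byte buffer size is an assumption, and the StaticJsonBuffer is made local so it starts fresh on every call):

void myHandler(const char *event, const char *data) {
  char dataCopy[256];                           // private copy; size is an assumption
  strncpy(dataCopy, data, sizeof(dataCopy) - 1);
  dataCopy[sizeof(dataCopy) - 1] = '\0';        // guarantee termination

  Particle.publish("DEBUG", "Received data");   // safe now - we hold our own copy

  StaticJsonBuffer<200> jsonBuffer;             // local, so it starts empty each call
  JsonObject& root = jsonBuffer.parseObject(dataCopy);
  if (!root.success()) {
    Particle.publish("DEBUG", "parseObject() failed");
    return;
  }
  String temp = root["temp"].asString();
  Particle.publish("Temp", temp);
}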

I'm using SparkJson and ran into an issue where the required buffer size may be too large for the Photon (when I set it somewhere above 3284, the Photon starts to flash red). I'm not sure where to go from here other than abandoning the JSON data structure for a simpler, less human-readable structure.

Can someone help me determine the maximum buffer size I can set on the Photon, think of a modification to the SparkJson library that could reduce the required buffer size, or answer a question I don't know how to ask?


Here’s the JSON in question. String length is 366.

{
  "history": {
    "brush twice":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0],
    "dont murder":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
    "no sweets":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0],
    "workout":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
    "sleep by 12am":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0],
    "on time":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]
  }
}

Calculating Buffer Size.
I used the ArduinoJson assistant site to determine the max required buffer size, which according to them comes out to 3920 for the JSON above and 3284 for the smaller JSON example below.

Do note, the BUFFER_SIZE numbers calculated on the Photon and by the site are inconsistent.

// Calculating Buffer Size on Photon
const int CHAR_LENGTH = 284;
const int BUFFER_SIZE = 5*JSON_ARRAY_SIZE(22) + JSON_OBJECT_SIZE(1) + JSON_OBJECT_SIZE(5) + CHAR_LENGTH;

// 3172 w/ CHAR_LENGTH, 2888 w/o. Both worked in my trials below.

Yet the site recommends:

  • AVR 8-bit: 1252
  • ESP8266: 1756
  • Visual Studio x86: 3200
  • Visual Studio x64: 3284

The correct BUFFER_SIZE is probably the one calculated on the Photon (it seems to work, anyways).


My Particle.subscribe() handler:

void handleHistory(const char *event, const char *data) {
  int length = strlen(data) + 1;
  
  // copy out of the const buffer into a char[] since SparkJson needs to write.
  char json[length];
  strcpy(json, data);

  const int CHAR_LENGTH = 366;
  const int BUFFER_SIZE = JSON_ARRAY_SIZE(22) * 6 + JSON_OBJECT_SIZE(6) + JSON_OBJECT_SIZE(1) + CHAR_LENGTH; // 3822
  
  StaticJsonBuffer<BUFFER_SIZE> jsonBuffer;
  JsonObject& root = jsonBuffer.parseObject(json);

  if (!root.success()) {
    Particle.publish("parseObject() failed", String(System.freeMemory()));
  } else {
    Particle.publish("parseObject() success!", String(System.freeMemory()));
  }
}

// This code fails for me, causing the Photon to flash red. 

The library definitely works.

I’ve determined that a smaller test JSON (length 284) parses perfectly when the buffer is set to 3172 or 2888 (both work).

char small_json[] = "{\"history\":{\"brush twice\":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0],\"dont murder\":[1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],\"no sweets\":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0],\"workout\":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0],\"sleep by 12am\":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0]}}";

System.freeMemory() reports 60908 before the buffer is created, and 60960 upon success.

In the particle console I see the webhook response containing the entire json string, and on the Photon I’m publishing the length of that string to confirm it is what I expect.


So far I’m sure that:

  • The parser works for smaller JSONs of similar structure.
  • The entire JSON char[] is making it to the handler’s const char* data
  • Reducing the buffer size significantly below the size calculated on the Photon will cause the parse to fail.
  • The Photon’s max buffer size for me is somewhere between 3284 and 3920

It wouldn’t be that hard to parse that yourself without the library if all the data looks like what you are showing. Does the data always show those 5 arrays in that order? Is it only the values in the array that change?

Yup and nope. It’ll always show the arrays in that order, but only as long as I don’t change the order of the objects in NodeJS. Since it’s JSON, the object order isn’t guaranteed and could pretty easily change.

Since I have control over the format, I'd change it to the structure below if I'm going to parse it myself :stuck_out_tongue:

"0000000000000000000110,0000000000000000000000,0000000000000000000010,0000000000000000000000,0000000000000000000010,0000000000000000000000"

My curiosity is what has kept me on the JSON/SparkJson path. I was motivated to collect all of this data to understand what the limits were for the library, and work around them or with them. (It’s been a dream of mine to parse JSON in Particle devices since the Core came out)