The JSON file has a LOT of garbage in it that I do not need. In fact, the JSON that my web API returns goes over the character limit that the Particle cloud allows. The JSON contains 4 arrays of data, and I only need the objects contained in one of them, an array named "data". I am using a mustache template to get the objects from that array. My mustache template looks like this: "{{#data}}". This template returns "[object Object],[object Object],[object Object],[object Object],[object Object],[object Object]" as a string. As you can see, there are 6 objects inside the data array. Is it possible to get mustache to return nested JSON data in this manner?
Yes. It will be something like {{data.0.XXX}}, where XXX is the inner element name and 0 is the first array index, 1 is the second array index, etc.
I wrote up a tutorial here:
That would work if I wanted to return specific individual values. I need the mustache template to return the full JSON, if that's even possible.
I need to return (the data array):
[
  { "key": 0, "message": "message 0" },
  { "key": 1, "message": "message 1" },
  { "key": 2, "message": "message 2" },
  { "key": 3, "message": "message 3" }
]
FROM
{
  "junk": [
    { "key": 0, "junk": "stuff" },
    { "key": 1, "junk": "stuff" },
    { "key": 2, "junk": "stuff" }
  ],
  "data": [
    { "key": 0, "message": "message 0" },
    { "key": 1, "message": "message 1" },
    { "key": 2, "message": "message 2" },
    { "key": 3, "message": "message 3" }
  ]
}
Using the template {{#data}} returns:
[object Object],[object Object],[object Object],[object Object]
Okay, I think I have a better understanding now. As long as the number of elements in the array is fixed and you only want message, you could use something like this:
[
  { "key": 0, "message": "{{data.0.message}}" },
  { "key": 1, "message": "{{data.1.message}}" },
  { "key": 2, "message": "{{data.2.message}}" },
  { "key": 3, "message": "{{data.3.message}}" }
]
Otherwise, it’s probably not possible.
I need to cache the keys and the messages from the web server on the Photon at startup, in a map. That way I only need to hit the web server once (this is for reliability): if my web server goes down and the Photon needs the messages, they will be nicely cached in a C++ object. It's probably worth mentioning that the keys are not always going to be sequential. If they were, this would be much easier!
Plan A:
I used mustache to render the keys and messages into a multi-delimited string, e.g.:
"0~Normal|1~Test|2~Trouble|4~Alarm"
That part was easy; however, parsing this into vectors and then pushing those vectors into a map using strtok() proved to be way over my head. Any other suggestions would be greatly appreciated. Thanks for your help so far.
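In case it helps anyone following along: the delimited string can be parsed straight into a std::map, skipping the intermediate vectors entirely. This is only a sketch under the format shown above ('|' between entries, '~' between a key and its message); parsePairs is an illustrative name, not from any library.

```cpp
#include <cstdlib>
#include <cstring>
#include <map>
#include <string>

// Parse "0~Normal|1~Test|2~Trouble|4~Alarm" into a key -> message map.
// '|' separates entries; '~' separates each key from its message.
// Keys need not be sequential, since std::map stores them individually.
std::map<int, std::string> parsePairs(const char *dataIn) {
    std::map<int, std::string> result;
    char *dataCopy = strdup(dataIn);   // strtok() modifies the string it parses
    for (char *entry = strtok(dataCopy, "|"); entry != NULL;
         entry = strtok(NULL, "|")) {
        char *sep = strchr(entry, '~');          // split each entry on '~'
        if (sep != NULL) {
            *sep = '\0';                         // terminate the key part
            result[atoi(entry)] = sep + 1;       // message follows the '~'
        }
    }
    free(dataCopy);
    return result;
}
```

Using strchr() for the inner split avoids nesting two strtok() loops, which would not work since strtok() keeps a single piece of internal state.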
What about something like this:
Sample data:
{
  "junk": [
    { "key": 0, "junk": "stuff" },
    { "key": 1, "junk": "stuff" },
    { "key": 2, "junk": "stuff" }
  ],
  "data": [
    { "key": 0, "message": "Normal" },
    { "key": 3, "message": "Alarm" },
    { "key": 1, "message": "Test" },
    { "key": 2, "message": "Trouble" }
  ]
}
Mustache template:
{{data.0.key}},{{data.0.message}},{{data.1.key}},{{data.1.message}},{{data.2.key}},{{data.2.message}},{{data.3.key}},{{data.3.message}}
Sample decoded data:
0,Normal,3,Alarm,1,Test,2,Trouble
Code:
#include "Particle.h"

void parseInput(const char *dataIn);

void setup() {
    Serial.begin(9600);
    parseInput("0,Normal,3,Alarm,1,Test,2,Trouble");
}

void loop() {
}

void parseInput(const char *dataIn) {
    // strtok() modifies the string it parses, so work on a copy
    char *dataCopy = strdup(dataIn);
    char *cp = strtok(dataCopy, ",");

    for (size_t ii = 0; ii < 4 && cp != NULL; ii++) {
        int key = atoi(cp);        // even-numbered tokens are the keys
        cp = strtok(NULL, ",");    // odd-numbered tokens are the messages
        if (cp != NULL) {
            Serial.printlnf("ii=%u key=%d data=%s", (unsigned int)ii, key, cp);
            cp = strtok(NULL, ",");
        }
    }

    free(dataCopy);
}
Serial output:
ii=0 key=0 data=Normal
ii=1 key=3 data=Alarm
ii=2 key=1 data=Test
ii=3 key=2 data=Trouble
After several hours, I finally figured it out. Because the number of messages is variable, and the keys are not always sequential, I ended up having to parse
"1`message1~2`message2~3`message3~4`message4"
into a std::map.
It works, but now my brain hurts.
Thanks for your help! This is an awesome community.
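(For anyone landing here later: once the messages are cached in a std::map, looking one up by key might look like the sketch below. lookupMessage and the "unknown" fallback are illustrative, not from the actual code above.)

```cpp
#include <map>
#include <string>

// Look up a message by key in the cached map, returning a fallback if the
// key is absent. find() avoids silently inserting an empty entry the way
// operator[] would on a missing key.
std::string lookupMessage(const std::map<int, std::string> &messages, int key) {
    std::map<int, std::string>::const_iterator it = messages.find(key);
    return (it != messages.end()) ? it->second : "unknown";
}
```

Because the map is keyed by the integer IDs themselves, it doesn't matter that the keys are non-sequential or that the count varies.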