InfluxDB/Telegraf Webhook JSON error

I am getting an error from Telegraf when I send information from Particle through the webhook. I publish the following in my Photon’s firmware:

int c=10;
String data=String::format("{ \"tags\" : {\"id\": \"%s\", \"location\": \"%s\"}, \"values\": {\"capacity\": %d}}", "test", "myLoc", c);
Particle.publish("capacity", data, PRIVATE);

This publishes valid JSON. In the console, I see the Photon publishing the following:

{"data":"{ \"tags\" : {\"id\": \"test\", \"location\": \"myLoc\"}, \"values\": {\"capacity\": 10}}","ttl":60,"published_at":"2018-06-21T03:15:15.834Z","coreid":"3c0038000c47363433353735","name":"capacity"}

This is the JSON template I have set up for the webhook:

{
  "event": "{{{PARTICLE_EVENT_NAME}}}",
  "data": "{{{PARTICLE_EVENT_VALUE}}}",
  "coreid": "{{{PARTICLE_DEVICE_ID}}}",
  "published_at": "{{{PARTICLE_PUBLISHED_AT}}}",
  "measurement": "influxdata_sensors"
}

However, when I run ‘systemctl status telegraf’ in the terminal of my AWS instance, I see an error: “json: cannot unmarshal string into Go struct field event.data of type particle.data”

I’m pretty sure I followed all the necessary steps, but I’m not sure why Telegraf cannot parse the JSON that is sent to it. I took a look at the telegraf.go code on GitHub. The event struct and data struct look like this:

type event struct {
	Name        string `json:"event"`
	Data        data   `json:"data"`
	TTL         int    `json:"ttl"`
	PublishedAt string `json:"published_at"`
	Database    string `json:"measurement"`
}

type data struct {
	Tags   map[string]string      `json:"tags"`
	Fields map[string]interface{} `json:"values"`
}

I think Telegraf can’t parse the data field because it arrives as a string (from the Photon) rather than as an object it could unmarshal into the maps of the data struct.
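If I read those structs right, Telegraf would only accept a payload where data is a nested object rather than a string, i.e. something along these lines (hand-written, values purely illustrative):

{
  "event": "capacity",
  "data": { "tags": { "id": "test", "location": "myLoc" }, "values": { "capacity": 10 } },
  "ttl": 60,
  "published_at": "2018-06-21T03:15:15.834Z",
  "measurement": "influxdata_sensors"
}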

Please let me know what I can do to fix this. Thanks!

You need to escape embedded double quotes in strings (which wasn't visible before reformatting).
I've reformatted your code so the escaped characters show up correctly.

See here on how to do that

ScruffR,
I did escape the embedded double quotes in the string I’m publishing. Anyway, it still gives the same error even after I put in the reformatted code you provided.

Which JSON is being sent to Telegraf? Is it the JSON in the form of the string I’m publishing in the firmware:

{"data":"{ \"tags\" : {\"id\": \"test\", \"location\": \"myLoc\"}, \"values\": {\"capacity\": 10}}","ttl":60,"published_at":"2018-06-21T03:15:15.834Z","coreid":"3c0038000c47363433353735","name":"capacity"}

or is it the JSON I configured in the Particle webpage for webhooks:

{
  "event": "{{{PARTICLE_EVENT_NAME}}}",
  "data": "{{{PARTICLE_EVENT_VALUE}}}",
  "coreid": "{{{PARTICLE_DEVICE_ID}}}",
  "published_at": "{{{PARTICLE_PUBLISHED_AT}}}",
  "measurement": "influxdata_sensors"
}

Seeing the same problem here.

@pique did you ever try the test button in the Particle console? That is failing for me, so I’ve not actually tried generating data on the Particle device yet.

I’d not rely on the result of the TEST button since it only sends a “blank” request to the server which would be refused by most hosts.

I added some instrumentation to Telegraf, and this is what is coming in:

{"measurement":"templogger","data":"{\"tags\":{\"id\":\"bec\"},\"values\":{\"temp\":77.54000092,\"hum\":54.60058594}}"}

So data looks like a string instead of a JSON object, hence the warning:

json: cannot unmarshal string into Go struct field event.data of type particle.data.

It’s not obvious to me how to instruct the Particle cloud to send JSON that looks like this:

{"measurement":"templogger","data":{"tags":{"id":"bec"},"values":{"temp":77.54000092,"hum":54.60058594}}"}

If I print out what the device is sending, I see:

{"tags":{"id":"bec"},"values":{"temp":77.5039978,"hum":52.69042969}}

So I’m not sure where the escaping of the " characters is happening.

It appears that data is always sent as a string (Publish/Subscribe and JSON data in events), so it’s puzzling to me how the InfluxData Telegraf webhook can ever work. Did something change, or am I missing something?


cbrake,

Yeah, I still haven’t been able to figure it out.

You’re right: I think it has to do with the data being sent as a String, not a JSON object that can be parsed correctly. Hopefully we can get some advice soon.

There might be a misconception with escaped characters.
When you construct a string (in C/C++, Java, ...) that contains double quotes, you write "...\"..." to instruct the compiler not to treat the nested double quote as a string delimiter but to inject the following character as-is. However, the backslash (\) never becomes part of the created string.
Consequently, the receiving end also does not see that backslash in the string, but depending on the implementation it might display the string with the escape character (\) in place or not; that's up to the developer's intent (display the raw string vs. display it as it would be written in code).
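A quick illustration (a minimal sketch, assuming a Particle/Wiring environment - nothing here is specific to your code):

#include "Particle.h"

void setup() {
    Serial.begin(9600);
    const char* s = "He said \"hi\"";   // escaped quotes in the source code
    Serial.println(s);                  // prints: He said "hi"  (no backslashes)
    Serial.println((int)strlen(s));     // prints: 12 - the backslashes were never stored
}

void loop() {
}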

What may pose a problem tho' is when you are sending or seeing mixed strings like the one you posted above.

Some double quotes are escaped, others aren't - that's inconsistent and hence bound to fail.

To pinpoint the origin of the problem you may need to show the code that constructs and publishes the string and also how your webhook is declared. Only the combination of that may reveal where the problem gets introduced.
One thing to make sure of with a JSON payload is that your webhook should most likely be set to
Request Format: JSON and Advanced Settings -> JSON DATA: Custom in order to properly unpack the event data and forward it to the server.

ScruffR,

It seems like I am making the same mistake as cbrake. The actual JSON that is being published has a mix of escaped quotes and regular quotes. As you said, some double quotes are escaped and others aren’t, causing the JSON to not be unmarshaled:

{"data":"{ \"tags\" : {\"id\": \"test\", \"location\": \"myLoc\"}, \"values\": {\"capacity\": 10}}","ttl":60,"published_at":"2018-06-26T22:36:24.431Z","coreid":"3c0038000c47363433353735","name":"capacity"}

This is how my current Custom webhook configuration is set up:

{
  "event": "{{{PARTICLE_EVENT_NAME}}}",
  "data": "{{{PARTICLE_EVENT_VALUE}}}",
  "coreid": "{{{PARTICLE_DEVICE_ID}}}",
  "published_at": "{{{PARTICLE_PUBLISHED_AT}}}",
  "measurement": "influxdata_sensors"
}

How should I change this to unpack the event data correctly?

In a custom JSON you should use the nested fields of {{{PARTICLE_EVENT_VALUE}}} to create your own custom JSON.
Currently there seems to be an issue with webhook management, so I can’t set up a test webhook to manufacture a showcase.

It would be very helpful if you could provide an example.

How can I access the nested fields of {{{PARTICLE_EVENT_VALUE}}} and accordingly change my custom JSON?

It depends on whether you can accept your numeric values being wrapped in double quotes or not.
If so, you can use Custom JSON; otherwise you need to use Custom Body like this

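Something along these lines (purely illustrative, since I can only sketch it here; the mustache references {{{id}}}, {{{loc}}} and {{{cap}}} are assumed to resolve to the same-named fields of the published JSON, matching the code below):

{
  "event": "{{{PARTICLE_EVENT_NAME}}}",
  "coreid": "{{{PARTICLE_DEVICE_ID}}}",
  "published_at": "{{{PARTICLE_PUBLISHED_AT}}}",
  "measurement": "influxdata_sensors",
  "data": {
    "tags": { "id": "{{{id}}}", "location": "{{{loc}}}" },
    "values": { "capacity": {{{cap}}} }
  }
}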
And write the code somewhere along the line of this

  // id and loc hold the tag values, i is a running counter - all declared elsewhere in the sketch
  char data[255];
  snprintf(data
          ,sizeof(data) 
          ,"{\"id\": \"%s\" ,\"loc\": \"%s\", \"cap\": \"%d\"}"
          ,            id   ,           loc  ,           i++
          );
  Particle.publish("RequestBin", data, PRIVATE);

With a custom JSON/Body template you can keep the data used for the actual event publish as short as possible, since the static extras (like your tags and values “containers” or any other static field you may need) can be added by the cloud via the template specification.

So the above code produces this event, and the webhook integration results in this data on the receiving server.

I was just struggling with this and finally found a solution that I don’t see mentioned here.
As stated above by @cbrake, the value of data is being passed as a string instead of a JSON object. All the examples show the JSON data format looking like this:

...
  "data": "{{{PARTICLE_EVENT_VALUE}}}",
...

If you simply remove the quotes around the PARTICLE_EVENT_VALUE variable, the data is sent as a JSON object and the Telegraf parser works. Here is my working JSON format:

{
  "event": "{{{PARTICLE_EVENT_NAME}}}",
  "data": {{{PARTICLE_EVENT_VALUE}}},
  "published_at": "{{{PARTICLE_PUBLISHED_AT}}}",
  "measurement": "particle"
}

Note these caveats:

  • I can’t tell how measurement is used. The measurement in the InfluxDB is actually the event name in my experience.
  • The Webhooks JSON editor will show a red X on your data line indicating a Bad string. You can ignore this. It will still work.

Hey, I couldn’t make it work.
Here is my webhook detail:


As mentioned in the docs, the URL ends with http://details/particle
The event name is particle_sensor.

Here is the firmware

int i=0;
char data[255];
void setup() {
    
}

void loop() {

int c=10;
String data=String::format("{ \"tags\" : {\"id\": \"%s\", \"location\": \"%s\"}, \"values\": {\"capacity\": %d}}", "test", "myLoc", c);
Particle.publish("particle_sensor", data, PRIVATE);
Particle.publish("data", "data", PRIVATE);
  // Wait for 5 seconds
  delay(5000);

}

Is there anyone who could help me with this?
Regards

@Rahul_G, can you tell me what specifically is not working? Are you seeing the event in the Particle event logs indicating that your device is successfully publishing and the webhook is seeing it? If so, do you see any attempted communication with your telegraf instance? Are you specifying the port in the URL (e.g. http://url:1619/particle)? I found it useful to look at the logs on my telegraf instance once I knew I was getting that far.

Be sure to check what database telegraf is writing to (the database argument in telegraf.conf). The measurement in that database will then match the event argument in your webhook.
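For reference, the relevant pieces of my telegraf.conf look roughly like this (host, port and database name are just examples, not necessarily what you are running):

[[outputs.influxdb]]
  urls = ["http://127.0.0.1:8086"]
  database = "telegraf"        # measurements from the webhook end up in this database

[[inputs.webhooks]]
  service_address = ":1619"    # port referenced in the webhook URL

  [inputs.webhooks.particle]
    path = "/particle"         # path, i.e. http://host:1619/particle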

@trpropst Thanks for the response. The event is not even publishing in the Particle console. I think this is because of some configuration issue in the webhook.
We moved from Telegraf (for InfluxDB) to Azure IoT Hub and everything seems to be working fine.

  • I think this is because of some configuration issue in the webhook.

Probably not. You'd usually see the event published in the console whether or not the webhook works.
The webhook and the console subscribe to the events in parallel, not sequentially.

Either your event is not published by the device at all, or you are looking at the wrong device or account in the console.
To check which, go to the Particle Console and put your device in Safe Mode. If you see the system events from your device, you are logged in with the correct account.
Then reset, run your code, and see whether your expected events show up. If not, your code isn't publishing as it should.

This is how I came to the conclusion.
This is not working:
Particle.publish("particle_sensor", "data", PRIVATE);
but this works fine:
Particle.publish("data", "data", PRIVATE);
I have only one device in the account I used for flashing.

@rickkas7 “A device may not publish events beginning with a case-insensitive match for ‘spark’. Such events are reserved for officially curated data originating from the Cloud.” (from here)

Is perhaps “particle” also an excluded name?
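A quick way to test that theory (just a sketch of the idea, I haven't verified it) would be to publish the exact same payload under a name that doesn't start with either prefix, e.g.:

Particle.publish("sensor_capacity", data, PRIVATE);  // hypothetical event name, not starting with "spark"/"particle"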
