Using spark.publish() json format

Greetings Particle community!

I have had great success using Spark.publish() and webhooks in tutorials. However, I seem to be stuck.
I'm now trying to POST to a different API. The JSON format should look like this:

    {
      "time": 233434,
      "temp": 84.555
    }

My current webhook (without the Bearer token) looks like this:

    {
        "eventName": "kelvin",
        "url": "",
        "requestType": "POST",
        "headers": {
            "Authorization": "Bearer 12345",
            "Content-Type": "application/json"
        },
        "json": {
            "time": "{{time}}",
            "temperature": "{{temperature}}"
        },
        "mydevices": true
    }

I think I feel good about the webhook, but I've been struggling on the firmware side. I've tried this (and many other things):

    #include "Adafruit_DHT/Adafruit_DHT.h"

    // Define Pins
    #define DHTPIN 2        // what pin we're connected to

    // Setup Sensor
    #define DHTTYPE DHT22   // DHT 22 (AM2302)
    DHT dht(DHTPIN, DHTTYPE);

    #define publish_delay 10000
    unsigned int lastPublish = 0;

    void setup() {
        RGB.brightness(10); // This sets the RGB LED brightness : 0-255
        RGB.control(false); // release the LED to the system
        dht.begin();        // Startup the sensor
    }

    void loop() {
        // grab some data
        int time = Time.now();
        float tempf = dht.getTempFarenheit();
        String jsonDataString = String( "{ \"time\":" + time + ",\"temperature\":" + tempf + "}");
        unsigned long now = millis();

        if ((now - lastPublish) < publish_delay) {
            // it hasn't been 10 seconds yet...
            return;
        }

        Spark.publish("kelvin", jsonDataString, 60, PRIVATE);
        lastPublish = now;
    }

The error message is: tempwebhook.cpp:42:64: error: invalid operands of types ‘const char*’ and ‘const char [16]’ to binary ‘operator+’

I feel like I'm really close. I think I need to convert the float and int to a String? I've tried that a few different ways, but then it's unclear to me how to format the correct JSON template posted above in my Spark.publish().
I've read through most of the documentation and I just can't seem to put this together.
Any suggestions or advice would be greatly appreciated.

Thanks in advance for your time!

I’ve edited your post to properly format the code. Please check out this post, so you know how to do this yourself in the future. Thanks in advance! ~Jordy

You’re almost there! You just need to format the numbers to String first.

    String jsonDataString = String( "{ \"time\":" + String(time) + ",\"temperature\":" + String(tempf) + "}");

One word of advice: watch the length of the JSON string. Spark.publish() can only send 63 characters at this time. If the JSON string is longer, it will be truncated and you will get an error while trying to parse it when receiving the webhook (ask me how I know…)


Thank you for your reply. Very helpful and exactly what I was looking for.
The hook response back is:

{"name":"kelvin","data":"{ \"time\": 1435000962,\"temperature\": 28.700003}","ttl":"60","published_at":"2015-06-22T19:22:42.929Z","coreid":"53ff71066667574852162467"}
{"name":"hook-response/kelvin/0","data":"{\"error\":\"invalid data\"}","ttl":"60","published_at":"2015-06-22T19:22:42.998Z","coreid":"undefined"}

And now I'm wondering if it's because of the length of the JSON string, as you had mentioned.

I’m not sure if the “invalid data” hook response comes from the Spark cloud or your application…

In my case I skirt this issue by using the JSON sent directly from the Spark Core without reformatting it in the webhook.json definition file. The trick is to use an empty “json” key in webhook.json.

My web app gets a POST with a field called “value” containing a string. On the server I do JSON.parse(value) to get the hash.

    {
        "eventName": "hm_",
        "url": "",
        "requestType": "POST",
        "auth": {
            "username": "particle",
            "password": "mypassword"
        },
        "json": {},
        "mydevices": true
    }