Using Spark.publish() with Simple JSON Data

After reformatting your code, it looks like there is also an error with your curly braces in the last function.

You are using "%u" but have signed ints as the receiving variables. Is this intentional?
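For reference, the format specifier needs to match the variable type. A minimal sketch, assuming you read the values with sscanf() into plain ints (the variable names here are just placeholders):

    char receivedString[] = "123,456";   // placeholder input
    int lightValue = 0;
    int soundValue = 0;
    // "%u" expects a pointer to unsigned int; with signed int variables use "%d"
    sscanf(receivedString, "%d,%d", &lightValue, &soundValue);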

analogWrite() on a non-DAC pin like your power pin (A5) supports a maximum of 255, not the 4095 you use here:

    analogWrite(power,4095);
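A minimal sketch of the two options, assuming a Photon (where the true DAC pins are DAC1/A6 and DAC2/A3):

    analogWrite(power, 255);    // PWM pin: 0..255 (full duty cycle)
    // analogWrite(DAC1, 4095); // only a true DAC pin accepts 0..4095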

What's the point of the first sprintf() in this? The second call overwrites message before it is ever published.

    if (needUpdate == 1) {
        sprintf(message,"%d,%d,%d,%d",lightValue,soundValue,moveValue,pushValue);
        sprintf(message,"%d",lightValue);
        Particle.publish(myName,message);
        needUpdate = 0;
    }
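In case it helps, this is how I'd expect that block to look with just the combined sprintf() (a sketch based on your own variable names):

    if (needUpdate == 1) {
        // one sprintf() is enough; the extra call in your version overwrites message
        sprintf(message, "%d,%d,%d,%d", lightValue, soundValue, moveValue, pushValue);
        Particle.publish(myName, message);
        needUpdate = 0;
    }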

What exactly "don't work"?

    Particle.publish("debug2",message); // HERE IT DON'T WORK...

I can't see this in your post.
What does the string you actually receive look like?
What do you see published as debug1 and debug2?
We can guess what it should look like from reading the code, and we could flash your code to our own devices to try, but since you already have that data, just post it, along with a description of what you'd expect it to look like.
