Spark.variable "Variable not found" issue - Solved

I am having issues with the following code to update a Spark variable. The non-testing version of this worked earlier today. Does anyone have any ideas?

Many thanks,

  • Brian

HTTP request:

https://api.spark.io/v1/devices/"my_device"/result?access_token="my_token"

Response:

{
   "ok": false,
   "error": "Variable not found"
}

Code snippet:

char resultstr[128] = "testing right now";
Spark.variable("result", &resultstr, STRING); 
sprintf(resultstr, "test");

Full code:


 /* 
Publish test
*/

char resultstr[128] = "testing right now"; //Output string

void setup() {
    //Implicit online connection occurs here
    Spark.publish("Reset", "Starting up...", 60, PRIVATE); //Message to particle.io dashboard that we are starting up
    //Constant reboot recovery: OTA firmware won't update without this
    for (int i = 0; i <= 5; i++) { //Add ~12 seconds to bootup; the repeated short delays give a pending OTA firmware update a chance to apply
        delay(2000);
    }
    Spark.variable("result", &resultstr, STRING); //Create online variable to output time/temp result
}


void loop() {
  Spark.publish("Test", "test", 60, PRIVATE); //Message to particle.io dashboard
  sprintf(resultstr, "test"); //Update the buffer backing the Spark variable
  
  delay(10000); //10 second loop delay
  
  //reboot logic - mitigation for connection issues
  if (millis() > 5*60*60000UL) { //5 hours * 60 min/hour * 60,000 millis/min (Note: millis() starts at 0 on boot)
     Spark.publish("Reset", "Time to reboot...", 60, PRIVATE); //Message to particle.io that we are shutting down
     System.reset(); //Like hitting the reset button on the Photon
  }
}

First off, drop the "&" before the char array name: an array already decays to the pointer type that Spark.variable expects for a STRING.

  Spark.variable("result", resultstr, STRING); //Create online variable to output time/temp result

Now make sure "my_device" and "my_token" are the hex values for your device ID and access token (both are available in the web IDE). You can also use the name you gave your device, such as PhotonOne, in place of the device ID. Either way, you don't need the double quotes in the URL.

bko,

Thanks for your suggestion. I was able to solve this issue by moving the Spark.variable call to the beginning of the setup() function.
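For anyone hitting the same error, a minimal sketch of the working order of operations, combining both fixes (this is an illustrative Particle/Spark firmware fragment, not Brian's exact final code):

```
char resultstr[128] = "testing right now"; //Output string

void setup() {
    //Register the cloud variable first, and pass the array without "&"
    Spark.variable("result", resultstr, STRING);
    Spark.publish("Reset", "Starting up...", 60, PRIVATE);
    for (int i = 0; i <= 5; i++) { //~12 seconds for a pending OTA update
        delay(2000);
    }
}
```

Registering the variable at the top of setup() means the cloud knows about "result" as soon as the device connects, instead of after the publish and the delay loop.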

  • Brian