I have determined that using Spark.variable() with a string longer than 9 characters results in a timeout error.
The following code example alternates every 10 seconds: it sets the variable output
to a 9-character string and turns off the onboard D7 LED, then sets output
to a 10-character string and turns on the D7 LED.
#define UPDATE_INTERVAL 10000

bool state = false;
char output[20];
uint32_t lastTime = 0;

void setup() {
    Serial1.begin(9600);
    Spark.variable("read", &output, STRING);
    pinMode(D7, OUTPUT);
}

void loop() {
    if (millis() - lastTime > UPDATE_INTERVAL) {
        lastTime = millis();
        if (state) {
            strcpy(output, "1234567890"); // 10 chars
            Serial1.println(output);
        }
        else {
            strcpy(output, "123456789"); // 9 chars
            Serial1.println(output);
        }
        digitalWrite(D7, state); // LOW when 9 chars, HIGH when 10 chars
        state = !state;
    }
}
If you send a GET request:
https://api.spark.io/v1/devices/DEVICEID/read?access_token=ACCESSTOKENID
…for the output variable while the string is 9 chars, it returns fine:
{
  "cmd": "VarReturn",
  "name": "read",
  "TEMPORARY_allTypes": {
    "string": "123456789",
    "uint32": 825373492,
    "number": 825373492,
    "double": 1.0300843656201408e-71,
    "float": 2.593151471330657e-9,
    "raw": "123456789"
  },
  "result": "123456789",
  "coreInfo": {
    "last_app": "",
    "last_heard": "2014-01-06T02:16:03.710Z",
    "connected": true,
    "deviceID": "xxxxx"
  }
}
But if you request the output variable
when it is 10 chars or longer, after 10 seconds it returns a timeout error:
{
  "error": "Timed out."
}
About 15 seconds later the Core drops off the cloud momentarily (fast blinking cyan), then reconnects. Only after this happens can you successfully read the 9-character output
again. If you try to read it again right after the 10-char timeout, it will also time out.
Obviously it would be very nice to be able to read strings longer than 9 characters. One application would be generating a JSON string with multiple key:value pairs, then parsing it on the requesting side with something like var obj = JSON.parse(sparkString);