emc2
January 5, 2014, 7:56pm
1
I’m trying to read a variable value using the Cloud API, but everything I read seems to come back as the same raw value of " P". Any idea as to why?
https://api.spark.io/v1/devices/ /counter1?access_token=(will change token at a later date, try now for troubleshooting)
Code:
int counter1 = 0;
int counter2 = 53;

void setup() {
    pinMode(D2, OUTPUT);
}

void loop() {
    // blink the LED on D2, then bump both counters
    digitalWrite(D2, HIGH);
    delay(250);
    digitalWrite(D2, LOW);
    delay(1000);
    counter1++;
    counter2++;
}
Response:
{
  "cmd": "VarReturn",
  "name": "counter1",
  "TEMPORARY_allTypes": {
    "string": " P",
    "uint32": 536891392,
    "number": 536891392,
    "double": null,
    "float": 1.0868491504456741e-19,
    "raw": " P"
  },
  "result": " P",
  "coreInfo": {
    "last_app": "",
    "last_heard": "2014-01-05T19:39:03.987Z",
    "connected": true,
    "deviceID": "53ff67065067544825071287"
  }
}
If you check that URL out in a browser, the raw value actually comes back as " \u0000P\u0000"; I guess the app I'm using for GET/POST requests renders the unicode escapes.
Why does the GET not return the proper value of counter1 or counter2?
BDub
January 5, 2014, 8:00pm
2
You need to add these two lines to your setup():
Spark.variable("counter1",&counter1,INT);
Spark.variable("counter2",&counter2,INT);
It should work after that.
Also, if you ever see that " P" string, you know there's an error in how you are trying to read the variable. It should probably give you a proper error saying the requested varKey is undefined.
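Putting it together, your sketch from above with the variables registered would look something like this (same blink code, just the two Spark.variable() calls added):
int counter1 = 0;
int counter2 = 53;

void setup() {
    pinMode(D2, OUTPUT);
    // register the variables with the cloud so GET /v1/devices/<id>/counter1 can find them
    Spark.variable("counter1", &counter1, INT);
    Spark.variable("counter2", &counter2, INT);
}

void loop() {
    digitalWrite(D2, HIGH);
    delay(250);
    digitalWrite(D2, LOW);
    delay(1000);
    counter1++;
    counter2++;
}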
emc2
January 5, 2014, 8:03pm
3
Ugh… the learning, it hurts so good! This has probably been asked a million times, but is there a section of documentation that contains this particular feature? I can’t shake the feeling that I’m missing something huge.
Thanks a ton for the pointer
BDub
January 5, 2014, 8:04pm
4
No problem! Here’s a link to the Core Firmware Reference:
http://docs.spark.io/#/firmware/cloud-spark-variable
1 Like
emc2
January 5, 2014, 8:06pm
5
Holy… yep, missed those examples. You guys rock, thanks again!
1 Like
emc2
January 5, 2014, 8:20pm
6
So… uh… do float values work?
Code:
float h; // humidity
Setup:
Spark.variable("h",&h,FLOAT);
Error:
error: 'FLOAT' was not declared in this scope
BDub
January 5, 2014, 8:29pm
7
Nope, but DOUBLE will compile… however it won't give you a valid value from the cloud yet. A fix is in the works!
For now you can do this and send your DOUBLE or FLOAT as a string:
double temperature = 1.0;
char myStr[10];   // string copy of the value, exposed to the cloud

void setup() {
    Spark.variable("read", &myStr, STRING);
}

void loop() {
    temperature += 0.5;
    // format the double into the string the cloud variable points at
    sprintf(myStr, "%.2f", temperature);
    delay(500);
}
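The same pattern should work for your humidity float, something like this (the "humidity" name and hStr are just placeholders, and I'm assuming you update h from your sensor somewhere in your code):
float h = 0.0;     // humidity reading from your sensor
char hStr[10];     // string copy of h for the cloud

void setup() {
    Spark.variable("humidity", hStr, STRING);
}

void loop() {
    // after updating h from the sensor, refresh the string copy
    sprintf(hStr, "%.2f", h);
    delay(500);
}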
1 Like
emc2
January 5, 2014, 8:44pm
8
Absolutely beautiful. Humidor Tweetbox 1.0-beta complete! Many thanks
1 Like