Cloud API is Buggy

I’ve been really trying to get up and running with some applications that use Spark.variable and Spark.function, but I’m hitting way too many issues on the cloud API.

Given the following code, here are some issues I’ve noticed.

int light;

void setup() {
    pinMode(D6, OUTPUT);   // indicator pin, used to verify the flash succeeded
    pinMode(A7, INPUT);
    Spark.variable("light", &light, INT);
}

void loop() {
    digitalWrite(D6, HIGH);

    light = analogRead(A7);
}

The API isn’t reporting any variables:

$ curl "https://api.spark.io/v1/devices/<my_core_id>?access_token=<private>"      
{
  "id": "<my_core_id>",
  "name": "snowcone-alpha",
  "variables": [],
  "functions": []
}

Ok, well that’s weird… What happens when we ask for the variable anyway:

$ curl "https://api.spark.io/v1/devices/<my_core_id>/light?access_token=<private>"
{
  "cmd": "VarReturn",
  "name": "light",
  "TEMPORARY_allTypes": {
    "string": "\u0000\u0000\b\"",
    "uint32": 2082,
    "number": 2082,
    "double": null,
    "raw": "\u0000\u0000\b\""
  },
  "result": "\u0000\u0000\b\"",
  "coreInfo": {
    "last_app": "foo",
    "last_heard": "2013-12-24T10:53:10.065Z",
    "connected": false,
    "deviceID": "<my_core_id>"
  }
}

First of all, the core is reporting that it is not connected. Also, it seems to be struggling with the analog read.

What happens if I just read a random variable that doesn’t exist?

$ curl "https://api.spark.io/v1/devices/<my_core_id>/asdga?access_token=<private>"
{
  "cmd": "VarReturn",
  "name": "asdga",
  "TEMPORARY_allTypes": {
    "string": " \u0000P\u0000",
    "uint32": 536891392,
    "number": 536891392,
    "double": null,
    "raw": " \u0000P\u0000"
  },
  "result": " \u0000P\u0000",
  "coreInfo": {
    "last_app": "foo",
    "last_heard": "2013-12-24T10:54:40.240Z",
    "connected": false,
    "deviceID": "<my_core_id>"
  }
}

Ok, well that was unexpected. I should be getting an error here.

I had similar difficulty with the Spark.function function.

Hi @nixpulvis - while I agree that the API output should be clearer, this all looks like the type of output you would receive when your Core is not online. That would explain why you don’t see a list of variables/functions, and why it says “connected”: false. If you show the verbose output of the curl request for the variable, does it return a 404?

Not quite.
In the first example given, the core is connected and should have returned the variables.
In testing I found that when the core is not connected, the request for variables returns an empty list and a 200 response.
Once this has happened, subsequent requests when the core is online still return the empty list.
Some form of caching, perhaps?

In the second example, there was a return value from the core; it’s just that the int is not translated correctly.
I know there is an issue with the variable system, and that the current output is a temporary fix. I think @zachary is working on this, so I would assume this will change in the near future.
The “connected”: false info is either wrong or not about the core being connected to the cloud; it always displays this text.
In testing I found that the actual unconnected behaviour is quite different: when not connected, curl responds with “error": "Timed out.” and a 408.

The third example looks to me like a default value. This might be an artifact of the temporary fix that is in place now. Once fixed, I would assume the outcome would be an error message.
In testing this resulted in a status 200.

I am noticing issues where if I query for a variable, the “last_heard” field is incrementing on each query, but “connected” reports false. The variable reports the value I expect. Also I can post to a function and the function performs the action I expect. The result of the function POST reports “connected” as true.

Can you try this:

char lightValueAsString[10];
sprintf(lightValueAsString, "%d", light); // convert the analog value to a string
Spark.variable("light", &lightValueAsString, STRING);

and share the result?

Needs a bit of modification, but this does work.

Remember you’re providing a pointer to the memory you want the Spark library to read when the variable is requested. If you allocate the memory on the stack in setup(), the pointer is invalid when actually accessed by the Spark library sometime after setup() returns. Placing the string allocation at module scope in the application.cpp file allocates the memory in the .bss region, so it stays valid.

Also, since you’re passing a string, the string must be null-terminated. Otherwise the software reads off into other memory looking for the end of the string. The first time I tried this the core locked up because I didn’t ensure the string was terminated before registering it with Spark.variable().

When allocating a C string, you’re actually creating an array. Thus a pointer to the base of the array is either lightValueAsString or &lightValueAsString[0].

If you implement this as:

char lightValueAsString[10];

/* This function is called once at start up ----------------------------------*/
void setup()
{
  uint8_t light = 42;

  lightValueAsString[0] = '\0';
  Spark.variable("light", lightValueAsString, STRING);
  sprintf(lightValueAsString, "%d", light);
}

The result of the query is:

{
  "cmd": "VarReturn",
  "name": "light",
  "TEMPORARY_allTypes": {
    "string": "42",
    "uint32": null,
    "number": null,
    "double": null,
    "raw": "42"
  },
  "result": "42",
  "coreInfo": {
    "last_app": "foo",
    "last_heard": "2013-12-24T18:37:19.075Z",
    "connected": false,
    "deviceID": "53ff70065067544848500587"
  }
}

Of course it’s still reporting “connected”: false, which I don’t quite understand.

$ curl "https://api.spark.io/v1/devices?access_token=<private>"
[
  {
    "id": "<id>",
    "name": "snowcone-alpha",
    "last_app": null,
    "connected": true
  }
]

The part about "last_app": null is concerning.

Let’s see what it says about my core.

$ curl "https://api.spark.io/v1/devices/<id>?access_token=<private>" -v
* Adding handle: conn: 0x7faaf2004000
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* - Conn 0 (0x7faaf2004000) send_pipe: 1, recv_pipe: 0
* About to connect() to api.spark.io port 443 (#0)
*   Trying 54.236.123.218...
* Connected to api.spark.io (54.236.123.218) port 443 (#0)
* TLS 1.0 connection using TLS_RSA_WITH_AES_128_CBC_SHA
* Server certificate: api.spark.io
* Server certificate: COMODO High-Assurance Secure Server CA
* Server certificate: AddTrust External CA Root
> GET /v1/devices/48ff6c065067555024221587?access_token=<private> HTTP/1.1
> User-Agent: curl/7.30.0
> Host: api.spark.io
> Accept: */*
> 
< HTTP/1.1 200 OK
< Access-Control-Allow-Origin: *
< Content-Type: application/json; charset=utf-8
< Date: Tue, 24 Dec 2013 23:18:49 GMT
* Server nginx/1.4.2 is not blacklisted
< Server: nginx/1.4.2
< X-Powered-By: Express
< Content-Length: 104
< Connection: keep-alive
< 
{
  "id": "<id>",
  "name": "snowcone-alpha",
  "variables": [],
  "functions": []
* Connection #0 to host api.spark.io left intact
}

Ok well I got a 200 but it still says no variables.

$ curl "https://api.spark.io/v1/devices/<id>/light?access_token=<private>"   
{
  "cmd": "VarReturn",
  "name": "light",
  "TEMPORARY_allTypes": {
    "string": "\u0000\u0000\b�",
    "uint32": 2240,
    "number": 2240,
    "double": null,
    "raw": "\u0000\u0000\b�"
  },
  "result": "\u0000\u0000\b�",
  "coreInfo": {
    "last_app": "foo",
    "last_heard": "2013-12-24T23:21:00.980Z",
    "connected": false,
    "deviceID": "<id>"
  }
}

Hi @nixpulvis - are you sure that your code was successfully flashed to the Core, and that you are communicating with the same Core that you flashed?

Yes, and yes. The reason for turning D6 on was to verify this. I’m also pretty sure I’m making https requests to the right core, because I’m copying the ID straight from the browser.

Variables are not currently reported by Cores. This is a feature in progress as @roderikv says.

The last_app key is also a planned feature that’s not fully implemented, so don’t let it concern you.

How in the world would one (a new user) know this? Is there somewhere one can go to find the build status of the Spark Core functions? I know it is still early days and I must be patient, but it is difficult: if I write some code and it doesn’t work, I assume I did something wrong and then spend a lot of time trying to understand what is incorrect, only to find that a function is not yet supported... I was a very early adopter of the ChipKit Uno32 and understand the scope of what is being attempted by the Spark team.

On the bright side, I really like this Spark!! A great idea and so far very well implemented.

David G.


Thanks for the feedback @sierrasmith71! We’ve heard that folks would love to know what we’re working on in the moment. Right now, the best way is to check blog.spark.io where we summarize new features at the end of each sprint.

They are now! :slight_smile: This is the result of a request to https://api.spark.io/v1/devices/xxxxxxxx

{
  "id": "xxxxxxxxxxxxxxxxxxxxxxxxxx",
  "name": null,
  "variables": {
    "uptime": "int32",
    "data": "int32"
  },
  "functions": []
}


And how do we get the value rather than seeing “int32”?

When you call Spark.variable(), it will return whatever you have programmed it to return.

http://docs.spark.io/firmware/#spark-variable