The problem is that the value comes back as 4 bytes instead of a single integer. The value you see above is actually the bytes 0,0,1,12, which stand for 268. Why don't I just get an integer back?
Cool, just wanted to make sure you guys were aware. At first I thought it was totally wrong before I realized it was a byte array. I also noticed that if you return a negative number from the Spark.function() call, you get back some super large number. Can it only be positive?
Thanks, we know about that one too. That's −1 being displayed as an unsigned 32-bit value (2^32 − 1) instead of −1, which it ought to show. Same basic issue, different face: we just have to translate all the values in the binary messages sent by the Core through the Cloud into user-facing JSON.
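For anyone curious about where the super large number comes from: it's almost certainly the 32-bit two's-complement bit pattern of the negative value being read back as unsigned. A quick sketch of the reinterpretation in Python (the helper name is mine):

```python
import struct

def as_signed_32(raw_value):
    """Reinterpret an unsigned 32-bit integer as signed two's complement."""
    return struct.unpack(">i", struct.pack(">I", raw_value))[0]

# A firmware return value of -1 shows up in the API as 4294967295:
print(as_signed_32(4294967295))  # -> -1
print(as_signed_32(4294967196))  # -> -100
```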
Just wanted to add that I hit this same issue too. I was about to start doing power-of-2 stuff on the resulting array (since 0,0,1,12 = 0 + 0 + 1×256 + 12 = 268) but will wait till the end of the week.
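If anyone does want to combine the bytes themselves in the meantime, the array appears to be big-endian, so each byte is shifted by a power of 256 rather than summed directly. A minimal Python sketch (the helper name is mine):

```python
def bytes_to_int(byte_values):
    """Combine a list of big-endian bytes into one unsigned integer."""
    result = 0
    for b in byte_values:
        result = (result << 8) | b  # shift left one byte, add the next
    return result

print(bytes_to_int([0, 0, 1, 12]))  # -> 268 (1 * 256 + 12)
```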
Not exactly, but we put a quick temporary solution in to help people out until we do it right. At least you don’t have to convert that array yourself!
I have a light-dependent resistor (LDR) set up on my Core. When I use cURL to read the LDR from the REST API, I get the following result. Is this what I should expect? What format is the result field, and how would I parse it?
I get something similar when I pull the value from my analog thermistor. I simply grab allTypes.number and use that.
{
  "cmd": "VarReturn",
  "name": "temp_raw",
  "allTypes": {
    "string": "\u0000\u0000\bL",
    "uint32": 2124,
    "number": 2124,
    "double": null,
    "raw": "\u0000\u0000\bL"
  },
  "result": "\u0000\u0000\bL",
  "coreInfo": {
    "last_app": "foo",
    "last_heard": "2013-12-22T20:42:31.590Z",
    "connected": false,
    "deviceID": "50ff71065067545628110287"
  }
}
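As a sanity check, the result string in that response is the same four big-endian bytes (0x00 0x00 0x08 0x4C = 2124) encoded as a JSON string. Grabbing allTypes.number is the easy route, but if you wanted to decode the raw field yourself, it might look like this Python sketch (abbreviated response for illustration):

```python
import json
import struct

response = json.loads(r'''{
  "name": "temp_raw",
  "allTypes": {"number": 2124, "raw": "\u0000\u0000\bL"},
  "result": "\u0000\u0000\bL"
}''')

# Each character of the result string is one byte; unpack as a big-endian int32.
raw_bytes = response["result"].encode("latin-1")
value = struct.unpack(">i", raw_bytes)[0]
print(value)                           # -> 2124
print(response["allTypes"]["number"])  # -> 2124 (same value, pre-decoded)
```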
Okay. Good to know at least that what was returned seems legitimate. I’ll have to review the docs for the LDR to make some sense of the integer returned. I had expected a number between 0 and 1023, whereas the values in the allTypes.number field are in the 1000 to 3500 range.
I believe the numbers returned will be in the range of 0 - 4095 (4,096 possible values from a 12-bit ADC, versus the 1,024 values you may be accustomed to on an Arduino). The Spark has higher-precision analog inputs and runs at 3.3 V as well, so a voltage difference could also play into some calculations, though I'm not sure exactly how off the top of my head. If you want to stick with 0 - 1023, you can use the map() function like this: map(your_variable, 0, 4095, 0, 1023).
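For reference, map() is just a linear rescale. If you're post-processing readings on the API side rather than in firmware, the same integer arithmetic in Python would be (helper name is mine; valid for positive ranges):

```python
def map_range(x, in_min, in_max, out_min, out_max):
    """Linearly rescale x from [in_min, in_max] to [out_min, out_max],
    using integer math like Arduino/Wiring's map()."""
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

print(map_range(4095, 0, 4095, 0, 1023))  # -> 1023 (full scale)
print(map_range(0, 0, 4095, 0, 1023))     # -> 0
```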
Ahh, that makes sense. I didn't realize that the Spark had higher-precision analog inputs. I had suspected the 3.3 V might play a part in the variance, but I don't know enough about electrical engineering to understand why or how. Interestingly, I had to invert the map() function: my LDR readings were increasing with less light. Not sure why that would be.
And just to be extra clear @benddennis and @wgbartley — that allTypes key in the response will go away when we fix this problem for real, so don't depend on it long term. We knew the real fix would take us a while, and it was easy to add a little help for people in the short term. In fact, just in case, I plan to rename that key to something like TEMPORARY_allTypes for those who don't read the forum.