[Resolved] Exposed function receiving the wrong values?

Hello,
I have three exposed spark functions and am using python request to perform REST calls. Two of the three functions work fine; However, the third seems to be receiving the incorrect bytes (I am trying to send raw byte values as opposed to a String).

Here is the function setup:

  • Function definition:

      int setTag(String set);
    
  • Expose Call in setup():

      Spark.function("setTag", setTag);
    
  • Function:

      int setTag(String set)
      {
          uint8_t request[64];
          memset(request, 0x00, 64);             // clear the buffer
          set.toCharArray((char *)request, 64);  // copy the argument in

          // Do stuff with received information

          return 0;
      }
    

My request usage is as follows:

head = {'Authorization': 'Bearer <access token>'}
setTag = 'https://api.spark.io/v1/devices/<device ID>/setTag'
setReq = {'args=': b'\x01\x01\x01\xa6\xb8\xe1\x2c\xd1\x81\x01\x0f\xc3\x85\x14\x97\x5d\x60\x27\x12'}
r = requests.post(setTag, data=setReq, headers=head)
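For anyone debugging something similar: you can preview the exact body requests builds from a `data=` dict by form-encoding it yourself with the standard library (a shortened placeholder payload is used here for brevity; requests applies the same percent-encoding to byte values):

```python
from urllib.parse import urlencode

# Preview the form-encoded body requests would send for a bytes value.
body = urlencode({'args': b'\x01\xa6'})
print(body)  # args=%01%A6
```

This is only a way to see what leaves your machine; it doesn't show what the cloud does with the argument afterwards.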

Two of the three exposed functions receive the correct bytes; however, the third (the one listed above) seems to get miscellaneous information (starting at the first byte). Is there some exposed-function restriction that I am not taking into account?

The message I send totals no more than fifty-three bytes, which should be fine since the limit is sixty-four bytes, right?

The other two functions that receive information only use three bytes.

Any help is appreciated,
Thanks!

What happens if you only send three bytes to the 3rd function? Does it work then?

It seems to work up to three bytes. After three I get the miscellaneous bytes (it effectively stops working at byte \xa6).

Edit: And in fact, if I replace \xa6 with \x01, it works fine… Maybe there is an issue with processing bytes that fall outside the ASCII range?

I can’t see a return statement in your function.

What value do you expect to be returned by this function if you don’t call return x;?

Edit: I’ve just seen you edited the topic from “Exposed function returns the wrong values” to “… receives …” :wink:

Sorry, I just left it out when I copy/pasted. I edited an equivalent return value into my original post.

I don’t know, but just for clarity could you add a '\0' at the end of your data?

After doing some searching, it seems that the byte string is being encoded as (most likely) UTF-8, which can only represent values below 128 in a single byte.

From Python 2.7.9 docs:

UTF-8 uses the following rules:
1) If the code point is <128, it’s represented by the corresponding byte value.
2) If the code point is between 128 and 0x7ff, it’s turned into two byte values between 128 and 255.
3) Code points >0x7ff are turned into three- or four-byte sequences, where each byte of the sequence is between 128 and 255.

Does anyone know of a way to get around this? Maybe a different encoding? Admittedly I don’t know much about string encoding or REST.
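This matches the symptom in the thread exactly: \x01 is below 128 and passes through, while \xa6 is not. A quick check (Python 3 syntax here, though the encoding rules are identical in Python 2's unicode handling):

```python
# Code points below 0x80 survive UTF-8 encoding as a single byte;
# anything at or above 0x80 expands to a multi-byte sequence.
low = chr(0x01).encode('utf-8')
high = chr(0xa6).encode('utf-8')
print(low)   # b'\x01'      -- unchanged
print(high)  # b'\xc2\xa6'  -- two bytes now
```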

Would you mind posting your Python code then? It sounds like that’s where we need to start looking for a solution.

Not at all! Thanks for the help.

import requests
import json
import time


def main():
    #Generic info
    head = {'Authorization': 'Bearer <access token>'}
    url = 'https://api.spark.io/v1/devices/<device ID>/'

    #getTag request
    getTag = url+'getTag'
    getReq = {'args=': b'\x01\x01\x01'}
    r = requests.post(getTag, data=getReq, headers=head)
    print r.text

    time.sleep(1)

    #set tag info
    setTag = url+'setTag'
    setReq = {'args=': b'\x01\x01\x01\xa6\xb8\xe1\x2c\xd1\x81\x01\x0f\xc3\x85\x14\x97\x5d\x60\x27\x12'}
    r = requests.post(setTag, data=setReq, headers=head)
    print r.text

    time.sleep(1)

    #getTag request
    r = requests.post(getTag, data=getReq, headers=head)
    print r.text

if __name__ == '__main__':
    main()

Edit: Sorry, I had some miscellaneous stuff in there that I was trying out, which I removed.

I looked through the requests module and I’m not sure whether what you want to do is doable with requests. Have you considered using TCP directly? Here’s a useful resource I’ve used before: http://pymotw.com/2/socket/tcp.html
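If you go the raw-socket route, a minimal client sketch might look like this (the host and port in the example call are placeholders for whatever the Core's TCPServer is configured with; this is an untested sketch, not a drop-in implementation):

```python
import socket

def send_raw(host, port, payload):
    """Open a TCP connection and send payload byte-for-byte.

    Unlike the cloud function call, nothing along this path re-encodes
    the data, so bytes >= 0x80 reach the Core untouched.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((host, port))
    try:
        sock.sendall(payload)
    finally:
        sock.close()

# Example call -- IP and port are placeholders for the Core's
# TCPServer settings:
# send_raw('192.168.1.50', 5000, b'\x01\x01\x01\xa6\xb8\xe1')
```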

To get around the issue I started sending a character string and decoding the characters once received on the Spark Core. However, I didn’t realize you could run TCP servers on the Spark Core; I think that would be preferable to my method. Thanks!
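For anyone landing here later, the character-string workaround might look something like this on the Python side (a sketch; hex-encoding is one way to keep every character below 128, though it doubles the payload length against the 64-character argument limit, and the Core would need matching hex-decoding in C):

```python
import binascii

# Encode the raw payload as an ASCII hex string so every character is
# below 128 and survives the string-based function-call path.
payload = b'\x01\x01\x01\xa6\xb8\xe1'
arg = binascii.hexlify(payload).decode('ascii')
print(arg)  # 010101a6b8e1

# Round trip -- the Core would do the equivalent when parsing the arg:
assert binascii.unhexlify(arg) == payload
```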

Just as an update: TCP server worked perfectly. Thanks for the suggestion, I didn’t even realize the Spark Core was capable of doing it.

It really is a great device. And the staff are extremely quick and thorough, thanks again!
