[SOLVED] Problem POSTing from Python

I started running into the same problem yesterday. I haven’t had time to do more testing, but it initially seems like Spark.function() stops working if you are using Spark.publish(). I couldn’t even get Spark.function() to work after commenting out the Spark.publish() stuff, but that’s about as far as I got. If I get some down time at work today, I will investigate further.

I don’t think they are necessarily related at all. I’m getting the publish SSE messages just fine. The POST simply isn’t working for some reason. It works with curl, so I think it’s something dumb in my code.

My Python isn’t the best, but I’d suggest using something like RequestBin instead of the Spark API endpoint. Make the same request from curl as you’re making from Python, and compare the two for differences. That might help you track down the issue.
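
A minimal sketch of the Python side of that comparison, assuming the requests module and a hypothetical RequestBin URL:

import requests

# Hypothetical inspection URL: substitute your own RequestBin (or similar) bin.
INSPECT_URL = "http://requestb.in/your-bin-id"

# Send the same form-encoded POST you would send to the Spark API, then make the
# equivalent curl -d request to the same bin and compare the two captured bodies.
resp = requests.post(INSPECT_URL, data={
    "access_token": "<TOK>",
    "args": "Playa Guiones",
})
print(resp.status_code)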

Thanks, I’ll give it a shot.

Hey Guys!

We’ve had some reports that Python and the CLI might both url-encode the arguments POSTed to your Spark function. So instead of “Playa Guiones” you might get “Playa%20Guiones”. I need to look into this and see if it makes sense to urldecode these parameters or what. I think generally speaking variables passed via a POST shouldn’t be encoded, but I could be wrong about that. Worth checking, or piping the arguments you receive in your function to serial.

Thanks!
David

Thanks Dave. The problem is that, according to the serial output on my Core, I’m not receiving anything but an empty string when I hit the URL with Python, but I do receive URL-encoded data (that I decode on the Core) when I hit it with curl. Something on the server side is eating the value when I hit it with Python.

Hmm… How long is the string you’re sending in the function call?

Just to check, was it working before and now it’s not? Have you changed your firmware or your Python script significantly? I don’t believe we’ve made any significant changes to the function call endpoints, so I’d want to start looking at your firmware and script first. Maybe try with something like Tinker and make sure you can call those functions, or try watching the request with something like Wireshark to make sure the parameters are coming through okay?

For reference, here are some other Python POSTing examples:

https://github.com/Alidron/spyrk/blob/master/spark_cloud.py#L59-L61
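
As a sketch of that kind of call, assuming the requests module and the <ID>/<TOK> placeholders and /surf function that appear later in this thread:

import requests

# Form-encoded function call, equivalent to: curl -d access_token=... -d args=...
url = "https://api.spark.io/v1/devices/<ID>/surf"
payload = {
    "access_token": "<TOK>",
    "args": "Playa Guiones/! 2-3ft ** 10mph ENE/! 2-3ft ** 9mph ENE",
}
resp = requests.post(url, data=payload)
print(resp.status_code, resp.text)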

Thanks,
David

Just to chip in and give some input: function calls from Python are working for me.

In the other thread, I gave @Dave an example of the Python code I use to call functions, which uses the requests module.

Spark Code:

int LED = D7;                // onboard LED on the Core
int timing = 500;            // blink delay in milliseconds
int blinky(String command);  // forward declaration so Spark.function() can register it

void setup() {
    Spark.function("blink", blinky);   // exposed as POST .../<device id>/blink
    pinMode(LED, OUTPUT);
    Serial.begin(9600);
}

void loop() {
    digitalWrite(LED, HIGH);
    delay(timing);
    digitalWrite(LED, LOW);
    delay(timing);
}

int blinky(String command) {
    Serial.println(command);   // print the raw args string the cloud delivered
    if (command == "fast") {
        timing = 50;
        return 50;
    }
    else if (command == "slow") {
        timing = 500;
        return 500;
    }
    else if (command == "hello%20there") {   // "hello there" arrives URL-encoded
        timing = 200;
        return 200;
    }
    else
        return -1;
}

Just did a fresh test and things worked well. At least the basic function call is fine with simple args like “slow”, “fast”, and “hello there”.

Hope this helps!
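
Something like this sketch (requests module, placeholder device ID and access token) exercises the function above with each of those args:

import requests

# Call the "blink" function registered above; the Core prints the raw command
# it receives over serial, which makes any encoding surprises easy to spot.
url = "https://api.spark.io/v1/devices/<ID>/blink"
for arg in ("fast", "slow", "hello there"):
    resp = requests.post(url, data={"access_token": "<TOK>", "args": arg})
    print(arg, "->", resp.status_code, resp.text)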

It depends on the content-type specified in the headers. Check out this post from StackOverflow. I know it's not official W3C/IETF/RFC documentation, but I do think it's generally correct.
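
For example, with the requests module the encoding follows from how you pass the body; this sketch (not specific to the Spark API) shows the difference:

import requests

# Form data: requests sets application/x-www-form-urlencoded and encodes the value.
form = requests.Request("POST", "http://example.com/",
                        data={"args": "Playa Guiones"}).prepare()
print(form.headers["Content-Type"])  # application/x-www-form-urlencoded
print(form.body)                     # args=Playa+Guiones

# JSON body: requests sets application/json and leaves the string alone.
js = requests.Request("POST", "http://example.com/",
                      json={"args": "Playa Guiones"}).prepare()
print(js.headers["Content-Type"])    # application/json
print(js.body)                       # {"args": "Playa Guiones"}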

If you look at the code in the gist, I’m doing what @kennethlimcp is doing. At least it appears so.

How many arguments are you passing, and are there spaces in some of them, like “run faster”?

I can try testing something as close as possible to what you are doing.

Maybe the issue is having the SSE code in there, like the problem @wgbartley reported.

@huslage My first guess is that your args string is too long; 63 chars is the max by default. Try sending something really short like “test” and let us know whether that works.

Thanks @zachary. It does appear to be something related to content length, actually. The args string:

Playa Guiones/! 2-3ft ** 10mph ENE/! 2-3ft ** 9mph ENE

is 54 characters long, which is under the 64-character limit, but apparently I’m missing something there. If I send “blah” from Python, it works fine.

The 64-character limit seems rather arcane. Surely we could increase the buffer size somehow? Or is this limitation artificially imposed?

After it’s URL-encoded, the length would go over 63. The limit is there to conserve memory on the Core, since the most common use-case messages are a single number or word, e.g. a hex RGB color code or a 0-255 value to set a servo to, something of that nature.
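
As a rough check (a sketch using Python’s standard library; the exact growth depends on which characters your client escapes):

from urllib.parse import quote

args = "Playa Guiones/! 2-3ft ** 10mph ENE/! 2-3ft ** 9mph ENE"
encoded = quote(args, safe="")  # spaces, '!', '*' and '/' all become %XX escapes
print(len(args), len(encoded))  # 54 raw characters, 88 once fully percent-encoded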

If you look around the community, you’ll see lots of complaints that we’re already using too much memory. :smile: Sixty-four bytes is our “works for most people most of the time” balance choice.

If you’re building locally, though, you can absolutely change the limit; it’s defined here as USER_FUNC_ARG_LENGTH:

https://github.com/spark/core-firmware/blob/master/inc/spark_utilities.h#L50

Something else is going on here.

I set USER_FUNC_ARG_LENGTH to 92, and the board just flashes cyan really fast when I send it the exact same data. Runscope indicates that the payload length is 59, so I shouldn’t be hitting any limits anyway.

I ran another test:

 curl -d access_token="<TOK>" -d args="123456789012345678901234567890123456789012345678901234567890123" -H "Accept-Encoding: gzip, deflate, compress" https://api.spark.io/v1/devices/<ID>/surf

This works.

curl -d access_token="<TOK>" -d args="1234567890123456789012345678901234567890123456789012345678901234" -H "Accept-Encoding: gzip, deflate, compress" https://api.spark.io/v1/devices/<ID>/surf

This doesn’t. I’ve flashed the new firmware with this set:

 $ grep USER_FUNC_ARG ../inc/spark_utilities.h
 #define USER_FUNC_ARG_LENGTH			92

So something else is governing my limit and causing the thing to crash at 54 characters. I’m not sure how to debug this further, why the limit is 10 characters shorter than the default, or why my new parameter isn’t taking effect.

Here is my edited code: https://github.com/huslage/core-firmware/tree/surfsoon
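
One way to pin down the effective limit empirically (a sketch, assuming the requests module and the same /surf function and placeholders as above):

import requests

# Step the args length up until the call stops succeeding; the last length
# that works is the effective limit as seen from the client side.
url = "https://api.spark.io/v1/devices/<ID>/surf"
for n in range(50, 100):
    resp = requests.post(url, data={"access_token": "<TOK>", "args": "x" * n})
    print(n, resp.status_code, resp.text[:80])
    if resp.status_code != 200:
        break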

Maybe @zachary can chime in here, but I would think that you can’t just set the Core’s length to be longer; you would have to set both the Core’s and the cloud’s lengths to be longer to get it to work.

Your POST is still going through the cloud, which seems to assume your Core has the old limit; that’s a good, safe practice to avoid overflowing the Core’s buffer.

Thanks @bko

How do I know what the valid upper limit is for my application?

All I’m doing is receiving some data and putting it onto an OLED screen, then every once in a while publishing to the cloud to trigger new data to be sent to the screen again.

OK, so the limit is that 63 chars works and 64 doesn’t, even after you bump USER_FUNC_ARG_LENGTH up to 92. My bad; I said above that this would work. @bko, I know the Cloud doesn’t care about the length, but the communication library might. I’ll double check…

Yup, you have to bump up the max length in core-communication-lib/src/spark_protocol.h as well:

It’s checked here, when a function call is received:

Great. That seems to have fixed it. I’m at 82 chars now. Still not sure where the missing 10 are, but it’s probably just URL-encoding junk.

Thanks everyone.
