Spark.variable() STRINGS larger than 9 chars issue [SOLVED]

I would have to look this up, but I think long-ish payloads might require an extra CoAP packet; I’m not 100% sure. I’m testing now to see how long the strings can be, and it looks like the limit is between 230 and 240.

2 Likes

My guess was purely based on the 640-byte queue I saw mentioned somewhere, but even 160 is longer than a tweet, and with neatly formatted data it’s quite long.

1 Like

Awesome, guys! I looked at the code… I can’t quite follow how the queue[] works just from the diff, but great job @weakset :slight_smile:

2 Likes

The ~240 limit does make sense, because I realized the queue is an array of unsigned chars. So even if the queue could contain a longer string, the size indicator in it can’t be more than 255 (and the queue has some headers, like the size indicator itself).

I did the calculations, and the limit should be 239:
(239 & ~15) + 16 = 240 (still room for headers).
(240 & ~15) + 16 = 256 (already over 255 without headers).

I missed a spot: there was also that 6-char header, so the limit is 233. This should truncate the string so the API won’t time out anymore.
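For anyone following along, here is a minimal sketch of that rounding arithmetic as I read it from the posts above (the 6-byte header and the pad-to-16-bytes rule are taken from this thread, not from the firmware source, so treat them as assumptions):

void setup() {
    Serial.begin(9600);
    const int headerBytes = 6;  // the 6-char header mentioned above (assumed)
    for (int len = 230; len <= 236; len++) {
        // the formula from the posts above: pad (length + header) to the next 16-byte block
        int padded = ((len + headerBytes) & ~15) + 16;
        Serial.print(len);
        Serial.print(" -> ");
        Serial.print(padded);
        Serial.println(padded <= 255 ? "  fits the one-byte size indicator" : "  over 255");
    }
}

void loop() {
}

If that reading is right, it reproduces the 233 limit: 233 + 6 pads to 240, while 234 + 6 pads to 256, which no longer fits in a single unsigned char.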

2 Likes

Did the support for “Strings as variables greater than 9 chars” ever get resolved? I’m still seeing weird String values for variables returned from the core…

I’d love to be able to return JSON strings, even if they’re restricted to just over 200 chars.

1 Like

Hi @Dom,

Yes, thanks to some awesome pull requests!

char *bigString = "This is a really really really really really really really really really really really really really really really really really really really really really really really really really really really really long string";

void setup() {
    Spark.variable("bigString", bigString, STRING);
}

Just to be clear: the variable must be a char array (char[]), not a String object.
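If your data starts out in a String object, one workaround (just a sketch, assuming the Wiring String class’s toCharArray() is available in your firmware version; the names here are made up) is to register a char buffer and copy the String into it whenever it changes:

char statusBuf[64];              // this buffer is what the cloud actually reads
String status = "starting";      // working String, never passed to Spark.variable()

void setup() {
    Spark.variable("status", statusBuf, STRING);
}

void loop() {
    status = "running";
    // copy the String's contents into the registered buffer
    status.toCharArray(statusBuf, sizeof(statusBuf));
    delay(1000);
}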

1 Like

Topic updated: appended [SOLVED]

Just a question… shouldn’t it be possible to make it work with String objects as well? That would make Spark.variable() of type STRING make more logical sense and be more universal.

1 Like

Thanks for updating the topic!

Yes, I agree. I have a bug in the queue for making it work with String objects as well, since that definitely tripped me up at first when I tried it. :confused:

1 Like

A few hours ago @zachary made an update to master that supports strings up to the full queue length. So the “final” limit now seems to be 622 characters, and it won’t time out if you try something longer (it just cuts the string off at the limit).

The first byte of the queue also turned out to be part of the size variable, not just the second byte that I had figured out in my version.
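In other words, as I read it, the length now spans the first two bytes of the queue instead of one. A minimal sketch of that idea (the byte order and buffer size here are guesses, not the real firmware layout):

unsigned char queue[644];            // placeholder size, not the real firmware constant

void setQueueSize(unsigned int len) {
    queue[0] = (len >> 8) & 0xFF;    // high byte of the length
    queue[1] = len & 0xFF;           // low byte of the length
}

unsigned int getQueueSize() {
    return ((unsigned int)queue[0] << 8) | queue[1];
}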

1 Like

Is it still working? With this code:

char* ledStat = "hello there!";
void setup() {
  Spark.variable("ledStatus", &ledStat, STRING);
}

I get:

{
  "cmd": "VarReturn",
  "name": "ledStatus",
  "result": "\u0016�\u0001\b\b",
  "coreInfo": {
    "last_app": "",
    "last_heard": "2014-03-07T15:04:24.068Z",
    "connected": true,
    "deviceID": "xxx"
  }
}

If that is expected, how do I parse the string via JavaScript?

Ah, I think the problem is that ledStat is already a pointer; try it without the “&”.
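In other words, the registration would look like this (same names as in the snippet above):

char* ledStat = "hello there!";

void setup() {
  // ledStat is already a char*, so pass it directly rather than &ledStat
  Spark.variable("ledStatus", ledStat, STRING);
}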

That worked. Thank you @Dave! :smile:

1 Like

Why wouldn’t my published variable change if it is changing in the application?

I have a bit of code (abridged)

char *stsMessage = "12312341234123412341234123412341234123412341234123412341234";//[40] = "";

/* This function is called once at start up ----------------------------------*/

void setup()
{
  //Expose variables to the cloud
  Spark.variable("strReceived", inSerial, STRING);
  Spark.variable("statResp", stsMessage, STRING);
......
}

In my application, I say

int DisplayMessage (void){
                            Serial.println("begintodisplaymessage");
....
                    switch (stsArVal[1]) {
                        case 1:
                          //do something when var equals 1
                          stsMessage= "1= WAITING TO SAMPLE";
                          break;
...
}

When I print to serial, the variable is updated. When I go to another function and print the same variable to serial (which should make any scope problems a non-issue, I think), I also get the proper response.

You see, stsMessage is getting overwritten with new information. This is properly conveyed to the serial port but not to the cloud.
Incidentally, strReceived is getting properly updated. Is this an issue with using char*?

When I call this variable from the cloud, it is not updated.

Hi @ruben,

I think you’re running into a fun C quirk with pointers, yay pointers! :slight_smile:

When you’re setting up a Spark.variable, you’re saying you have something interesting at a particular address in memory, in this case the address of “stsMessage” during setup. When you assign it a new value in the loop, instead of copying your new information into that same spot in memory, you’re pointing stsMessage somewhere else, in this case to the address of “WAITING TO SAMPLE”. I think if you used strcpy, or something similar, and initialized stsMessage to a larger array for strings, you’d be golden, e.g.:

char stsMessage[64];                 // fixed buffer; the cloud keeps reading from this address
void setup() {
    Spark.variable("statResp", stsMessage, STRING);
}
void loop() {
   strcpy(stsMessage, "Hey hey");    // copy new text into the buffer instead of re-pointing stsMessage
}

I hope that helps! :slight_smile:

Thanks,
David

1 Like

That is indeed what I ended up doing. I’ll post the final version to GitHub and update with the link for posterity.
Thank you.

1 Like

Hi!
I believe I have the same issue. If I understand correctly, I should copy using “strcpy” before I sprintf, correct? Would the same thing apply to Spark.publish?
Also, does the “sts” do something, such as replace the “&”?
Thanks!

Hi @Dup,

The “sts” in this case is just his naming convention; it’s not needed. If you’re using sprintf, then sprintf is printing into that character array, and you don’t need to do a strcpy. :smile:

This also doesn’t apply to Spark.publish calls, since it just pulls whatever is there at the moment you call that command, instead of referring to an address you pointed to earlier and then changed.
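For example, a quick sketch along those lines (the counter and the event name “statusEvent” are made up for the example):

char statResp[64];
int sampleCount = 0;

void setup() {
    Spark.variable("statResp", statResp, STRING);
}

void loop() {
    sampleCount++;
    // sprintf writes straight into the registered buffer, so no strcpy is needed
    sprintf(statResp, "samples taken: %d", sampleCount);
    // Spark.publish reads whatever is in the buffer at the moment it is called
    Spark.publish("statusEvent", statResp);
    delay(10000);
}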

Thanks,
David

1 Like

thanks!

My code seems to be working and the LED is breathing cyan, but it seems to lose connectivity to the cloud.

I will keep troubleshooting!

Dup

1 Like