Spark.Publish max length (63 characters)

Hi,

I am wondering what the maximum length of a Spark.publish() is. The docs say an optional 63 bytes of data, but in the example bko posted he uses a char array of length 40, so I am confused. What is the maximum length of the char publishstring[]?

Hi @TheHawk1337

The limit is 63 real characters (64 bytes of total string length, since a C string is NULL-terminated). I just used 40 because I counted and knew what I was sending would fit.

[edit] Publish actually truncates at 63 characters.
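
For example (just a sketch; the event name and value are placeholders), a buffer sized to match that limit would look like this:

char publishString[64];                      // 63 characters of data plus the terminating '\0'

void setup()
{
}

void loop()
{
  int reading = 42;                          // made-up value, just for illustration
  snprintf(publishString, sizeof(publishString), "reading=%d", reading);
  Spark.publish("my_event", publishString);  // "my_event" is a placeholder event name
  delay(1000);                               // stay within the 1-publish-per-second rate limit
}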


Ah perfect, thanks! :smile:


If you need to send longer messages, you can do it. You’ll have to build the Spark firmware locally (see https://github.com/spark/core-firmware/ for instructions).
Then in Spark_firmware_source/core_communications_lib/src/events.cpp, in the event() function, you can change the maximum length. The lines that copy the data are:

  if (NULL != data)
  {
    name_data_len = strnlen(data, 63);

    *p++ = 0xff;
    memcpy(p, data, name_data_len);
    p += name_data_len;
  }

The 63 above is the limit for data. This is eventually copied to an internal buffer, queue[640], so you can’t use 640 because of overhead. I’m running with

name_data_len = strnlen(data, 255);

without problems.

— Scott

Scott, I made the change to events.cpp that you mentioned.
Did a Make.
Flashed.

But I’m still getting just 63.

Because you can publish a burst of up to 4 events as long as you keep the overall rate at 1 per second, you can really send 63*4 = 252 characters fairly quickly with no modifications. So that is another simple thing you can try.
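
A rough sketch of that chunking approach (the event name is a placeholder, and your subscriber has to reassemble the pieces on the other end):

// Split a longer message into 63-character chunks and publish them as a burst
// of at most 4 events (252 characters total), then pause so the average rate
// stays at about 1 publish per second.
void publishLong(const char *msg)
{
  size_t len = strlen(msg);
  char chunk[64];                            // 63 characters plus the terminating '\0'

  for (size_t i = 0; i < len && i < 4 * 63; i += 63)
  {
    size_t n = len - i;
    if (n > 63)
      n = 63;
    memcpy(chunk, msg + i, n);
    chunk[n] = '\0';
    Spark.publish("long_message", chunk);    // "long_message" is a placeholder name
  }
  delay(4000);                               // settle back under the rate limit
}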

You do always have to compile locally and load the code over USB (with either dfu-util or the Spark CLI wrapper for it) to make the above change, since the web IDE won’t know about the changes to events.cpp.

Finally, maybe @Dave can comment on the cloud side; I did not think there was a 63-character limit in the cloud software. If there is, then running a local cloud might help.

Heya Gang,

The cloud doesn’t really care if your published event is longer than that limit, within reason.

The current limit has more to do with the formation of the CoAP messages coming from the communications library. We had to make a similar expansion a while back for Spark.variables when we found that the initial limit was too small. We’ve been chatting about potentially expanding the max size of a published event topic and its contents, but the firmware team has been focusing on a hardware abstraction layer recently. I’m sure pull requests would be welcome in the meantime! :slight_smile:

Thanks,
David


Thanks, folks. @bko I did compile locally and load the code with dfu-util, so the limit must be on the cloud side. I’ll have to think about the workarounds; I’m not sure they are going to work for us at scale.

Heya @dloop,

The limit isn’t on the cloud side. :slight_smile: It’s not about the memcpy, it’s about the size of the CoAP message.

Thanks,
David


Thanks @Dave for clarifying that… I’m new to much of this, so please take my questions/comments with that in mind…

I don’t believe CoAP has a 63- or 64-byte limit. So is 63 just what Spark has decided to implement at this stage, something that could be increased when there are resources available to work on it later, perhaps?

Heya!

The limit just has to do with the length of the CoAP packet; it was easy to implement initially because it fit in one packet :slight_smile: … We’re talking about if / when we should increase it, but that might be a little down the road. If someone felt like patching it in the meantime, that’s always cool too. :slight_smile:

Thanks!
David

Recently, for one of my weekend projects, I looked at increasing the number. It seems that we can increase it up to 320 characters; after that it gets truncated.

Hi.

I have created a webhook which sends XML data to login.salesforce.com, but the Spark.publish command prevents me from sending more than 63 characters. I send the data at 1-second intervals. I opened my dashboard to watch the data being sent, but no hook-response was delivered there. My JSON file is below.

{
    "eventName": "rest_login",
    "url": "login.salesforce.com/services/Soap/u/33.0",
    "requestType": "POST",
    "headers": {
        "Content-Type": "text/xml; charset=UTF-8",
        "SOAPAction": "login",
        "Accept": "text/xml"
    }
}

The Salesforce login process uses a SOAP action, and I prefer to publish the data to the cloud with this method.

const String Xml_Mark[] = {
  "<?xml version=\"1.0\" encoding=\"utf-8\" ?>",
  "<env:Envelope xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\"",
  "xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"",
  "xmlns:env=\"http://schemas.xmlsoap.org/soap/envelope/\">",
  "<env:Body><n1:login xmlns:n1=\"urn:partner.soap.sforce.com\">",
  "<n1:username>username</n1:username>",
  "<n1:password>password</n1:password>",
  "</n1:login></env:Body></env:Envelope>"};
int size=0;

void setup()
{
  Serial.begin(9600);
}

void loop()
{   
  while (size<8){
    Spark.publish("rest_login",Xml_Mark[size]);
    size++;
    delay(1000);
  }
  size=0;
  delay(5000);
}

Please help me. Thanks.

@emrebekar67, can you share your webhook code so we can help you better? The way you have it now, the webhook should fire 8 times, once with each publish!

What “data” do you actually need to pass to Salesforce versus a pre-formatted body?

I have sent it, @peekay123. I used XML markup in the Xml_Mark[] array. The size of this array is 200 characters. And I posted my webhook code in my question.

@emrebekar67, I’ll defer this to @bko or @Moors7 because they may have a better understanding of what you are trying to do.

Hi @emrebekar67

You made Xml_Mark a one-dimensional C char array with many bytes (200-ish, you say), but then you index into it with 0 to 7, which only gets one character per iteration.

I think you meant to declare Xml_Mark as a two-dimensional C array, with a first dimension of 8 and a second dimension that fits your longest string, or a max of 64.


No. I would like to send the XML tags to salesforce.com all at once; that means no array, just one block of tags. But because of the 63-character limit on data, I can’t send it all at once.

And I want to get a response, but when I sent the data, no hook-response was returned.

:blush: Just realized you're not using const char[] but const String[] :blush:

But if you're only using const String[], I'd recommend going with const char[] as shown below anyhow.


Original post:

@emrebekar67, I think @bko meant you should declare your XML array differently :wink:

More like this
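
Just a sketch, with each of the 8 strings capped at 63 characters plus the terminating '\0':

const char Xml_Mark[8][64] = {
  "<?xml version=\"1.0\" encoding=\"utf-8\" ?>",
  "<env:Envelope xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\"",
  "xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"",
  "xmlns:env=\"http://schemas.xmlsoap.org/soap/envelope/\">",
  "<env:Body><n1:login xmlns:n1=\"urn:partner.soap.sforce.com\">",
  "<n1:username>username</n1:username>",
  "<n1:password>password</n1:password>",
  "</n1:login></env:Body></env:Envelope>"
};

Each Xml_Mark[size] is then a complete tag string that Spark.publish() can send (and it would still be truncated at 63 characters if it were any longer).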

The way you're doing it, you'd publish this:

<?xml version="1.0" encoding="utf-8" ?>
?xml version="1.0" encoding="utf-8" ?>
xml version="1.0" encoding="utf-8" ?>
ml version="1.0" encoding="utf-8" ?>
l version="1.0" encoding="utf-8" ?>
 version="1.0" encoding="utf-8" ?>
version="1.0" encoding="utf-8" ?>
ersion="1.0" encoding="utf-8" ?>

I don't think this is what you intended.

And I'd rename size to something more meaningful for the way you're using it - like tagNr :wink:
