Sending a string, a big string, to Spark

Hi everyone!

When I try to send a big string (bigger than 63 characters) to the Spark over curl, the Spark seems to reset and nothing happens. Is this a limitation of the Spark? Is there any workaround?

I need to send the value of an HTML text field, which can be big. I could check on the JavaScript side whether the string is longer than 63 characters and make multiple Spark.function() calls, but I was wondering if I can handle this inside the Spark code instead.

thanks in advance,
J

It is mentioned in the docs:

A Spark function is set up to take one argument of the String datatype. This argument length is limited to a max of 63 characters.

http://docs.spark.io/firmware/#spark-function

You might need another approach instead :wink:

There is no limit on String size on the Spark itself, other than that it is easy to run out of memory if you are not careful. I believe the cloud trims any strings that are too long before sending them to your core. String operations are notorious for running you out of memory.

How big is your string? Does it fit in Spark memory?

I would try using two Spark functions: one for the data, 63 characters at a time, and a separate one that controls which 63-character chunk you are sending. This second Spark function could mean "increment to the next chunk", or you could have it take a numerical argument (converted from a String) that acts like a chunk address so you can go back to chunk zero easily. A rough sketch of the idea is below.
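Here is a minimal firmware sketch of that approach, assuming the assembled string fits in a fixed buffer on the Core. The function names ("chunk", "chunkCtrl"), the buffer size, and the "reset" command are placeholders I made up for illustration, not anything from the docs:

```cpp
#include "application.h"

#define MAX_LEN 512            // adjust to whatever RAM the Core can spare
char bigString[MAX_LEN + 1];   // the reassembled string
int  offset = 0;               // where the next chunk will be written

// Receives one chunk (<= 63 chars) and appends it at the current offset.
// Returns the total number of bytes stored so far, or -1 on overflow.
int receiveChunk(String piece)
{
    int len = piece.length();
    if (offset + len > MAX_LEN) return -1;
    piece.toCharArray(bigString + offset, len + 1);
    offset += len;
    return offset;
}

// Controls the chunk position: "reset" starts a new message, or pass a
// numeric byte offset (as a string) to jump back to a given position.
int chunkControl(String cmd)
{
    if (cmd == "reset") {
        offset = 0;
        bigString[0] = '\0';
        return 0;
    }
    int pos = cmd.toInt();
    if (pos < 0 || pos > MAX_LEN) return -1;
    offset = pos;
    return offset;
}

void setup()
{
    Spark.function("chunk", receiveChunk);
    Spark.function("chunkCtrl", chunkControl);
}

void loop()
{
}
```

On the JavaScript side you would split the text field value into 63-character pieces, call "chunkCtrl" with "reset" first, then call "chunk" once per piece; having each call return the running offset lets the client check that nothing was dropped. Accumulating into a fixed char buffer instead of concatenating String objects also avoids the memory problems mentioned above.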
