How do I send a hash character to Spark.function?

I can’t seem to send a hash (#) to a function defined on my Core. Even %23 comes through as nothing. The string is truncated immediately before the hash. Is there a nice workaround, or do I need to pass my own hash code and use String.replace()?

I remember other strange behaviour being reported here in the forum, in particular that a space " " becomes its HTTP representation “%20”. It sounds to me as if the issue you identify is part of the same problem, or a consequence of the fix for that problem. In any case, and pursuing my latest hobby horse, the treatment of this parameter could easily have been, and should now be, documented.

There is definitely some weird character interpretation going on. I tried out a few things this morning; maybe we can map it out and either get @Dave to explain it or put it in the docs.

You do have to be careful about escape codes at the source! Some characters were clearly being changed by the shell I was running curl in, and others, like &, are significant in the POST request format itself.

Hey All,

@zachary and I are aware of this problem. It has to do with our sending of function parameters as a url param in the function call request. CoAP doesn’t necessarily require you to url-escape url parameters, but our server library does anyway, so we’re debating the next best route. I’m guessing we will deploy a workaround for this during the next sprint, and look for a better solution long-term.

tl;dr: there should be a fix for this next sprint. :smile:



The quickest fix is to document the permitted characters.

Heya @psb777,

Sure thing! This topic needs more research on our end, but I suspect the library will heroically url-encode all characters except the following: alphabetic characters, decimal digits, and - _ . ! ~ * ' ( ).



@Dave - I imagined it had something to do with some automatic URL handling in Node.js that can be a pain to work around sometimes.

Heck, I was trying to review the code of PHPJS’s url_decode() function to see what URL character replacements it performs, but it turns out it simply relies on JavaScript’s built-in decodeURIComponent() function.

As for documentation, I think documenting the workarounds for the exceptions we know about, and the ones we discover, may be easier for users to sift through than listing the hundreds of acceptable characters. And, before you say it should be documented either way, I think the holdup is the rewrite of the existing spark-cli code (currently in progress). It doesn’t make sense to spend hours writing documentation for something that will change almost entirely (except for some backwards-compatibility exceptions) in the next few weeks.

Thanks @Dave !!

One particularly troublesome character is +, which ends up treated the same as %20, i.e. as a space.

I’m glad to hear you guys are already thinking about this!


Although documenting it would have been the quickest fix :slight_smile: , I also just pushed a fix for this to our staging cloud, so expect it to be included in our next cloud update (hopefully early next week).