Cloud Function Timing Out

Hello! I’m trying to call a basic cloud function I added to my INO file. For simplicity’s sake, I’ve commented everything else out of the code except the call to Spark.function() and the function itself, which simply returns 1. Following the exact curl command from the examples, I consistently get:

{ "ok": false, "error": "Timed out." }

I’m down to the most basic example I can manage. Any ideas on where I should start debugging? The command I’m running is curl https://api.spark.io/v1/devices/<my device ID>/<my function name> -d access_token=<my access token> -d "args=1".

Did you add the Spark.function() call in setup()?

Posting the code you are flashing to your core would be of great help. Would you mind doing so when you reply?

One other thing that comes up a lot: Spark.function names are limited to 12 characters and get silently truncated, so when you use curl to call the longer name you think you registered, you see this timeout error. You can use a GET request to find the actual name, or just check your code.
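For example (the name ledBrightness here is just made up for illustration), a registration like this ends up stored under a truncated key, so a curl call to the full name never matches anything and eventually times out:

```cpp
// Hypothetical sketch: "ledBrightness" is 13 characters, so the cloud
// only knows the function by a truncated 12-character key. Calling the
// full name via the API finds nothing and the request times out.
int ledBrightness(String args) {
    return 1;
}

void setup() {
    Spark.function("ledBrightness", ledBrightness);  // key gets truncated
}

void loop() {
}
```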


Ah - it turns out setBrightness is 13 characters, which is what did it.
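In case it helps anyone else who lands here, this is roughly the stripped-down sketch I was testing with and the rename that fixed it (the shorter key "setBright" is just what I happened to pick; anything 12 characters or fewer works):

```cpp
// Stripped-down test: a single cloud function that just returns 1.
int setBrightness(String args) {
    return 1;
}

void setup() {
    // Original registration: "setBrightness" is 13 characters, so the key
    // was truncated and my curl call to the full name timed out.
    // Spark.function("setBrightness", setBrightness);

    // Fixed: register under a key of 12 characters or fewer.
    Spark.function("setBright", setBrightness);
}

void loop() {
}
```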

A few recommendations:

  • You mention the 12-character limit on the Core Code (Firmware) page, but not on Cloud Code (API), which is where people will be when they’re actually trying to call a function. It would be great to get a mention of the limit added to the CONTROLLING A CORE section.
  • From a technical perspective, a 10-second timeout when a function name is not found is a relatively poor developer experience. You’re much more familiar with your systems than I am, but having the Core return a “Not Found” error would be extremely helpful. Alternatively, since the Core in question must be on the Internet to receive a function request in the first place, the cloud could maintain a server-side index of the functions each Core has registered and catch the bad name at the API layer.

Thanks for your help!

Glad it is working for you now.

Your other remarks are best directed to Spark staff – perhaps @Dave can capture this request for the cloud to improve.


Hey @mvd7793, you can always do a GET to https://api.spark.io/v1/devices/DEVICE_ID/?access_token=ACCESS_TOKEN to get a list of the active functions on a core.

I do agree, however, that a 10-second timeout is a little long. It would also be nice if the Web IDE checked Spark.function() calls for names longer than 12 characters.

@mvd7793, I have opened a pull request for the Spark docs (https://github.com/spark/docs/pull/250). Would that addition have helped you?
