Last Endpoint in the setup function not created [SOLVED]

I’m adding several endpoints in the setup function, and for some reason the last endpoint is not being registered.

Whichever endpoint comes last is never found, so when I call the function I get this message back:

{
  "ok": false,
  "error": "Function not found"
}

It does not matter which function it is; whichever one is registered last is the one that is not accessible.

Here’s a piece of the code:

// Includes
#include "application.h"

// Function prototypes
int sparkDigitalRead(String pin);
int sparkDigitalWrite(String command);
int sparkAnalogRead(String pin);
int sparkAnalogWrite(String command);
int sparkServoWrite(String command);
int dummyFunc(String dummy);

// Register cloud function endpoints
void setup()
{
    Spark.function("servowrite", sparkServoWrite);
    Spark.function("digitalread", sparkDigitalRead);
    Spark.function("analogwrite", sparkAnalogWrite);
    Spark.function("analogread", sparkAnalogRead);
    Spark.function("digitalwrite", sparkDigitalWrite);
}
...

Update 1:
This is what I get when I run:
https://api.spark.io/v1/devices/deviceid123456789

{
  "id": "53ff74065075535147141587",
  "name": "blacky",
  "connected": true,
  "variables": {},
  "functions": [
    "servowrite",
    "digitalread",
    "analogwrite",
    "analogread"
  ],
  "cc3000_patch_version": "1.28"
}

As you can see, the last registered function’s endpoint is missing.

Update 2: @Moors7 already pointed out the solution:

Well, that’s probably because it “only” supports 4 functions, as described in the docs here:
http://docs.spark.io/firmware/#data-and-control-spark-function

There are, however, alternatives to essentially get more functions through the use of one Spark.function. @luz used that to create parameters for his MessageTorch. I combined that with some jQuery stuff from one of @bko’s tutorials to create an LED strip controller. Feel free to take a look at it. Although it’s far from perfect, it gives you an idea as to how these parameters might work. I’ve tried to comment it somewhat, so hopefully it’s useful. You can find it here as a .rar file.

Good luck, and let us know if you need any more help. Until then, happy Sparking :wink:
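To illustrate the idea: a minimal sketch of packing parameters into a single Spark.function could look like the following. The "key=value" command format, the "setparam" name, and the pin choice are made up for illustration; this is not the actual MessageTorch or LED strip controller code.

// Includes
#include "application.h"

int brightness = 0;

// One cloud function, many parameters: the command string carries a
// "key=value" pair, e.g. "brightness=128".
int setParam(String command)
{
    int eq = command.indexOf('=');
    if (eq < 0) return -1;                       // malformed command

    String key   = command.substring(0, eq);
    String value = command.substring(eq + 1);

    if (key == "brightness") {
        brightness = constrain(value.toInt(), 0, 255);
        analogWrite(A1, brightness);             // A1 is PWM-capable
        return brightness;
    }
    // Additional keys can be handled here without
    // consuming more of the four function slots.
    return -1;                                   // unknown key
}

void setup()
{
    pinMode(A1, OUTPUT);
    Spark.function("setparam", setParam);        // uses only one slot
}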


You are right, that fixes the problem. Closing this issue. Thanks for the heads up!


For documentation’s sake: if you’re building locally, you could of course edit the number of Spark.functions available. This is currently limited to four because each of them takes up some precious memory. Because there are various alternatives you could use, it’s generally not necessary to increase this number of functions; I just wanted to point out that it’s possible if there’s a need for it.

I created a “function router” example to illustrate how you could combine multiple function calls into a single Spark.function().
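The example itself isn’t reproduced in this thread, but a minimal sketch of such a router might look like this. The "name,argument" command format and the handler names are illustrative assumptions, not necessarily how the linked example does it:

// Includes
#include "application.h"

// Individual handlers, no longer registered directly with the cloud
int doDigitalRead(String pin);
int doDigitalWrite(String command);

// Single registered function: routes "name,argument" commands to the
// matching handler, so all calls share one Spark.function slot.
int functionRouter(String command)
{
    int sep = command.indexOf(',');
    if (sep < 0) return -1;                      // expected "name,argument"

    String name = command.substring(0, sep);
    String arg  = command.substring(sep + 1);

    if (name == "digitalread")  return doDigitalRead(arg);
    if (name == "digitalwrite") return doDigitalWrite(arg);
    return -1;                                   // unknown function name
}

int doDigitalRead(String pin)                    // e.g. "7" for D7
{
    int p = pin.toInt();
    pinMode(p, INPUT_PULLDOWN);
    return digitalRead(p);
}

int doDigitalWrite(String command)               // e.g. "7=1" for D7 HIGH
{
    int eq = command.indexOf('=');
    if (eq < 0) return -1;
    int p = command.substring(0, eq).toInt();
    pinMode(p, OUTPUT);
    digitalWrite(p, command.substring(eq + 1).toInt() ? HIGH : LOW);
    return 1;
}

void setup()
{
    Spark.function("router", functionRouter);
}

A cloud call then targets the single registered endpoint, e.g. a POST to https://api.spark.io/v1/devices/<deviceid>/router with args=digitalwrite,7=1.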


Thanks a lot @wgbartley, this is pretty similar to what I did, but yours seems a bit more robust.

Since I only needed to differentiate between two calls, I ended up just passing an extra parameter in the string with the type of function (a sketch of that approach is below).

Thanks for the heads up.
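For completeness, a minimal sketch of that prefix approach could look like the following. The "servo:<angle>" and "led:<state>" commands and the pin choices are made up for illustration; this is not the actual code from this project:

// Includes
#include "application.h"

Servo myServo;

// One Spark.function, two call types, distinguished by a prefix in
// the command string: "servo:<angle>" or "led:<0|1>".
int dispatch(String command)
{
    if (command.startsWith("servo:")) {
        int angle = constrain(command.substring(6).toInt(), 0, 180);
        myServo.write(angle);
        return angle;
    }
    if (command.startsWith("led:")) {
        int state = command.substring(4).toInt();
        digitalWrite(D7, state ? HIGH : LOW);    // D7 onboard LED
        return state;
    }
    return -1;                                   // unrecognized prefix
}

void setup()
{
    pinMode(D7, OUTPUT);
    myServo.attach(A0);                          // PWM-capable pin
    Spark.function("dispatch", dispatch);
}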