Spark.function() Limits

Hi,

Currently, the Spark.function() call is limited to a maximum of 4 registered functions, according to the documentation:

http://docs.spark.io/#/firmware

Is there any plan to increase this slightly?

Thanks

@Carsten4207 Thanks for the question! I can’t comment specifically on our roadmap for this request, since I’m not on the software team and am not positive why it’s currently capped at 4 functions (I imagine it is due to an onboard memory/hardware constraint of some kind).

However, depending on what you are trying to accomplish, there are lots of ways to be creative about the number of “commands” you can pass through a single Spark.function. Since a Spark.function takes a string, you can parse that string inside one handler. For an RC car, for example, passing “forward”, “back”, “right”, or “left” through the same Spark.function can trigger a much larger number of responses, each defined as a separate function in the Core firmware.
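A minimal sketch of that idea might look like this (the "drive" name, the return codes, and the motor-control comments here are just placeholders for illustration):

int drive(String command);

void setup() {
    // One cloud function routes all four driving commands
    Spark.function("drive", drive);
}

void loop() {
    // Do nothing
}

int drive(String command) {
    command.trim();

    if (command == "forward")    { /* set motor pins for forward */ return 1; }
    else if (command == "back")  { /* set motor pins for reverse */ return 2; }
    else if (command == "right") { /* steer right */ return 3; }
    else if (command == "left")  { /* steer left */ return 4; }

    return -1; // unknown command
}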

Does that help, or at least make sense?

Will @ Spark

Yeah, makes sense. When I was reading the docs I must have missed that I can pass in parameters.

That’ll solve the issues with my project. 4 functions is fine for now!

Thanks
Carsten

This limitation is in place because of a limited amount of RAM. As we find ways to optimize RAM usage, we’ll try to open up more Spark.functions. Thanks for the feedback @Carsten4207, and good luck with your project!

I just implemented this suggestion with great success.

// Holds the last command
char lastCommand[64] = "NONE";

void setup() {
    // Expose fnRouter() to the cloud
    Spark.function("fnRouter", fnRouter);
    
    // Expose lastCommand to the cloud
    Spark.variable("lastCommand", &lastCommand, STRING);
}

void loop() {
    // Do nothing
}

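// Note: the sensor objects (dht, Thermistor, Photocell) and the oneTemp()
// helper used below are defined elsewhere in the full sketch; only the
// command router is shown here.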
int fnRouter(String command) {
    // Trim extra spaces
    command.trim();

    // Convert it to upper-case for easier matching
    command.toUpperCase();

    // Copy command argument to lastCommand
    command.toCharArray(lastCommand, 64);

    // "Route" the commands to the corresponding function
    if(command.equals("DHTHUMIDITY"))
        return dht.readHumidity()*100;

    else if(command.equals("DHTTEMP") || command.equals("DHTTEMPF"))
        return dht.readTemperature(true)*100;

    else if(command.equals("DHTTEMPC"))
        return dht.readTemperature(false)*100;

    else if(command.equals("THERMTEMPK"))
        return (Thermistor.getTempK()*100);

    else if(command.equals("THERMTEMPC"))
        return (Thermistor.getTempC()*100);

    else if(command.equals("THERMTEMP") || command.equals("THERMTEMPF"))
        return (Thermistor.getTempF(true)-5)*100;

    else if(command.equals("ONETEMPC"))
        return oneTemp()*100;

    else if(command.equals("ONETEMP") || command.equals("ONETEMPF"))
        return ((oneTemp() * 9.0) / 5.0 + 32.0)*100;

    else if(command.equals("PHOTOCELL"))        
        return (Photocell.getLight(true)*100);

    else if(command.equals("PHOTOCELLRAW"))
        return (Photocell.getLightRaw(true));

    else if(command.equals("SECONDS"))
        return millis()/1000;

    else if(command.equals("MILLIS"))
        return millis();

    else
        return -1000;
}


Glad to hear it!


I didn’t know Spark API functions were limited until I noticed my new functions weren’t showing up. Too bad.

Thanks for sharing your code, @wgbartley! I expanded on it to support passing along parameters. Since I’m not fluent in C and C++, it literally took me a few hours to get it right :slight_smile: Especially the splitting of the string. Jeez, compared to other languages where it takes just one line…

If anyone’s interested, here it is:

int func(String args);

////////////////////////////////////////////////

void setup() {
	Serial.begin(9600); // Needed for the Serial.print() debugging below
	Spark.function("func", func);
}

void loop() {
	//
}

////////////////////////////////////////////////

void splitArgStringToArray(String arguments, String *target) {
	Serial.println("[splitArgStringToArray]");

	int numArgs = 0;
	int beginIdx = 0;
	int idx = arguments.indexOf(";");

	while (idx != -1) {
		String arg = arguments.substring(beginIdx, idx);
		arg.trim();
		target[numArgs] = arg;

		beginIdx = idx + 1;
		idx = arguments.indexOf(";", beginIdx);
		++numArgs;
	}

	// Single or last parameter
	String lastArg = arguments.substring(beginIdx);
	lastArg.trim();
	target[numArgs] = lastArg;
}

////////////////////////////////////////////////

int func(String args){
	Serial.print("[func] with ");
	Serial.println(args);

	/* */
	String data[5]; // Increase to support more parameters
	splitArgStringToArray(args, data);

	// Print all arguments through the serial interface
	for (unsigned int i = 0; i < sizeof(data) / sizeof(String); i++) {
		if (data[i].length() > 0) Serial.println(data[i]);
	}

	/* */
	// First parameter is the command
	String command = data[0];
	command.toUpperCase();

	Serial.print("Command received: ");
	Serial.println(command);
	Serial.println();

	if(command.equals("CHECK_DHT11")){
		return checkDHT11Sensor();
	} else if(command.equals("TEMPERATURE")){
		return readTemperatureSensor();
	} else if(command.equals("HUMIDITY")){
		return readHumiditySensor();
	} else if(command.equals("MOISTURE")){
		return readMoistureSensor();
	} else if(command.equals("SLEEP")){
		return sleep(data[1].toInt());
	} else if(command.equals("SLEEP_DEEP")){
		return deepSleep(data[1].toInt());
	}

	/* */
	return -42;
}

Improvements are more than welcome!


There is an even more flexible way: you can use bits to store more than one command at a time, along with their parameters.

The format might look like this:

|      commands      |                  parameters                    |
|  149 (1+4+16+128)  | "HIGH", ,"24", , "something", , , "FORWARD", , |

Full text: "149'HIGH', ,'24', , 'something', , , 'FORWARD', ,"

149 DEC = 1 + 4 + 16 + 128 = 10010101 BIN

Some pseudocode:

enum CommandEnum {
   TEMPERATURE = 1,
   HUMIDITY = 2,
   CHECK_DHT11 = 4,
   SLEEP_DEEP = 8,
   LEDOPEN = 16,
   MOTORSPEED = 32,
   ETC1 = 64,
   ETC2 = 128,
   ETC3 = 256
};

int cmdInt = command.substring(0, command.indexOf('\'')).toInt(); // the command number runs up to the first ' character

if ((cmdInt & TEMPERATURE) == TEMPERATURE) {
   // Do something about TEMPERATURE with the first parameter
}
//EDIT: (no else)
if ((cmdInt & HUMIDITY) == HUMIDITY) {
   // Do something about HUMIDITY with the second parameter
}
....
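For reference, here is a minimal compilable sketch of that idea (the "bitmask first, then comma-separated parameters" format, the multiCommand name, and the return value are my own simplification, not exactly the layout above):

enum CommandFlag {
    TEMPERATURE = 1,
    HUMIDITY    = 2,
    CHECK_DHT11 = 4
};

int multiCommand(String args) {
    int flags = args.toInt(); // toInt() stops at the first non-digit, so "5,72,HIGH" yields 5
    int handled = 0;

    if (flags & TEMPERATURE) {
        // read the temperature sensor, using the first parameter if needed
        handled++;
    }
    if (flags & HUMIDITY) {
        // read the humidity sensor, using the second parameter if needed
        handled++;
    }
    if (flags & CHECK_DHT11) {
        // check the DHT11 status
        handled++;
    }

    return handled; // how many commands were recognized
}

void setup() {
    Spark.function("multiCmd", multiCommand);
}

void loop() {
}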

(It’s hard to write a post with this editor. Sorry for complaining.)

Hope it helps. :smiley:
Hasan


I’ll second Quagh’s comment: I did not know there were limits until some of my functions didn’t show up after flashing. I spent a bit too long trying to debug this, as I was using this reference for info on Spark functions instead of this one.

I get the difference between the two sets of docs now, but it wasn’t immediately clear. As the docs get refined, it’d be helpful to either 1) have all the relevant info in both places or, better yet, 2) cross-reference the two sections.

Hi @dan,

That does seem a bit confusing. I filed a bug to add a link between that example and the Spark.function docs. :slight_smile:

Thanks,
David


Hi,
Does Spark.publish have the same limits? My code seems to work with 4 of them.
Does that mean the limit is a combined total of 4 publishes and functions?
Thanks!

Spark.function is only limited because registering a function allocates some resources on the Core in case that function is called. Publish is rate-limited to about one per second for now, but there are no similar uniqueness limits on it: you can have an unlimited number of event names, channels, etc. :smile:
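For example, publishing many differently named events from the same sketch is fine (a trivial sketch; the event names and data here are arbitrary):

void loop() {
    // Any number of distinct event names is allowed; only the ~1/second rate matters
    Spark.publish("temperature", "72.5");
    delay(1000);

    Spark.publish("front-door-opened");
    delay(1000);
}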

Thanks!
David

Ok, thanks!
I will check things over!
Dup

I’ve been looking for something like this, but since I couldn’t find one, I built my own…
I’ve created a Multi-Function Function that includes a command and 2 parameters.
Located here https://github.com/MisterNetwork/MultiSparkFunction

Command string format is: function, value, switch (0/1) or parameter
xxxxx,nnn,nnn
For example:
NOTFY,000,1 (PushingBox notification - ON)
DREAD,A20,0 (digitalRead pin A2)
DWRIT,D20,1 (digitalWrite pin D2 - HIGH/on)
AWRIT,A20,255 (analogWrite pin A2 - 255)

Functions included:
DWRIT digitalWrite
DREAD digitalRead
AWRIT analogWrite
AREAD analogRead
NOTFY PushingBox Notification on/off
SLEEP Shuts off all processing

It also includes lots of debug Serial.print() calls, as I’m still learning the ins and outs of char/int/String syntax, etc.
:slight_smile:
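For anyone who just wants the gist without pulling the repo, the core idea looks roughly like this (a sketch of my own, not the actual library code; the pin decoding is left as comments):

int multiFunction(String cmd) {
    // Expected shape: "DWRIT,D20,1" -> 5-char function code, value, switch/parameter
    int firstComma  = cmd.indexOf(',');
    int secondComma = cmd.indexOf(',', firstComma + 1);
    if (firstComma < 0 || secondComma < 0) return -1;

    String function  = cmd.substring(0, firstComma);               // e.g. "DWRIT"
    String value     = cmd.substring(firstComma + 1, secondComma); // e.g. "D20"
    String parameter = cmd.substring(secondComma + 1);             // e.g. "1"

    if (function == "DWRIT") {
        // decode `value` into a pin, pinMode(pin, OUTPUT), then
        // digitalWrite(pin, parameter.toInt() ? HIGH : LOW)
        return 1;
    } else if (function == "DREAD") {
        // decode `value` into a pin and return digitalRead(pin)
        return 1;
    }
    // ...AWRIT, AREAD, NOTFY and SLEEP would be handled the same way
    return -1;
}

It would then be registered in setup() with a single Spark.function("multi", multiFunction) call, just like the examples above.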


Hi all
I also spent a lot of time trying to work out why my fifth function was not responding before stumbling upon the documentation stating that only four functions are supported.

To save other people falling into the same trap, can I suggest an improvement to the Spark compiler (www.spark.io/build)?
Specifically: when a program containing 5 or more calls to Spark.function() is compiled, the compiler could show a message like “Warning: the Spark Core supports 4 published functions, but your program contains more than 4 calls to Spark.function().”

This message would have saved me literally a day or more of debugging effort.

I have opened an issue here just to be sure it’s discussed and looked into so we can reach a conclusion :wink:


So @mdma gave an explanation, and this warning might not be feasible at compile time.

But I’m curious if the IDE can count for us! :smiley:

Any news on whether we will be allowed more functions on the Photon?

Hi @Rockvole,

I think at this point the only limit on how many functions / variables you can declare is RAM, so I would think you could have many more on the Photon! :slight_smile:

Thanks,
David


Cool - it would be nice to be able to return strings from functions. Are there any plans for that?