Best practice to pass multiple parameters to Spark.Function()

Dear all,

I want to pass multiple parameters from an HTML form to a single Spark cloud function.
Ideally each parameter would be a tuple of values (e.g. parameter name & value), so that later processing has high flexibility to react to a varying number of parameters.

As I understand it, the only way is to pass a single string to the function, with the values separated by a delimiter character.

Is there any best practice on how to parse this?
Is this even the right way to do it?

Depending on what you want to do, you could pass integers that each represent a certain function. So you could send 345, go through some if statements (if > 300 do this, then subtract 300, etc.) and execute your function (see the sketch below).

Or did I misunderstand you? :wink:
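To make that concrete, here's a minimal sketch of the integer-encoding idea; the groups, offsets, and the function name runCommand are made up for illustration:

// Hypothetical handler: the hundreds digit selects a command group,
// the remainder is the parameter. E.g. "345" -> group 3, parameter 45.
int runCommand(String args)
{
  int code = args.toInt();        // the cloud hands us a String; convert it
  if (code > 300)
  {
    int param = code - 300;       // strip the group offset: 345 -> 45
    // ...do the "group 3" action with param...
    return param;
  }
  else if (code > 200)
  {
    // ...do the "group 2" action with (code - 200)...
    return code - 200;
  }
  return -1;                      // unknown command group
}

// In setup():
//   Spark.function("runCommand", runCommand);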

I have a feeling he’s looking more into passing multiple parameters at once. Like for example color values, which could be “red:10,green:75,blue:255”. There are some examples of that floating around the forum, one of them being the FireTorch. Personally, I’ve put the arguments in a JSON style, since that’s relatively easy to parse on most platforms/languages.
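As a rough sketch of the delimited key:value approach on the Spark side (the function and key names here are just placeholders):

// Hypothetical parser for a string like "red:10,green:75,blue:255".
// Returns the integer after "key:", or -1 if the key is missing.
int valueFor(String args, String key)
{
  int start = args.indexOf(key + ":");
  if (start < 0) return -1;                 // key not present
  start += key.length() + 1;                // skip past "key:"
  int end = args.indexOf(',', start);       // value ends at the next comma...
  if (end < 0) end = args.length();         // ...or at the end of the string
  return args.substring(start, end).toInt();
}

// Registered with Spark.function("setColor", setColor); in setup()
int setColor(String args)
{
  int red   = valueFor(args, "red");
  int green = valueFor(args, "green");
  int blue  = valueFor(args, "blue");
  // ...apply red/green/blue to the LED strip...
  return (red < 0 || green < 0 || blue < 0) ? -1 : 1;
}

Full JSON parsing on the Core is heavier; for a handful of keys, simple indexOf()/substring() scanning like this is usually enough.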


Thanks for your answers.
Indeed, I am looking into something like @Moors7 described: passing RGB values, but also additional information like which LED strip to control, what kind of action the LED strip should perform, on/off signals, etc.

I have a few sketches doing what you want... I'm sending multiple 'commands' to the Spark. I created a class to decode String messages. Here is an example:

In Header:

class httpCommand {
  String argument;                 // the raw string passed to the Spark.function()
public:
  void extractValues(String);
  String mssgCommand (void) {      // text between "command#" and "#text="
    return argument.substring(argument.indexOf("command#") + 8, argument.indexOf("#text="));
  }
  String mssgText (void) {         // text between "#text=" and "#value0="
    return argument.substring(argument.indexOf("#text=") + 6, argument.indexOf("#value0="));
  }
  int mssgValue0 (void) {          // integer between "#value0=" and "#value1="
    return (argument.substring(argument.indexOf("#value0=") + 8, argument.indexOf("#value1="))).toInt();
  }
  int mssgValue1 (void) {          // integer between "#value1=" and the trailing "?"
    return (argument.substring(argument.indexOf("#value1=") + 8, argument.indexOf("?"))).toInt();
  }
};
//
void httpCommand::extractValues (String stringPassed){
  argument = stringPassed;         // store the message for the accessors above
}

In Setup:

Spark.function("httpRequest", httpRequest);

It expects me to send the Spark a string of this form:

params=command#MY_COMMAND#text=MY_TEXT#value0=MY_INT#value1=MY2ND_INT?

It is certainly extensible...
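For example, a fully populated message using the "ledStatus" command (the text and numbers are made-up samples) would look like:

params=command#ledStatus#text=hello#value0=1#value1=0?

If I remember the Spark Cloud REST API right, that whole string goes into the args field when you POST to /v1/devices/{your device id}/httpRequest.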

I sort incoming commands in the Spark.function() handler like this:

int httpRequest(String mssgArgs)
{
  DEBUG_PRINTLN("command received...");
  oldData = false;
  lastDataTransmitTime = millis();
  httpCommand command;
  command.extractValues(mssgArgs);
  DEBUG_PRINTLN(command.mssgCommand());
  boolean badMessage = true;
  for (int i = 0; i < NUMBER_OF_MESSAGE_TYPES; i++)
  {
    if (command.mssgCommand().equals(messageType[i]))
    {
      DEBUG_PRINTLN("Valid Message Received...");
      DEBUG_PRINTLN("Message type:");
      DEBUG_PRINTLN(command.mssgCommand());
      DEBUG_PRINT("mssgText = ");
      DEBUG_PRINTLN(command.mssgText());
      DEBUG_PRINT("mssgValue0 = ");
      DEBUG_PRINTLN(command.mssgValue0());
      DEBUG_PRINT("mssgValue1 = ");
      DEBUG_PRINTLN(command.mssgValue1());
      updateVariables(i, command.mssgText(), command.mssgValue0(), command.mssgValue1()); // Pass decoded variables to function
      badMessage = false;
      break;                       // command matched; no need to keep searching
    }
  }
  if (badMessage)
  {
    DEBUG_PRINTLN("non-conforming message attempt...");
    return -1;
  }
  else return 1;
}

against an array of commands like this:

In the header:

#define NUMBER_OF_MESSAGE_TYPES 19   // must match the number of entries below

const String messageType[NUMBER_OF_MESSAGE_TYPES] = {
  "ledStatus", "alarmState", "garageState", "guestGarageState", "weatherCondition",
  "outsideTemp", "outsideHumid", "airconSetpoint", "weatherForecast", "messageData",
  "todayHigh", "todayLow", "windSpeed", "windDirection",
  "relayState", "brightLevel", "emailCount", "resetSpark", "resetHour" };

Hi @BulldogLowell,

I think this is an interesting solution. I haven't had time to test it yet, but do I understand correctly that you don't use the Spark Cloud at all, and instead send the HTTP request directly to the Spark?

No, I am using the Spark Cloud with a poorly named function :frowning: that was a carryover from the same program I wrote for my Ethernet-connected Arduino.

Sorry to confuse you... lazy on my part.