Error: invalid conversion from 'int (*)()' to 'int (*)(String)' [-fpermissive]

#### I am not sure what is going on and why I am getting this error on such a simple function.
#### Here is the code:

int testFunction(){
    digitalWrite(D3, HIGH);
    delay(500);
    digitalWrite(D3,LOW);
    return 1;
}

void setup(){
    pinMode(D3,OUTPUT);
    Spark.function("tests", testFunction);
}

#### Here is the entire stack trace:

In file included from ../inc/spark_wiring.h:29:0,
from ../inc/application.h:29,
from the_user_app.cpp:2:
../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning "Defaulting to Release Build" [-Wcpp]
#warning "Defaulting to Release Build"
^
the_user_app.cpp: In function 'void setup()':
the_user_app.cpp:10:38: error: invalid conversion from 'int (*)()' to 'int (*)(String)' [-fpermissive]
return 1;
^
In file included from ../inc/spark_wiring.h:33:0,
from ../inc/application.h:29,
from the_user_app.cpp:2:
../inc/spark_utilities.h:109:14: error: initializing argument 2 of 'static void SparkClass::function(const char*, int (*)(String))' [-fpermissive]
static void function(const char *funcKey, int (*pFunc)(String paramString));
^
make: *** [the_user_app.o] Error 1

Ready.

#### Also a suggestion: make the traceback less verbose.

Your function should take a String parameter, like this:

int testFunction(String dmy)
{
  ...
}
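
Applied to the code from the original post, the corrected sketch would look roughly like this (the String argument is simply ignored):

// Same body as in the question, but the handler now matches the
// int (*)(String) signature that Spark.function() expects.
int testFunction(String dmy)   // argument is accepted but not used
{
    digitalWrite(D3, HIGH);
    delay(500);
    digitalWrite(D3, LOW);
    return 1;
}

void setup()
{
    pinMode(D3, OUTPUT);
    Spark.function("tests", testFunction);   // now compiles
}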


Actually, your topic title didn’t reflect the actual error message, which made it even more obscure.
So I’ve “corrected” it, and the error does actually tell you what was wrong:

invalid conversion from 'int (*)()' to 'int (*)(String)'

This tells you clearly that you are trying to convert a pointer to a function that returns an int and takes no parameters ( int (*)() ) into a pointer to a function that returns an int but expects a String argument ( int (*)(String) ).

In code

// you do
int testFunction();
// instead of 
int testFunction(String dmy);

A forum search for [-fpermissive] turns up several similar questions.


Do Spark functions have to take a parameter of type String? If so, is using the parameter optional?

Yes, the functions must accept a String parameter, but you don’t need to use it and you don’t even have to provide it (a NULL pointer would be accepted too).
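
For illustration, here is a hypothetical handler that does inspect the argument (the "on"/"off" command strings are made up for this example, not taken from the thread):

// Hypothetical example: the String holds whatever the caller passed
// through the cloud API, so the handler can act on it or ignore it.
int ledControl(String command)
{
    if (command == "on") {
        digitalWrite(D3, HIGH);
        return 1;
    }
    if (command == "off") {
        digitalWrite(D3, LOW);
        return 0;
    }
    return -1;   // unrecognized command
}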


That's a good suggestion! The output is provided by the compiler, and for cases like this the real nub of the problem is buried in many lines of error messages. We could parse and rework the compiler output to make it more friendly, in particular, we could catch common cases like this, and provide a clearer message with a link to a help page:

the function passed to Spark.function() should be like 'int myfunc(String)'.

Cloud API docs: Spark.function()

cc: @suda
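
A rough sketch of what such post-processing could look like, assuming the build service can intercept the raw compiler output (the helper and the matched pattern below are purely illustrative, not the actual cloud compiler code):

#include <iostream>
#include <regex>
#include <string>

// Hypothetical helper: scan raw compiler output for a known error pattern
// and return a friendlier hint with a pointer to the docs.
std::string friendlyHint(const std::string &compilerOutput)
{
    static const std::regex sparkFunctionSignature(
        "invalid conversion from 'int \\(\\*\\)\\(\\)' to 'int \\(\\*\\)\\(String\\)'");

    if (std::regex_search(compilerOutput, sparkFunctionSignature)) {
        return "The function passed to Spark.function() should look like "
               "'int myfunc(String)'. See the Cloud API docs for Spark.function().";
    }
    return "";   // no known pattern matched; fall back to the original output
}

int main()
{
    std::string output =
        "error: invalid conversion from 'int (*)()' to 'int (*)(String)' [-fpermissive]";
    std::cout << friendlyHint(output) << std::endl;
}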

Yes, a clearer error message along with a relevant link would save time and be more user-friendly.