How to point Spark.function to a class' function?

Heya dear community!

To keep an upcoming library as structured and magic as possible, I would love to set up a Spark.function accessor from inside a class, pointing it to one of the class's member functions.

I ran into an issue with function pointers, which I would love to ask you C++ ninjas to help out with :smile:

Demo code:

class foo {
    int call(String foobar) { return 0; }
    void begin() {
      Spark.function("call", call); // fails: call is a non-static member function
    }
} bar;

void setup() { bar.begin(); }
void loop() {}

Which throws the following error:

../inc/spark_utilities.h:107:14: note: static void SparkClass::function(const char*, int (*)(String))
static void function(const char *funcKey, int (*pFunc)(String paramString));
../inc/spark_utilities.h:107:14: note:   no known conversion for argument 2 from '<unresolved overloaded function type>' to 'int (*)(String)'
make: *** [test.o] Error 1

Anyone knows how to make an unresolved overloaded function type compatible with a function pointer int (*)(String)?

Is it generally possible to pass function pointers to class member functions, if the receiver isn't aware of the class the pointed-to function is defined in? Example: SparkClass::function(…, int (void::*pFunc)(String))?

Thanks and love! :heart:

So a function handle to the method “call” is useless without an object (an instance of the class), and at call time you don't have an object yet. You could use a static method instead, since that does not depend on an instance of the class.

See this C++ FAQ for a technique to wrap the method that does work:
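To see why the static route compiles, here's a minimal sketch. Since the Spark firmware isn't available off-device, it uses std::string as a stand-in for Wiring's String and a made-up mockRegister() with the same signature as Spark.function — both are assumptions for illustration:

```cpp
#include <string>
using String = std::string;  // stand-in for Wiring's String off-device

// mock of Spark.function's signature: it wants a plain int(*)(String)
static int (*registered)(String) = nullptr;
void mockRegister(const char* /*funcKey*/, int (*pFunc)(String)) {
    registered = pFunc;
}

class foo {
public:
    // static: no hidden 'this' pointer, so it converts to int(*)(String)
    static int call(String foobar) { return (int)foobar.length(); }
    void begin() { mockRegister("call", call); }  // compiles now
} bar;
```

After `bar.begin()`, the cloud side would invoke `registered("abc")` and get 3 back — but note the static method can only reach static members of foo.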


Thanks for the advice!!

Mkaaay, so while there is a way for static methods (which can only touch static members), non-static instances would still require a wrapper function.
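For the record, that non-static wrapper looks something like this — a sketch with std::string standing in for String, and the global instance and wrapper names made up for illustration:

```cpp
#include <string>
using String = std::string;  // stand-in for Wiring's String

class foo {
public:
    int calls = 0;  // per-instance state the member function can touch
    int call(String foobar) { calls++; return (int)foobar.length(); }
} bar;

// free-function wrapper with the int(*)(String) signature Spark.function
// expects, hard-wired to the global instance 'bar'
int callWrapper(String foobar) { return bar.call(foobar); }
```

The downside is obvious: the wrapper is bound to one specific instance at compile time, so you need one wrapper per instance.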

Lesson learned :smile:

I have wanted this feature too. Typically the way it's solved is that registration functions like attachInterrupt() and Spark.function() take at least two arguments: a function pointer and a void pointer, like this:

class MyClass {
public:
    static int callback(String arg, void* pv) {
        return ((MyClass*)pv)->handle(arg);
    }

    virtual int handle(String arg) {
        return 42;
    }
};

MyClass my_class;

Spark.function("myfunc", MyClass::callback, &my_class);

You see the Spark.function registration takes an extra void* argument, which is data that is passed back to the registered function (in addition to the usual parameters).

This data allows the function call to be further parameterized, such as selecting which object we want to invoke.
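Stripped of the Spark specifics, the trampoline pattern can be exercised on its own. Note the three-argument registration doesn't exist in the firmware yet — mockRegister() below is a hypothetical stand-in, and std::string substitutes for String:

```cpp
#include <string>
using String = std::string;  // stand-in for Wiring's String

// hypothetical extended registration: function pointer plus context pointer
static int (*registeredFn)(String, void*) = nullptr;
static void* registeredCtx = nullptr;
void mockRegister(const char* /*key*/, int (*fn)(String, void*), void* ctx) {
    registeredFn = fn;
    registeredCtx = ctx;
}

class MyClass {
public:
    // static trampoline: recover the instance from the context pointer
    static int callback(String arg, void* pv) {
        return ((MyClass*)pv)->handle(arg);
    }
    virtual int handle(String) { return 42; }
};

MyClass my_class;
```

After `mockRegister("myfunc", MyClass::callback, &my_class)`, the stored pair `registeredFn("x", registeredCtx)` dispatches through the virtual handle() of exactly the instance that was registered — which is the whole point of carrying the void* along.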

I filed an issue to keep track of this.