Passing an object to a function => Compiler error? [SOLVED]

I think the following code should compile

class MyClass
{
    public:
        int var;
};

void MyFunction(MyClass *p)
{    
}

void setup() {
}

void loop() {
}

Yet I get a compile error:

objectparam.cpp:4:17: error: variable or field 'MyFunction' declared void
  void MyFunction(MyClass *p);
  ^
objectparam.cpp:4:17: error: 'MyClass' was not declared in this scope
objectparam.cpp:4:26: error: 'p' was not declared in this scope

Any ideas how to fix this?

Thanks
Mark


I’ve edited your post to properly format the code. Please check out this post, so you know how to do this yourself in the future. Thanks in advance! ~Jordy

Hi @beharrell

The Spark preprocessor, which lets you skip certain C/C++ requirements (such as writing function prototypes), does not handle classes in a sketch file well. You can move your class into separate .h and .cpp files, or you can turn the preprocessor off with this:

#pragma SPARK_NO_PREPROCESSOR
#include "application.h"

Since the preprocessor handles function prototypes and forward references for you, you might have some code reordering or other changes to make as well.


Thanks for the response.
I tried both suggested methods.

The SPARK_NO_PREPROCESSOR method works for me,
but moving the class into a separate header file fails,
i.e.
The header file (TheClass.h)

#ifndef THECLASS_H
#define THECLASS_H
//#include "application.h"

class MyClass
{
    public:
        int var;
};

#endif

The .ino file:

// This #include statement was automatically added by the Spark IDE.
#include "TheClass.h"

void MyFunction(MyClass *p){
}

void setup() {
}

void loop() {
}

Still gives the same error.

Solved by placing the function definition into the header file as well, i.e.

#ifndef THECLASS_H
#define THECLASS_H

class MyClass
{
    public:
        int var;
};

void MyFunction(MyClass *p)
{
}

#endif

Thanks for the help
Mark