Expose custom C++ class from core-firmware

I am trying to build a custom class in C++ to be used by multiple sketches. The C++ sources have been built with core-firmware and flashed successfully.

However, when I try to use the custom class in the Spark web IDE, I get a compilation error such as:

error: ‘CustomClass’ was not declared in this scope

It looks like the custom class, although compiled successfully, has not been exposed and therefore cannot be used by user-level sketches yet.

I have been following existing examples such as the SparkClass in spark_utilities.h and spark_utilities.cpp, and included the header file of the custom class in application.h in core-firmware/inc/. But so far these symbols are not accessible to user-level code in the Spark web IDE.

Am I on the right path or should I try a different approach?

Again, if anyone has tips on how to develop a C/C++ library for the Spark, I'd appreciate any info you might send my way.

Cheers!

With the web IDE, you can paste the contents of your .h file at the top of your application (before setup and loop) and the contents of your .cpp somewhere after that, as sketched below. Multi-file support for the IDE should be coming soon.
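For example, with a made-up Counter class (purely for illustration), the pasted single-file sketch might look like this:

// --- contents of Counter.h, pasted above setup() and loop() ---
class Counter {
public:
    Counter();
    int next();
private:
    int _value;
};

Counter counter;

void setup() {
    Serial.begin(9600);
}

void loop() {
    Serial.println(counter.next());
    delay(1000);
}

// --- contents of Counter.cpp, pasted after ---
Counter::Counter() : _value(0) {
}

int Counter::next() {
    return ++_value;
}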

Hi @mhe,

The pre-processor does a few things which might interfere with putting classes or defining other new types in the same source file. What it wants to do is provide function declarations for you when they’re missing, but it’s not smart enough to put them after your class. There are two easy ways you can work around this, before multiple-file support is rolled out:

Option #1.) If you have functions that use your class, make sure you provide your own function declarations:

error:

class MyNewClass { ... };
void doCoolThings(MyNewClass foo) { ... }

great success:

class MyNewClass { ... };
void doCoolThings(MyNewClass foo);
void doCoolThings(MyNewClass foo) { ... }
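Put together, Option #1 in a complete sketch might look something like this (Blinky is a made-up class, purely for illustration; D7 is the Core's on-board LED pin):

class Blinky {
public:
    int pin;
};

// our own declaration, so we don't rely on the one the pre-processor generates
void blinkOnce(Blinky b);

void setup() {
    pinMode(D7, OUTPUT);
}

void loop() {
    Blinky b;
    b.pin = D7;
    blinkOnce(b);
}

void blinkOnce(Blinky b) {
    digitalWrite(b.pin, HIGH);
    delay(250);
    digitalWrite(b.pin, LOW);
    delay(250);
}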

Option #2.) We should be adding multiple-file support this week or early next week, which will make this easy and behave more like you’d expect.

Option #3.) You can disable the pre-processor entirely, but then you need to provide all your own function declarations and stuff. You can do this by adding exactly this line to your source file:

#pragma SPARK_NO_PREPROCESSOR
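With the pragma in place the file is treated as plain C++, so you also need to #include "application.h" yourself and declare everything before you use it. A minimal sketch under those assumptions (MyNewClass and doCoolThings are just illustration):

#pragma SPARK_NO_PREPROCESSOR
#include "application.h"

class MyNewClass {
public:
    int value;
};

// no auto-generated prototypes, so this declaration is on you
void doCoolThings(MyNewClass foo);

void setup() {
    Serial.begin(9600);
}

void loop() {
    MyNewClass foo;
    foo.value = 42;
    doCoolThings(foo);
    delay(1000);
}

void doCoolThings(MyNewClass foo) {
    Serial.println(foo.value);
}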

Hope that helps!

Thanks!
David

Hi @Dave and @weakset,

Thanks for your suggestions. My goal is to write most of the (reusable) code in C/C++ while keeping the sketch file minimal and free of lower-level implementation details. So essentially I am trying to mix Arduino/Wiring with C/C++. I am not sure whether the upcoming multi-file support in the Spark IDE will cover that, because the multiple files in this case are not written in Arduino/Wiring.

Is there a way to make #include "CustomHeader.h" (a C/C++ header file) work in the Spark IDE? If possible, can we know how the Spark or Serial object is exposed to user-level code, so that we can use Spark.xxxx() and Serial.xxxx() in our sketches?

Thanks!

Hi @mhe,

Totally! I think the plan is to roll out multiple file support very soon; it's just going through some final testing. You can definitely do #include "MyAwesomeLib.h" type stuff with it in the IDE.

The serial stuff is here and here:
https://github.com/spark/core-firmware/blob/master/inc/spark_wiring_usbserial.h and https://github.com/spark/core-firmware/blob/master/src/spark_wiring_usbserial.cpp

The Spark. stuff is here: https://github.com/spark/core-firmware/blob/master/src/spark_utilities.cpp#L135 and here https://github.com/spark/core-firmware/blob/master/inc/spark_utilities.h
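The short version of the pattern in those files: the class is declared in a header, a single global instance is declared extern in that header and defined once in the .cpp, and the header is pulled in via application.h so every sketch sees the same object. A hypothetical CustomClass following that pattern (names made up, not the actual firmware code):

// custom_class.h -- included from application.h in core-firmware/inc/
#ifndef CUSTOM_CLASS_H
#define CUSTOM_CLASS_H

class CustomClass {
public:
    void doSomething();
};

extern CustomClass Custom;  // declared here, visible to every sketch

#endif

// custom_class.cpp -- built as part of core-firmware
#include "custom_class.h"

CustomClass Custom;  // the single shared instance

void CustomClass::doSomething() {
    // implementation lives in the firmware, not the sketch
}

A sketch that includes application.h can then call Custom.doSomething() the same way it calls Spark.variable() or Serial.println().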