I have a bunch of classes that register Spark.functions in their constructors. Up until now, I've been instantiating these classes in setup() and pushing them into a std::deque defined at the top of my .ino, and it has been working as expected: all of those Spark.functions show up when I run particle list from the command line.
Now I'd like to use another Spark.function to add items to the deque dynamically at runtime. I have that function written, and it seems to be managing the objects correctly, but the Spark.functions for the objects added to the deque at runtime no longer show up.
Do all Spark.functions need to be defined in setup() (i.e. I just need to suck it up and build a router function), or am I missing something that would make this work?
Related question: If it's the latter, is there a way to remove Spark.functions as well? I would have thought they'd vanish when the object destructs, but I also expected these Spark.functions to register dynamically, so… ¯\_(ツ)_/¯