Unified libraries

Do we have anything we can use to add our :spark: instructions without having to modify the existing libs? Something like:

#ifdef SPARK_CORE
// spark core init
#else
// the previous code
#endif // SPARK_CORE

I believe it would be useful for a community that tries to bridge Arduino and Spark to have a common set of libraries capable of being ported back and forth between the two platforms… obviously only whenever this is possible due to licensing and implementation :smiley:

I believe this would be very nice for us all


Outstanding idea. We’ll do this ASAP. Thanks!


Apparently @zachary understood you perfectly! I still have a question though:

Do you mean something like how the Arduino libraries have version defines so they know which code to compile:

#if (ARDUINO >= 100)
 #include <Arduino.h>
#else
 #include <WProgram.h>
 #include <pins_arduino.h>
#endif

Because the Spark Core currently has USE_SPARK_CORE_V02 and USE_SPARK_CORE_V01 defines already.

#ifdef ARDUINO
  #if (ARDUINO >= 100)
   #include <Arduino.h>
  #else
   #include <WProgram.h>
   #include <pins_arduino.h>
  #endif
#elif defined(USE_SPARK_CORE_V02)
  #include <spark_related_stuff_v2.h>
#elif defined(USE_SPARK_CORE_V01)
  #include <spark_related_stuff_v1.h>
#endif

…or any other code you want to optionally define at compile time.

I would be surprised if Adafruit / Sparkfun currently allowed pull requests for Spark Core support in their libraries, but that doesn’t mean you can’t fork them and add support for the Spark Core while retaining support for the Arduino. It’s just a lot more work.

Good call @BDub! I'd forgotten we have those in there. It's on the backlog to remove the USE_SPARK_CORE_V01 definition and all the code related to it. But once that's removed, only V02 is left and there's no need for the define at all.

So, two things:

  1. Anyone can do this now using USE_SPARK_CORE_V02, and
  2. in the near future we'll get rid of that and add a SPARK_CORE definition, defined to an integer representing some notion of version, similar to Arduino's ARDUINO define.

I thought we were trying to set up some sort of library repository where we would be able to port existing libraries to the :spark:, something like the Arduino libraries.
If that was the intention, I believe it would be better if we could provide those libraries in a portable format (whenever possible), so that :spark: developers will be able to produce code without having to worry too much about the target platform. The idea might be a utopia, but from what I've seen so far most of the libraries should be able to adopt this convention quite easily.

I wasn’t aware of the existence of the USE_SPARK_CORE_V0? definitions; in fact I was asking if something like that was already available in my first post, so thanks a million for pointing that out. I believe point 2 from @zachary is going to be the better long-term solution, but I don’t see it as urgent considering all the other stuff.

Regarding the library repository, I believe we need an official GitHub repository (something like contrib-libraries) into which we can put all those libraries, maybe with a name qualifier (compat?) to declare whether the library is back-portable to Arduino or not… Just thinking out loud here; I don’t have enough experience with C programming to know what the difficulties are…

Some of the problems with just dropping in an Arduino library are handling all of the #defines, includes, macros, and hardware-specific stuff that Arduino does with the AVR.

The first two are easy enough… the macros get harder when they involve things like port manipulation or PROGMEM access, and the hardware-specific stuff is the hardest because you are not going to want to detect hardware commands for the Arduino and substitute STM32 commands. Most likely we’ll have to translate these blocks of code directly and wrap them with #ifdef ARDUINO vs. SPARK_CORE.

I do see some merit in leaving libraries backwards compatible with the arduino, but it would be nice if we could think of a good way to merge in changes from upstream repositories such as Adafruit/Sparkfun/etc. to make the arduino parts of the library easy to maintain (and not our problem when the break). Unfortunately my history with merging has been a constant hand editing fight, so I don’t look forward to doing it… which is why all of the libraries I’ve converted so far are meant to work on the spark core only (rightly or wrongly). It has been all about how can I get XYZ working as quickly as possible to give the community the tools to make projects with the spark core. Taking a step back, it would be worth solving some of the above questions before we convert too many more libraries :smile: I now have a lot of code to separate into .cpp .h files again as it is. I know I said this about two months ago as well… but time waits for no one!

The problem I have with the #ifdef SPARK_CORE is that it’s defined in application.h, so the only way (that I can think of) to conditionally include application.h is:

#ifdef ARDUINO
#else
  #include "application.h"
#endif

Which seems a bit mucky; I’d rather do:

#ifdef SPARK_CORE
  #include "application.h"
#endif

But that would involve moving #define SPARK_CORE (1) into main.h or something.

Hi @sej7278,

This is changing soon! SPARK_CORE is going to become SPARK, and the define is moving to the makefile.

That should work better.


Ah lovely, thanks for the heads up. I thought there had to be a better way to do it.