Library import flattens directory structure? [WORKAROUND]

Because of FastLED’s complexity and the number of platforms it supports, I have split all of the platform-specific files out into subdirectories. Unfortunately, it appears that when I import the library into the Spark IDE it flattens the entire file and directory hierarchy, so an include statement like:

#include "platforms/arm/stm32/led_sysdefs_arm_stm32.h"

results in file-not-found errors. This happens when trying to import the library.

I have been able to build/test the library locally, but it’s a little opaque what is happening with the library on the IDE side. Can the Spark Core IDE not handle libraries with a directory structure?

Temporarily worked around - it’s not a long-term solution for the library, as right now I’m keeping a completely separate tree from the main library and will have to pull over code/features manually. But it sounded from another thread like the library side of the world is going to be changing soon.

I’m not sure if this would work, but usually you need to add the library name as the “home” directory of your library.

e.g. like this

#include "FastLED/platforms/arm/stm32/led_sysdefs_arm_stm32.h"

This is definitely required for libraries that live in a flat path, but it might also help with your hierarchy.

(N.B. the include statements I was referring to as not working were include statements inside the library, not include statements being made in applications)

That was one of the first things that I tried, but it didn’t work. I’ve temporarily moved all the stm32-specific files up into the root directory of the library. However, so far this is the only platform/build environment I work with that breaks the library’s layout/include structure. (And the number of platforms that I’m supporting is only going to go up. I’m currently at a couple of AVR variants and 5 ARM variants, with 2-3 more ARM variants waiting in the wings, plus a couple of non-ARM platforms - and that’s not counting more platform-specific features like parallelized outputs, DMA-driven outputs, etc…).

So far, for the most part, it looks like people are doing one-shot ports of libraries for Spark Core, and the IDE library setup seems just fine for that. FastLED, though, is an active, multi-platform/architecture library with a lot of ongoing development and expansion (and a pretty aggressive roadmap for the next year); having to continually and manually massage the structure/layout of the library every time I want to roll out to Spark Core/Photon is a fair bit of friction (doubly so considering that the library layout the CLI upload tools expect seems inconsistent with what the IDE does).

There’s a handful of things that, as someone who works on a library and frequently goes through multiple rounds of uploads/testing (whether it’s testing the variety of supported LED chipsets, the various types of output possible, etc…), I’d like to see to make working with the Spark environment smoother:

  • More parity between the functionality/layout of CLI-based development and IDE-based development. The web-based/remote upload through the IDE, while convenient, is a very long cycle (especially for rounds of testing that consist of upload, run a 5-second test with a logic probe grabbing data, tweak, and repeat - when it’s 30-60s per upload, I’m spending all my time waiting).
  • More parity with how other build environments (including hand-grown make environments) reference libraries. In a make environment, it is enough for me to add -I/Path/to/FastLED for everything to work; Arduino effectively does this behind the scenes.
  • Support for hierarchy in a library, and an ability to see what layout of the library Spark Core is seeing.
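To illustrate the second bullet: in a plain make environment, a single include path is all that’s needed for nested library headers to resolve. A minimal sketch (the compiler, paths, and target names here are placeholders, not an actual FastLED build recipe):

```make
# Hypothetical Makefile fragment - point the include path at wherever
# FastLED is checked out; nested includes like
# "platforms/arm/stm32/led_sysdefs_arm_stm32.h" then resolve naturally.
# (Note: make recipe lines must be indented with a tab.)
CXX      := g++
CXXFLAGS := -I/path/to/FastLED

demo: demo.cpp
	$(CXX) $(CXXFLAGS) -o $@ $<
```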

ETA: I’m dumping all this out while it’s fresh in my head, having just come off a 48-hour push porting FastLED to two new platforms. I’m hoping to continue to support the Spark Core/Photon platforms - it’ll be good to have an answer when people are looking for connected MCUs with a bit more oomph than Bluetooth gives them.

I agree!
The library support on the Web IDE could do with some improvements as you outlined here.
Maybe this would be a thing @Dave could chime in, since I think he’s the one who has got Web IDE under his wings.

But as for comparing the Arduino IDE’s library support with the Web IDE’s, these two are a completely different kettle of fish.
While the Arduino IDE (and other build environments) is completely local and under your own “home control”, Spark’s Web IDE has to be managed centrally, without actually knowing what cross-references between individual libraries and applications users might produce or expect. So, as a starting point, some restrictions may be imposed to make things work first - these can be dropped/loosened as the project progresses.

But in general - as already stated - I’m with you that in the long run there should be practically no difference in library support between the Web IDE and local building.

And for the time being it would be great if there were a #define WEB_IDE_BUILD or similar, which would let us keep one code base for building both locally and via the Web IDE.

This way we could make do with something like

#if defined(WEB_IDE_BUILD)
#include "MyLib/MyLib.h"
#else
#include <MyLib.h>
#endif

Hey All,

We do define SPARK when you build in the web IDE / CLI I believe, so that could help with setting up the preprocessor. I’ll ping @jgoggins, and @suda since they’ve been working on the IDE and Spark Dev more than myself. :slight_smile:


Thanks @Dave for responding and relaying my ping :wink:

To save Wojtek and Joe some reading:
It would be nice not only to distinguish between SPARK and not, but also between the Web IDE and Spark Dev/CLI, since the Web IDE requires a different folder hierarchy for #include than the others.

Hi @ScruffR,

yeah, this is a known issue. But instead of adding workarounds (like #defining the environment), it would be better for all developer-facing tools to share the same way of importing. It’s something we have to figure out soon.

As for the workaround: SPARK is defined in both Build and Dev, so it wouldn’t really help here.

I know :anguished:

But since @Dave mentioned SPARK in his post, I tried to incorporate it and point out that there would still be a need for something else.
Sure, the best would be one behaviour across all dev environments.

@ScruffR @Dave @mdma

Any update on this? Do I absolutely need to flatten my directory structure in order to support the Web IDE? I have an intricate directory structure within my project (and it builds easily when I compile with make and flash using dfu-util).

Do I have to flatten this (it’s going to be a lot of work, and at the same time I would lose the ability to build for another platform)? Are there any build flags I can use with #ifdefs for Web IDE support?

Any help is sincerely appreciated.

Don’t put your finger in open wounds :weary:
I’ve been proposing this idea for ages but have always been put off with the promise of a one-for-all IDE solution - I’m still waiting.
Had it been introduced shortly after I first mentioned it, it would have saved a lot of people some work and grief.

Just look around this forum to see how often people stumble over incompatible #include statements in libraries that trip up either Particle Build or Particle Dev :tired_face:

I hope there is a way around this, this is very important to me and I am sure to a few others! I don’t want to render my code base unusable.


Just chiming in that I’m bumping this one again with the library folks.


@jersey99 we started working on implementing a new version of libraries (which won’t have this limitation and will unify how they work in both the local and web IDEs). It took us into new areas, like revisiting the cloud compiler infrastructure, which made it stretch longer than we planned. I feel your pain and we’re working on mitigating it :smile:

Any updates on this? (The Spark Core version of the FastLED library has fallen pretty far behind the main library repo - and the need to flatten the structure is causing problems again.)

Any chance of Spark and Arduino unifying their requirements for library layout? (The need to maintain a separate directory layout, organization, and repo has effectively made the Spark/Photon platforms bastard stepchildren.)

(This check-in brought to you by dealing with some Photon-specific issues and remembering why I don’t have this platform in my regular library testing/dev rotation.)

You actually can keep the structure now; you just need to provide a particle.include file to tell the CLI which (“nested”) files to upload.
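As a sketch of what that could look like - assuming, as I understand it, that particle.include takes one glob pattern per line relative to the project root (these particular paths are just illustrative, not FastLED’s actual layout):

```
platforms/**/*.h
platforms/**/*.cpp
```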

Can particle.include also be used for submitted libraries - or is it only for code directly in the project?

Nope, you have to download the files into your project folder.
The subfolder you place the files in has to be named and structured as it is on Build; otherwise you’d need to adapt all the include statements accordingly.

Does Build allow libraries to have nested structures yet? The Spark version of FastLED is significantly behind the mainline version because I’m using nested directories to manage the code/structure, and I don’t want to also have to juggle changing the entire directory structure for Particle, so I’ve been holding off. (The base directory layout being different from what most other environments use - and Particle being rigid about it - is enough of a thing to work around.)

I’m trying to find an answer to this myself, it’s disappointing that I can’t use the latest FastLED features on the Photon, but I completely understand why you don’t want to manage two code bases when one should do.

Back in October:

From last week - looks like this isn’t supported yet:

The library example repo has a V2 branch, but hasn’t been updated since September:

It uses a manifest file like Arduino libraries do, but the README doesn’t specifically mention nested folders. Fingers crossed that they release this soon and include nested-folder support.
