Library repository

I often read posts here in the community about libraries and whether they are compatible with the Spark Core.

Would it be a good idea to create a site to manage all the compatible libraries? I think there should be a single place to get Spark libraries in their latest versions. There I could search for a library, see whether it has been ported to the Spark Core yet, see if someone is currently porting it, or port it myself and contribute the result to the site. That way I'd have an overview of each library's status, could leave a comment or a bug report, and could thank the author.

For now, if I search for a library, I have to search on Google or this forum, and I may end up on a server I don't know whether to trust, or be unsure whether newer versions are out in the wild.

I know creating such a site can be a lot of work, and you guys are busy with the Spark Core itself, but maybe we could start small and add features over time… for example, a new category in this forum to collect all libraries and library requests would be a starting point :slight_smile: And maybe a community member would be willing to build such a site, or maybe software for exactly this case already exists?

What do you think about this?



I think it’s a great idea. Given that Arduino libraries aren’t automagically compatible with Spark Core we need to know which libraries have been ported/certified and can be used. Often library availability drives purchase of peripherals so we know which sensors, LCD displays, and whatnot to reuse from our workbench or go buy somewhere. I’m already seeing multiple forum threads about this (1-wire support, LCDs to use, etc.).

People are also volunteering to help port, so tracking that and getting community-valued libraries operational soon would be very, very useful.

And I’d be happy to beta test library ports if that was useful.



Thanks guys - this is a great suggestion. My first thought is that we could manage it with a Github repository (like we do with our docs) and accept pull requests for contributions.

We’ll discuss as a team and figure out the best way to implement this and provide a mechanism for community contributions ASAP. Thanks guys!


This sort of thing is going to get really out of hand as time goes on if it's not done well. Frankly, I don't see why there can't be a file/folder upload option in the Spark Cloud where users can maintain their own libraries that get linked against at compile time. I know I'll personally make good use of the ability to flash compiled code, so I can write code and link it locally.

Good programmers are going to want to break things up, and having everything managed through one repository won't allow this.

I’m not saying that there is no benefit to having a unified library system, I’m just more immediately interested in better support for a full C workflow.


Just want to add my big thumbs up. Excellent suggestion @dominikkv for one clear place for library compatibility and porting status. The Spark team will definitely discuss our options here, and you make a strong argument for this issue moving closer to the top of the priority list.

Thanks for the offer to beta test @Disquisitioner!

And @nixpulvis—thanks so much for the feedback—it's definitely one of our goals to make the Spark coding experience great for all programmers, no matter their level. We've been thinking that the folks you're describing will prefer to manage multi-file projects in their own text editor / IDE and have something like a command-line tool for the Cloud interactions, analogous to the heroku toolbelt, if you've ever used that. Thoughts on this?

I do have some experience with heroku, and I'm a huge fan of the way it works. A little while back I had an idea that this core makes possible: the Spark Cloud would host git repositories and run `make flash` (or something like it) on a post-receive hook.

Now, I can see that allowing people to upload their own makefiles could be really challenging for security reasons, but in practice you could just expose a single, very feature-rich makefile on the server side. This would be a very cool way to interact with larger projects.
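The push-to-flash idea above could be sketched roughly like this. Everything here is hypothetical: git feeds a post-receive hook one `<old-sha> <new-sha> <ref>` line per updated ref on stdin, and on a push to master the server would build with its own trusted Makefile. The `make flash` target is an assumption for illustration, not an existing Spark Cloud feature.

```python
# Hypothetical post-receive hook the Spark Cloud might run after a push.
import subprocess

def parse_ref_update(line):
    """Split one post-receive input line into (old_sha, new_sha, ref)."""
    old_sha, new_sha, ref = line.split()
    return old_sha, new_sha, ref

def handle_push(lines):
    """Build and flash when master was updated; return the refs seen."""
    refs = []
    for line in lines:
        _, _, ref = parse_ref_update(line)
        refs.append(ref)
        if ref == "refs/heads/master":
            # Run the server-provided Makefile; users never upload their own,
            # which sidesteps the security problem of arbitrary makefiles.
            subprocess.run(["make", "flash"], check=True)
    return refs
```

Installed as `.git/hooks/post-receive` on the server, this would turn `git push` into a build-and-flash cycle, much like heroku's push-to-deploy.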

I could put together a site that would be a nice little repository for Spark libraries and sketches. Basically a local Git instance on the backend for users to publish and manage their code. The front end of the site would let community members vote on and review the submissions, to keep quality under control.

Eventually you’d have it so forking a lib or sketch from another user’s hub would just copy it to your own, personal hub, where it could be edited in place (like the Spark Build interface baked right into the site) and then compiled and uploaded through the Spark Cloud API! However, unlike GitHub, these forks would not be published; they would be specific to you. (Think of it like downloading an Arduino library and extracting it into the ~/Arduino/libraries folder.) You could always make a bunch of changes and then publish it yourself, or submit changes back to the originating library/sketch.

You could also have the ability to #include these forked libs/sketches into one of your existing sketches. So, let’s say I’m working on a sketch called Thermostat; user Joe has a library called DS18B20 and user Bob has OneWire, both of which I’ve forked. While working on my Thermostat sketch, I could #include Library_Name at the top. When I hit the Flash button on the built-in editor, it parses the file for these include statements, then concatenates them into the sketch (behind the scenes) before uploading it through the Cloud API for compile and upload.
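The concatenation step described above might look something like this minimal sketch. The regex, the function name, and the dict-of-sources representation of a user's forked libraries are all assumptions for illustration; a real implementation would pull the sources from the user's hub instead.

```python
import re

# Hypothetical pre-compile step: find '#include "LibName"' lines in a
# sketch and splice in the matching forked library's source before the
# whole thing is sent through the Cloud API for compilation.
INCLUDE_RE = re.compile(r'^#include\s+"(\w+)"\s*$', re.MULTILINE)

def expand_includes(sketch_source, forked_libs):
    """Replace each quoted include with the library source from the
    user's personal hub (modeled here as a dict of name -> source)."""
    def splice(match):
        name = match.group(1)
        # Leave unknown includes untouched so the compiler reports them.
        return forked_libs.get(name, match.group(0))
    return INCLUDE_RE.sub(splice, sketch_source)

sketch = '#include "OneWire"\n#include "DS18B20"\nvoid loop() {}\n'
libs = {"OneWire": "// OneWire source here",
        "DS18B20": "// DS18B20 source here"}
print(expand_includes(sketch, libs))
```

This is essentially what the Arduino IDE does with its libraries folder, just moved server-side and wired to per-user forks.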

Man, that would be so awesome. I’m going to look into the technical challenges for it tonight, see what sort of web stuff is out there I could use to sort of hack this together…


We’re planning on getting something into the Spark Web IDE very soon that allows you to pull in example code easily. The tentative plan is to have the example code live in an open source git project so people can use it in either the Web IDE context or locally with local firmware build tools. Once we release this, one possible next step would be to figure out how to make it easy to do ‘pull requests’ from the IDE back out to the example repo. Perhaps we can come up with a mechanism that will also work for centralizing ported Arduino libraries like described above.

@timb , @nixpulvis : both awesome ideas, please keep them coming–we want to find a way to make these kinds of things happen!

Has any more work been done on a library repository?

Even just a massive GitHub repo of all Spark Core ported and created libraries, each either user-verified or verified by the core team (obviously user-verified would be the majority), would do. It would also allow small bugs in libraries to be fixed, or at least noted.

Yes! This is still in progress! :slight_smile: Hopefully should be available in the next few weeks or earlier.

Yay! It would be nice for those new to the Spark Core to have a single place for libraries which are not standard in the Web IDE (or hell, integrate a library chooser in there as well). And if a library isn’t there, either (a) it can become a request which may be fulfilled, or (b) some kind soul will see it and make/port one.
