User Personal Libraries (Spark CLI)

Hey guys! Great product. I am loving the easy WiFi connectivity, especially being able to program over WiFi.

I am an Arduino developer working with several clients who want the Spark Core to be the center of their prototype, and possibly even their final product. I moved away from the traditional Arduino development environment to BareArduinoProject – check it out; it lets you do all things Arduino from the command line, using a traditional project structure (with src, lib, bin, etc. folders).

I think this will be the way of the future for “serious Arduino developers”. Arduino is increasingly becoming a tool in industry and in profitable Kickstarters – and people would like code completion in their favorite editor (e.g. Eclipse, vim), library management, git integration, and the other features coders are used to.

I am trying to implement a similar thing with the Spark Core. Looking at your documentation for the CLI, I thought this would be fairly easy using spark.include files. This post documents some of my trials, plus some suggestions I have to make your CLI better.
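
For context, here is the project layout I am using (reconstructed from the paths referenced below):

project/
  lib/
    allduino/
      allduino.h
  src/
    blink/
      blink.cpp
      spark.include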

Here is the code:

allduino.h (located in lib/allduino/allduino.h – lets you compile for either Arduino or the Spark Core)

#ifndef __allduino_h
#define __allduino_h
// General include for all files.
// Has some basic macros and pulls in the proper Arduino/Spark headers.

#ifdef SPARK
 #include <application.h>
#elif (ARDUINO >= 100)
 #include <Arduino.h>
#else
 #include <WProgram.h>
 #include <pins_arduino.h>
#endif

// Your analog values are different! I will be adding values like this in the future
#ifdef SPARK
 #define ANALOGUE_MAX 4095  // the Core's ADC is 12-bit
 #define LED_PIN 7          // onboard LED on the Core is D7
#else
 #define ANALOGUE_MAX 1023  // AVR Arduino ADCs are 10-bit
 #define LED_PIN 13         // onboard LED on most Arduinos is pin 13
#endif

#endif
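
As an aside on why ANALOGUE_MAX is useful – here is a sketch of my own (the readFraction helper is purely illustrative, not part of any library): it scales a reading to the same 0.0–1.0 range on either platform.

#include <allduino.h>

// Hypothetical helper: analogRead() tops out at 4095 on the Core
// (12-bit ADC) but 1023 on AVR Arduinos (10-bit), so dividing by
// ANALOGUE_MAX gives a comparable fraction on both platforms.
float readFraction(int pin) {
    return analogRead(pin) / (float)ANALOGUE_MAX;
}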

blink.cpp (in src/blink/blink.cpp)

#include <allduino.h>

void setup(){
    pinMode(LED_PIN, OUTPUT);
}

void loop(){
    digitalWrite(LED_PIN, HIGH);
    delay(1000);
    digitalWrite(LED_PIN, LOW);
    delay(1000);
}

spark.include (also located in src/blink/spark.include)

../../lib/allduino/allduino.h

How I am compiling
I go to the src directory and type:
spark compile blink

It works!

Suggestions

  1. spark.include cannot list directories (i.e. ../../lib/allduino will not work!) – this makes it much more difficult to develop libraries, as I have to list each file individually!
  2. spark compile from within a directory should compile that directory (i.e. compiling from within the blink directory should be the same as spark compile ../blink).
  3. There should be a way to set a variable in the spark.include file, e.g. set LIB_DIR to ../../lib and then just write $(LIB_DIR)/allduino (see the sketch after this list).
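
For instance, a spark.include combining suggestions 1 and 3 might look like this – purely hypothetical syntax that the CLI does not support today:

LIB_DIR = ../../lib
$(LIB_DIR)/allduino
$(LIB_DIR)/DHT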

Otherwise this is a great interface, and I am happy you have gotten command line tools at least functional so quickly! Great work, and keep it up!

I’m now trying to integrate a past project and having serious issues.

Right off the bat, I am trying to include a DHT library. I have it listed in my spark.include file, but I get this error:

attempting to compile firmware 
pushing file: ../../lib/allduino/allduino.h
pushing file: ../../lib/DHT/DHT.cpp
pushing file: ../../lib/DHT/DHT.h

...

Errors
In file included from DHT.cpp:7:0:
DHT.h:3:22: fatal error: allduino.h: No such file or directory
 #include <allduino.h>
                      ^
compilation terminated.
make: *** [DHT.o] Error 1

Compile failed -  compile failed 

This is despite the fact that allduino.h is the first thing I include in my program, so you’d think it would have thrown an error right away!

Edit
I took this one step at a time, splitting up some libraries. It seems that your compiler has problems including more than one library (and letting one library use another). So, if I have the directory tree

lib/
  DHT/
    DHT.h DHT.cpp
  allduino/
    allduino.h
  usertools/
    *lots of stuff

and I try to compile, it doesn’t work. However, if I move allduino.h into usertools (and delete the allduino directory), then the files in usertools can use it fine – however, DHT still cannot use it.

What I ended up doing is adding DHT.h and DHT.cpp to my project folder by soft-linking them (ln -s ../../lib/DHT/DHT.h etc.) and removing any reference to them from spark.include – when I compile now, it works! (hooray?)
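
Concretely, from inside src/blink the workaround looks something like this (paths per my layout above):

ln -s ../../lib/DHT/DHT.h .
ln -s ../../lib/DHT/DHT.cpp .
cd .. && spark compile blink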

Summary
No library can reference any other library. I am still working out whether you can have more than one library in the first place.

you may want to have a look at my local build setup.

full disclosure: i’m one of the contributors to Arduino-Makefile, which BareArduinoProject is based on.

i’d actually like to get Spark working with arduino-mk, but the mass of repos and the toolchain required make it not an easy job – hell, even Arduino 1.5 doesn’t do a great job with 3rd-party ARM boards.

Thank you for sharing, and for the suggestions! Just want to make sure I open issues for these:

1.) There’s a discussion happening on this one here: https://github.com/spark/spark-cli/issues/62
2.) Opened this one: https://github.com/spark/spark-cli/issues/84
3.) Variables in the project file are interesting – let’s say environment variables with that syntax: https://github.com/spark/spark-cli/issues/85

Thanks,
David

Thanks, sej7278. It looks like that is a potential solution.

As for me, I have been doing it the “hackerish” way, by just soft-linking files from my library folder into my project. This requires you to change #include <DHT.h> to #include "DHT.h" – but it is a temporary fix and it works well enough for me (for now).
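
i.e. at the top of any file that uses the library:

// before – angle brackets search the build's include path:
#include <DHT.h>

// after soft-linking DHT.h next to the source file – quotes search
// the file's own directory first:
#include "DHT.h"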

Thanks, Dave, for opening those issues! From what I’ve seen so far, you guys are great at handling feedback and taking it into account. If I were you, I would work with the Arduino-Makefile people to integrate it all into that style of development environment – it is really beneficial for developers to be able to use standard tools.

I really appreciate what you have done to make WiFi connectivity as easy to use as an Arduino! It is a big step – keep it up!
