Compiling & using external libraries with Spark Core

This question has probably been answered somewhere on this forum already. I want to import an external Git project/library into the Spark IDE. Is this possible? And can I make that library re-usable after compiling it in the Spark IDE?
In short, my question is: how can I use external C source projects (or build my own external libraries) with the Spark Core? Can I do this in the Spark IDE or from the command line?

Thanks.

@gdillen, take a look at the documentation here:

http://docs.spark.io/build/#flash-apps-with-spark-build-using-libraries
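That page covers the Web IDE flow: open the Libraries drawer, pick a library (or contribute your own from a public GitHub repo), and click "Include in App", which adds an #include line to your sketch. Here's a minimal sketch of what the resulting app looks like, assuming a hypothetical library called mylib (the class and method names below are made up for illustration):

```cpp
// The Web IDE inserts an include of this form when you add a library:
#include "mylib/mylib.h"

MyLib sensor;           // hypothetical class exported by the library

void setup() {
    Serial.begin(9600);
    sensor.begin();     // hypothetical init call
}

void loop() {
    Serial.println(sensor.read());  // hypothetical read call
    delay(1000);
}
```

To make your own library re-usable by others in the IDE, the docs describe a repo layout with a spark.json metadata file and a firmware/ folder holding the .h/.cpp sources (plus firmware/examples/ for example apps). From the command line, the Spark CLI can cloud-compile a whole local directory (e.g. spark compile my_project_dir) and flash the result with spark flash, so you can also keep external sources in your own project tree.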