Script to ease building projects locally

Hi

I have recently started a couple of projects based on the great Spark Core board, and although the online IDE and the cloud building functionality are great, they are also very slow. If you are like me and want to code in small iterations and build frequently, it is a bit frustrating…

So I moved to coding and building locally.

The problem here was that working on two projects in parallel meant having two copies of the core-firmware repo, and if you use a version control system for your code (Git, SVN,…) you have two main options: forking and having big repos, or ignoring a bunch of files (but keeping the whole firmware code, presumably outdated, in your working copy).

So I decided to go the other way round: projects with just the minimum required files, and a way to transparently “merge” and “unmerge” a single core-firmware folder with my project. I have created a small bash script that does exactly that and lets you build and upload the code from the command line without leaving your project folder.
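Not the actual spark-util script, but the idea can be sketched roughly like this, assuming a shared core-firmware checkout pointed at by a `CORE_FIRMWARE` variable and project sources in the current directory (all names here are illustrative):

```shell
#!/bin/bash
# Sketch of the "merge"/"unmerge" idea: symlink project sources into a
# single shared core-firmware checkout, build there, then clean up.

merge() {
    # Link every project source file into the firmware's src folder so
    # one core-firmware checkout can build any project.
    local fw="${CORE_FIRMWARE:-$HOME/core-firmware}"
    local f
    for f in ./*.cpp ./*.h; do
        [ -e "$f" ] && ln -sf "$PWD/${f#./}" "$fw/src/${f#./}"
    done
    return 0
}

unmerge() {
    # Remove only the symlinks we created, leaving the firmware tree clean.
    local fw="${CORE_FIRMWARE:-$HOME/core-firmware}"
    local f target
    for f in ./*.cpp ./*.h; do
        target="$fw/src/${f#./}"
        [ -L "$target" ] && rm "$target"
    done
    return 0
}
```

With something along these lines, `merge && make -C "$CORE_FIRMWARE/build" && unmerge` builds the current project against a single, always up-to-date firmware tree.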

I don’t know if I am reinventing the wheel, but “my” wheel is saving me a lot of time. I can manage my build from within my favourite IDE (vim) while keeping the main core-firmware folder clean and up to date.

You can check out the code at https://bitbucket.org/xoseperez/spark-util. There is a simple README file with documentation.

Any comments are very welcome.
Xose


This sounds exactly like what I want to do… I don’t know Arduino well enough to write long verses in it… I’m currently a write-little, flash-often type of guy…

Nice! Sounds like this is really helpful when doing a full build against firmware. Thanks for sharing!

My goal is to release a simple command line tool tomorrow that will include a ‘cloud compile’ option. It uses the same build system as the build site, so it should be very convenient, but it’s not a substitute for a full local build, where you can modify everything. :smile:

@Dave it’s Friday ))) time to publish :smile: I can not wait :smiley:

Today is the day! There are some API and build farm improvements that go with this release as well, so I need to make sure those are in place and happy before deploying. Today though!

Thanks,
David


OK. So, the command line build and flash functionality requires some changes I made to the API and build service. The CLI tool is ready to be released, but I haven’t rolled out the cloud changes yet because we try to be careful when rolling out updates to production. If you wanted to play with the CLI (which is still really rough), you can grab it with:

npm install -g spark-cli

The repo is here: https://github.com/spark/spark-cli

Really though, spark cloud flash and spark cloud compile won’t work until I roll those changes out to production tomorrow. I’m hoping to have that out by around noon CST tomorrow (02/22/2014). :slight_smile:

Documentation and more features to come; let’s keep calling this a beta for now. Pull requests welcome.

Beta CLI Functionality that is working now:

spark cloud - login, logout, list, claim, name, remove
spark serial - list, monitor, wifi, identify
spark keys - new, load, save, send, doctor
spark variable - list, get, monitor
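To show how those sub-commands fit together, a quick beta session might look something like this (illustrative only; exact output and behaviour may differ while the CLI is still rough):

```shell
# Illustrative beta workflow using the sub-commands listed above:
spark cloud login       # authenticate against the Spark cloud
spark cloud list        # see which cores are claimed to your account
spark serial monitor    # watch serial output from a locally connected core
spark variable list     # list the cloud variables your cores expose
```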

Thanks!
David


A bit off-topic, but what is the list of changes you will be rolling out? Only API adjustments?

Hey Guys,

So I’ve been testing the cloud CLI changes on staging this morning. I’m still finding some issues, and I don’t think some of the recent cloud changes are ready to roll out to production. I’m bummed, but I’m glad testing caught these before rolling out.

I’ll have more time to fix issues and monitor an upgrade on Sunday. Externally, these upgrades really only affect the cloud compile components of the command line tool right now.

What’s changed? After this sprint, the API has new endpoints for more easily compiling, getting binaries, and flashing cores directly. I’ve also made some endpoints more consistent in their error and status codes. We also upgraded the internal messaging system version, the host operating system, and other packages. The compile service will no longer pre-process library files, and there are lots of other small bug fixes and improvements.

Thanks!
David

Okay! Cloud upgrades tested and deployed! You should now be able to use the CLI for building and flashing code to your cores. :slight_smile: Make sure you use the “.ino” file extension instead of “.cpp” if you want the file to be pre-processed. I still have lots of documentation and features to add, but feedback and CLI pull requests are definitely welcome.

Thanks!
David

Hi @Dave

I think the new CLI is great, but I have a feature request:

What if spark cloud flash and spark cloud compile took another argument specifying the branch to compile against, like compile-server2 or master?

That would let folks try out stuff without having to do a local build.
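For example, something along these lines (purely hypothetical syntax, not a flag that exists in the current CLI; the core ID and file names are placeholders):

```shell
# Hypothetical invocations -- the --branch flag is a suggestion, not real:
spark cloud compile my_project --branch master
spark cloud flash 0123456789abcdef --branch compile-server2
```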

@bko I mentioned this to @dave but it requires things like changing the public/private keys on the core, adding our core id to the server etc.

You can read more at: https://github.com/spark/spark-cli/issues/4

Meanwhile, let’s see what the team decides :slight_smile:

Hmm, it sounds like @bko is suggesting a branch parameter for the remote build, which is slightly different from that issue on the repo (switching your core to hit the experimental / testing staging servers).

The “please use this branch” argument is interesting, but would require some modifications to the build farm. The build farm would need to have a worker that could switch between branches instead of strictly using compile-server2. I would want to do some research; I think the biggest cost would be dramatically increased build times when switching branches, since you’d need to do a clean build of the other repos. But if it was a dedicated ‘alternate branch’ worker, then maybe you would know that longer waits were expected.

Hi @Dave

You’ve got the idea! I have to say that I thought of this while I was waiting for git to pull down the latest master branch so I could do a local build, and it was being very slow.

I am not sure how much use this feature would get, but it seems like the target users are people who want to try out the latest master (“experimental”) branch to see if some bug fix solves their problem, but don’t want to build locally.

Just a thought.


I think you’re right in that it would be really helpful when playing with experimental builds, I added it to the build farm backlog for the time being.

Thanks!
David