When to 'clone' or 'update' local copies of Spark firmware

For those of us who build locally, when is a good time to update our local copies of the source? Is there any notification process? I ask because, when I am debugging a problem, I consider doing a git ‘clone’, but I fear I will introduce some other problem by grabbing the source at a random time. Do you have any recommendations for when to update our version of the firmware?

Hi @mtnscott,

Good question! Although we’re frequently improving the core-firmware repos, you might not always want to use the absolute latest version on master, since things might still be in testing. Right now there are a few ways to answer the question “Is this commit stable?” For starters, we’re transitioning to a system of more frequent feature rollouts and updates, which we’ll post about on our blog (http://blog.spark.io/); we’ll talk about new code there. :slight_smile:

If you’re watching the repositories, there are some things to watch for. When enough good changes have built up and they’re stable, we will use git to “tag” a commit with something like “spark_n”. These versions are special, because they’re the versions that new cores running Tinker will be automatically upgraded to, and the versions that will be installed at the factory. They need to be really good, since they’ll be the “Factory Defaults” version. Here was the last one:
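If you want to build against one of those tagged versions locally, here’s a quick sketch (the tag name below is just an example; list the tags to see what actually exists):

# grab the tags from the remote and list them
git fetch --tags
git tag -l

# build against a tagged "factory defaults" release (tag name is an example)
git checkout -b factory_default spark_1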

We also keep a separate stable branch for code that’s built using the web IDE. This branch is updated more frequently when improvements are ready to start rolling out to everyone. Right now this branch has a not-very-pretty name, and it’ll probably be changed in the near future to something like “cloud-production”. We’re very close to pulling this branch forward again, but we want to be really confident before rolling out changes. You can check it out here:
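To peek at what’s on that branch locally without merging anything, something like this works (using the compile-server2 name that comes up below; it may get renamed):

# look at the most recent commits on the cloud-build branch
git fetch origin
git log --oneline -5 origin/compile-server2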

If you’re building locally and using git, you can stay current with these changes with a few easy commands. Assuming you’re working inside a direct clone of the repo, you can do something like:

# get everything
git fetch --all

# list all the branches (there are a lot right now)
git branch -a

# start coding on your own local branch
git checkout -b my_awesome_project

# pull in changes from a remote branch
git merge origin/compile-server2
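Before that last merge, you can also preview what would come in (branch names as above; this is just a convenience, not required):

# list the commits the merge would bring into your branch
git log --oneline my_awesome_project..origin/compile-server2

# and see which files they touch
git diff --stat my_awesome_project origin/compile-server2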

If you’re better at git than I am (pretty likely), and you’re working in your own repository, you can do fancier things like setting our repo as an upstream source and merging its changes into your repo:

git remote add upstream git@github.com:spark/core-firmware.git

git fetch origin -v
git fetch upstream -v
git merge upstream/compile-server2
# (note: I found that command here: http://gitready.com/intermediate/2009/02/12/easily-fetching-upstream-changes.html )
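If you want to sanity-check that both remotes are wired up before fetching, here’s a quick sketch (your fork’s URL will differ):

# list the configured remotes; you should see your fork as "origin"
# and spark/core-firmware as "upstream"
git remote -v

# once that's set up, this one-liner keeps your current branch in sync
git pull upstream compile-server2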

I hope that helps!
Thanks,
David

Thanks @Dave. I’m using Eclipse, and I just updated my local repository from the compile-server2 branch on the remote repo. I rebuilt everything, but now I can’t seem to get my app to run.

Here is how I build my firmware: I have a command-line script that copies my app from my workspace into core-firmware/src (overwriting the application.cpp file) and runs make; then I put the Spark into DFU mode and flash the binary using dfu-util. I get the breathing cyan, however my app is not running.
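For reference, the script is roughly this (paths are from my setup; the device ID and flash address are the usual ones for the Spark Core, but adjust for your layout):

#!/bin/bash
# copy my app over the stock application.cpp and rebuild
cp ~/workspace/my_app/application.cpp ~/code/core-firmware/src/application.cpp
cd ~/code/core-firmware/build
make

# with the core in DFU mode (flashing yellow), write the new binary
dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware.bin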

This is very strange; I’m not sure what happened. Did the architecture change on the compile server? Is there a different setup / loop? I also noticed that the compile-server branch now has a stub for the application.cpp file. It used to contain the Tinker app.

Hi @mtnscott, there is a small difference between building in the Cloud and building locally: because the local build doesn’t go through the Arduino pre-processor, you need to start your file with #include "application.h" and declare your function prototypes (see the default application.cpp for an example).
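If it helps, here’s a minimal sketch of what the top of the file needs, written as a shell heredoc so it drops straight into the command-line workflow described above (the path and the blinky example body are just placeholders):

# write a minimal application.cpp into the firmware tree (path is an example)
cat > ~/code/core-firmware/src/application.cpp <<'EOF'
#include "application.h"

// local builds skip the Arduino pre-processor, so declare prototypes yourself
void setup();
void loop();

void setup() {
    pinMode(D7, OUTPUT);    // D7 is the on-board LED on the Core
}

void loop() {
    digitalWrite(D7, HIGH);
    delay(500);
    digitalWrite(D7, LOW);
    delay(500);
}
EOF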

Hi @zach, thanks for the quick response. I just updated to the head on all trees, and the build works fine: my setup and loop functions are called, so I’ve confirmed that my command-line process works. Then, just so I knew I wasn’t crazy, I went back to the ‘compile-server2’ branch on all the repos; make completes, but the binary does not execute my setup / loop functions. I confirmed that I do have #include "application.h" at the top of my app. All I do is replace your application.cpp file with mine and run make, but for some reason, when I use the ‘compile-server2’ files, my setup / loop functions are not called.

maybe try make clean dependents all?

@zach on my system, “make dependents” in core-firmware/build generated an error: “make: *** No rule to make target 'dependents'. Stop.”

So I guess on my machine, make does not know how to build a target named “dependents”. :slight_smile: Anyway, I will just move to the current head and work from there, since that seems to work correctly. I’ll try flashing OTA and see if that makes any difference.
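(For anyone else who hits the missing “dependents” target, the fallback I’m using is just a plain clean rebuild; paths assumed from my setup above:)

# no "dependents" target in my Makefile, so clean and rebuild by hand
cd ~/code/core-firmware/build
make clean
make
# if the library repos were updated too, they likely need the same treatment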

UPDATE: I just uploaded my app to the WebApp, compiled, and flashed it OTA. It worked, but only after I reset the :spark: back to where you re-enter the WiFi credentials. I then tried to flash locally using the ‘compile-server2’ version and had no luck; it still won’t execute my setup and loop.

Maybe different flags are used on the compile server, or maybe the socket connection is already established when flashing OTA. Either way, flashing that version locally does not appear to work.