I'm using it to get started with the Core; I had issues with the web IDE. The Spark CLI is fairly intuitive… but that's coming from a Linux user, so take it with a grain of salt, I guess.
On the one hand, I can't really say "relatively actively" right now, but that's because I just haven't had enough time to play with my Core recently, in general. But I have played with the CLI, and I can think of lots of ways I could use it (and probably will).
I use it regularly. I have a high-latency internet connection, so I compile in the cloud and flash locally. The other services are useful for diagnostics or analysis.
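For anyone curious, the round trip looks roughly like this (a sketch based on the spark-cli syntax I know of; the downloaded file name and exact flags may differ on your version):

```
# Cloud compile, then flash locally over USB instead of OTA.
# Flags are approximate for the spark-cli of this era; check `spark help`.
spark cloud compile .            # upload the project, get a .bin back from the build farm
spark flash --usb firmware.bin   # flash the downloaded .bin (Core must be in DFU mode)
```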
I haven't been using it: initial issues with serial debugging turned me away temporarily, and I just haven't gotten back to trying it because I haven't heard of any updates to it… but for the most part I develop locally because that's way faster than OTA updates. For quick compile tests I like the web IDE (the Sparkulator) because it's visual and I have all of my Cores sitting pretty in there, ready to select.
If we could develop a Sublime Text package that incorporates the Spark CLI functionality but compiles and programs locally, I think that would be the Killer App.
Hmm, oooh… just had a great idea. Setting up toolchains is a pain in the you-know-what… not many people want to do it. MOST people just want to download a TOOL and go to town with it. So, somehow (programming elves)… make a local GUI version of the web IDE, using something like a Sublime Text package, or Processing like Arduino did, or even Visual C++, and utilize the Spark Cloud to compile your source. But instead of flashing your Core OTA, download your BIN from the cloud and program it locally using dfu-util in the background! It would be super fast and a lightweight install for all users. cc: @Dave
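The back half of that is already scriptable today, by the way. A rough sketch (the USB ID 1d50:607f and the 0x08005000 user-firmware offset are the commonly documented Core values, but double-check them for your setup):

```
# Proposed flow: compile in the cloud, then program locally with dfu-util.
spark cloud compile .      # cloud build; saves a .bin next to your sources
# Put the Core in DFU mode (flashing yellow), then:
dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D firmware.bin
```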
Please don’t tell me this already exists, because I’d feel kinda out of the loop on the cool stuff.
Ctrl+Shift+P, type "flash" to run spark flash firmware core-firmware.bin and flash via USB. You could also add a variant to flash via the cloud pretty easily.
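(For reference, the cloud variant would just swap in a command along these lines; hedged, since exact spark-cli syntax varies by version, and my_core_name is a placeholder for your device's name.)

```
# Cloud variant: push the same .bin OTA to a named Core instead of over USB.
spark flash my_core_name core-firmware.bin
```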
Yeah, but… it's not the same as this: one-click FLASH that uploads to the cloud (unicorns inhale), compiles, downloads the bin, and programs locally via dfu-util (unicorns exhale)… boom!
That's assuming your code compiles flawlessly and some magical hands put the Core into DFU mode.
But still, I love the idea. A one-click keys repair has always been on my list, at least for Windows users. As a beginner, you need too much stuff installed just to repair keys!
For those who are following this post, here's a screenshot to show how useful the Spark-CLI is to me.
I literally fire it up the moment my Core starts breathing.
I'm hoping to write up a simple tutorial to spread awareness of the capabilities of the Spark-CLI while we start working on better docs to serve the world.
For now, WINDOWS users :D, you can follow my tutorial to install the Spark-CLI and take over the world.
I tried this and I'm not figuring it out quite as easily as I thought I would... so I'm adding it to my backlog and will revisit it in a couple weeks! I think it worked, but there's more to it that I need to set up: http://i.imgur.com/j1pF5sU.png
Definitely want to get this hooked up so that I can edit a file called application_FacebookLikes1.cpp and press CTRL+B (which is also F7, btw) to perform the following steps:
Make a copy of the file to c:\spark\core-firmware\src\application.cpp
Run my build batch file, or just invoke 'make'
This way I'm not blowing away my current project (since it's named application.cpp) when I do a git pull and it pulls down the Tinker app again. Sometimes I forget to save it back under its unique name before I do that pull.
F8 will be a newly assigned key dedicated to DFU.
CTRL+SHIFT+B will be make clean, and if I can mirror this as SHIFT+F7, I will (something like the sketch below).
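Something like the wrapper script below could sit behind those three keybindings. It is only a sketch: the path and the application_FacebookLikes1.cpp naming come from the post above, while the build directory, the script name, and the DFU ID/offset are assumptions.

```
#!/bin/sh
# Sketch: one script behind CTRL+B (build), CTRL+SHIFT+B (clean) and F8 (dfu).
# Usage: spark-build.sh build|clean|dfu [path/to/application_FacebookLikes1.cpp]
FIRMWARE="c:/spark/core-firmware"              # c:\spark\core-firmware from the post
case "$1" in
  build)
    cp "$2" "$FIRMWARE/src/application.cpp"    # copy the uniquely named file over application.cpp
    make -C "$FIRMWARE/build"                  # or call your build batch file here instead
    ;;
  clean)
    make -C "$FIRMWARE/build" clean
    ;;
  dfu)
    # Core must already be in DFU mode; ID/offset are the usual Core values.
    dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D "$FIRMWARE/build/core-firmware.bin"
    ;;
esac
```

In Sublime, each of these would just be a build-system variant bound to its own key.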
I use the Spark-CLI for almost everything. I keep my source files locally under old-fashioned RCS source control and use a Makefile to submit them to the Spark Cloud for compilation. I don't generate the binary locally; it is loaded onto the Spark Cores via the cloud.
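(Not this poster's actual Makefile, of course, but the targets would boil down to spark-cli calls along these lines; my_core and the file list are placeholders.)

```
# The sort of commands such a Makefile would wrap (spark-cli syntax of the era, hedged).
spark cloud compile application.cpp mylib.cpp mylib.h        # compile the listed sources in the cloud
spark cloud flash my_core application.cpp mylib.cpp mylib.h  # or compile and flash OTA in one step
```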