Sprint 6: Maker Shed, Command Line Interface, a getting started video, and an open source workflow

Written up in our blog:

http://blog.spark.io/2014/02/24/sprint-six/

But in case you’d rather read it here:


We’re all pretty big nerds at Spark HQ, so it was pretty cool to see this in my inbox on Thursday.

Spark Core on Maker Shed!

Maker Media is kind of a big deal in our little corner of the world, and they’ve been supporters of ours for a while. Behind the scenes we’ve been working with them for quite some time to get the Spark Core on their shelves.

We’re pleased to announce that the Spark Core and Spark Maker Kit are both available for purchase on Maker Shed!

We hear they’re selling fast, and they will likely be sold out in the very near future — so get yours now before that happens!

Getting started with Will

Will, the tallest member of the Spark team, put together an amazing ‘Getting Started’ video that walks through the set-up process of the Spark Core. This is the first of many video tutorials, as we’ve found that there’s no better way to learn than to follow along!

Spark Command Line Interface (CLI)

Speaking of big nerds, we love the command line. If you spend a lot of time writing code, you often find that graphics just get in the way of the pureness that is text.

This week, we’re officially launching the Spark Command Line Interface, or CLI. The CLI is still in active development, so expect significant changes over the next few weeks. In the meantime, the CLI is now the fastest and easiest way to get started with your Spark Core.

Before getting started with the CLI, you’ll need to have Node.js and npm installed. Go ahead, I’ll wait here.

Ready? Try this:

npm install -g spark-cli

Now you’ve got the Spark CLI installed. Next:

spark

will tell you all the commands available. Here are some of my favorites:

spark cloud login to log in to your Spark account so that you can interact with your Cores.

spark cloud list returns a list of the Cores you own, and displays information about their status. statuses. statii?

spark cloud flash 0123456789ABCDEFGHI core-firmware.bin will flash your Core with a binary file of your choosing. Or you can flash my_application.ino to send a single Arduino file, or /projects/big_app/src to send an entire directory.

spark variable get all temperature returns the temperature variable from all available Cores.

spark serial wifi will help you connect your Core to your Wi-Fi network.
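Put together, a typical first session might look something like the sketch below. This is illustrative only: the device ID is the same placeholder used above, my_application.ino is a hypothetical file name, and exact prompts and output may differ between CLI versions.

```shell
# Log in once so later commands can talk to your Cores
spark cloud login

# See which Cores you own and whether they're online
spark cloud list

# Flash a Core (placeholder device ID) with a single Arduino file...
spark cloud flash 0123456789ABCDEFGHI my_application.ino

# ...or with an entire project directory
spark cloud flash 0123456789ABCDEFGHI /projects/big_app/src
```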

There’s lots more where that came from; for a full list of available commands, simply type spark in your terminal, or visit the Github repository.

And speaking of Github…

Our open source has a new home

As the volume of our open source content grows, it’s become a little unwieldy. Now if you’d like to browse our open source repositories, visit spark.github.io to see an organized view of the open source tech stack for connected devices that we’ve been publishing over the last few months.

Using the Github tools we’ve made available, you can star a repository to follow it for changes, create an issue to request a feature or share a bug, or track our workflow on waffle.io.

Starting with the core-firmware repository, we’re sharing our backlog with the community to get your feedback about our priorities and to get your help and input wherever you’re willing to provide it. Workflows for other repositories will go live over the next couple of weeks.

Other stuff

There were a few other things: improvements to the documentation, a web IDE change so that .cpp files are no longer sent through the Arduino pre-processor, API improvements, and bug fixes. But who can keep track?

Enjoy the improvements, and if you have any feedback, please share it here!



That Make pic looks awesome; they have the USB cable the correct way up now!

Spark CLI looks rad as hell! I want :smile:

I went to install and was getting errors about Python being too new and shiny, so I had to downgrade it from 3.3.2 to 2.7.6.

After that, node-gyp runs now… but getting some new errors. Here’s the whole log:

c:\Spark\CLI>npm install -g spark-cli
npm http GET https://registry.npmjs.org/spark-cli
npm http 304 https://registry.npmjs.org/spark-cli
npm http GET https://registry.npmjs.org/ursa
npm http GET https://registry.npmjs.org/request
npm http GET https://registry.npmjs.org/moment
npm http GET https://registry.npmjs.org/when
npm http GET https://registry.npmjs.org/serialport
npm http GET https://registry.npmjs.org/xtend
npm http 304 https://registry.npmjs.org/ursa
npm http 304 https://registry.npmjs.org/request
npm http 304 https://registry.npmjs.org/when
npm http 304 https://registry.npmjs.org/moment
npm http 304 https://registry.npmjs.org/xtend
npm http 304 https://registry.npmjs.org/serialport
npm http GET https://registry.npmjs.org/object-keys
npm http 304 https://registry.npmjs.org/object-keys
npm http GET https://registry.npmjs.org/qs
npm http GET https://registry.npmjs.org/json-stringify-safe
npm http GET https://registry.npmjs.org/forever-agent
npm http GET https://registry.npmjs.org/node-uuid
npm http GET https://registry.npmjs.org/mime
npm http GET https://registry.npmjs.org/tough-cookie
npm http GET https://registry.npmjs.org/form-data
npm http GET https://registry.npmjs.org/tunnel-agent
npm http GET https://registry.npmjs.org/http-signature
npm http GET https://registry.npmjs.org/oauth-sign
npm http GET https://registry.npmjs.org/aws-sign2
npm http GET https://registry.npmjs.org/hawk

> ursa@0.8.0 install C:\Users\ME\AppData\Roaming\npm\node_modules\spark-cli\node_modules\ursa
> node-gyp configure build && node install.js


C:\Users\ME\AppData\Roaming\npm\node_modules\spark-cli\node_modules\ursa>node "C:\Program Files\nodejs\node_modules\npm\bin\node-gyp-bin\\..\..\node_modules\node-gyp\bin\node-gyp.js" configure build
  asprintf.cc
  ursaNative.cc
c:\users\ME\appdata\roaming\npm\node_modules\spark-cli\node_modules\ursa\src\ursaNative.h(6): warning C4005: 'BUILDING_NODE_EXTENSION' : macro redefinition [C:\Users\ME\AppData\Roaming\npm\node_modules\spark-cli\node_modules\ursa\build\
ursaNative.vcxproj]
          command-line arguments :  see previous definition of 'BUILDING_NODE_EXTENSION'
c:\users\ME\appdata\roaming\npm\node_modules\spark-cli\node_modules\ursa\src\ursaNative.h(10): fatal error C1083: Cannot open include file: 'openssl/rsa.h': No such file or directory [C:\Users\ME\AppData\Roaming\npm\node_modules\spark-c
li\node_modules\ursa\build\ursaNative.vcxproj]
gyp ERR! build error
gyp ERR! stack Error: `c:\Windows\Microsoft.NET\Framework\v4.0.30319\msbuild.exe` failed with exit code: 1
gyp ERR! stack     at ChildProcess.onExit (C:\Program Files\nodejs\node_modules\npm\node_modules\node-gyp\lib\build.js:267:23)
gyp ERR! stack     at ChildProcess.EventEmitter.emit (events.js:98:17)
gyp ERR! stack     at Process.ChildProcess._handle.onexit (child_process.js:789:12)
gyp ERR! System Windows_NT 6.1.7601
gyp ERR! command "node" "C:\\Program Files\\nodejs\\node_modules\\npm\\node_modules\\node-gyp\\bin\\node-gyp.js" "configure" "build"
gyp ERR! cwd C:\Users\ME\AppData\Roaming\npm\node_modules\spark-cli\node_modules\ursa
gyp ERR! node -v v0.10.24
gyp ERR! node-gyp -v v0.12.1
gyp ERR! not ok
npm http 304 https://registry.npmjs.org/node-uuid
npm http 304 https://registry.npmjs.org/mime
npm http 304 https://registry.npmjs.org/json-stringify-safe
npm http 304 https://registry.npmjs.org/forever-agent
npm http 304 https://registry.npmjs.org/qs
npm http 304 https://registry.npmjs.org/tough-cookie
npm http 304 https://registry.npmjs.org/form-data
npm http 304 https://registry.npmjs.org/tunnel-agent
npm http 304 https://registry.npmjs.org/http-signature
npm http 304 https://registry.npmjs.org/oauth-sign
npm http 304 https://registry.npmjs.org/aws-sign2
npm http 304 https://registry.npmjs.org/hawk
npm http GET https://registry.npmjs.org/combined-stream
npm http GET https://registry.npmjs.org/async
npm http GET https://registry.npmjs.org/assert-plus/0.1.2
npm http GET https://registry.npmjs.org/asn1/0.1.11
npm http GET https://registry.npmjs.org/ctype/0.5.2
npm http 304 https://registry.npmjs.org/combined-stream
npm http 304 https://registry.npmjs.org/async
npm http 304 https://registry.npmjs.org/asn1/0.1.11
npm http 304 https://registry.npmjs.org/assert-plus/0.1.2
npm http 304 https://registry.npmjs.org/ctype/0.5.2
npm http GET https://registry.npmjs.org/delayed-stream/0.0.5
npm http GET https://registry.npmjs.org/punycode
npm http GET https://registry.npmjs.org/boom
npm http 304 https://registry.npmjs.org/delayed-stream/0.0.5
npm http GET https://registry.npmjs.org/hoek
npm http GET https://registry.npmjs.org/sntp
npm http GET https://registry.npmjs.org/cryptiles
npm http 304 https://registry.npmjs.org/punycode
npm http 304 https://registry.npmjs.org/boom
npm http 304 https://registry.npmjs.org/cryptiles
npm http 304 https://registry.npmjs.org/sntp
npm http 304 https://registry.npmjs.org/hoek
npm http GET https://registry.npmjs.org/bindings/1.1.1
npm http GET https://registry.npmjs.org/async/0.1.18
npm http GET https://registry.npmjs.org/sf/0.1.6
npm http GET https://registry.npmjs.org/optimist
npm http GET https://registry.npmjs.org/nan
npm http 304 https://registry.npmjs.org/bindings/1.1.1
npm http 304 https://registry.npmjs.org/nan
npm http 304 https://registry.npmjs.org/async/0.1.18
npm http 304 https://registry.npmjs.org/sf/0.1.6
npm http 304 https://registry.npmjs.org/optimist
npm http GET https://registry.npmjs.org/wordwrap
npm http 304 https://registry.npmjs.org/wordwrap

> serialport@1.3.1 install C:\Users\ME\AppData\Roaming\npm\node_modules\spark-cli\node_modules\serialport
> node-gyp rebuild


C:\Users\ME\AppData\Roaming\npm\node_modules\spark-cli\node_modules\serialport>node "C:\Program Files\nodejs\node_modules\npm\bin\node-gyp-bin\\..\..\node_modules\node-gyp\bin\node-gyp.js" rebuild
  serialport.cpp
  serialport_win.cpp
  enumser.cpp
  disphelper.c
     Creating library C:\Users\ME\AppData\Roaming\npm\node_modules\spark-cli\node_modules\serialport\build\Release\serialport.lib and object C:\Users\ME\AppData\Roaming\npm\node_modules\spark-cli\node_modules\serialport\build\Release\se
  rialport.exp
  Generating code
  Finished generating code
  serialport.vcxproj -> C:\Users\ME\AppData\Roaming\npm\node_modules\spark-cli\node_modules\serialport\build\Release\\serialport.node
npm ERR! ursa@0.8.0 install: `node-gyp configure build && node install.js`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the ursa@0.8.0 install script.
npm ERR! This is most likely a problem with the ursa package,
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR!     node-gyp configure build && node install.js
npm ERR! You can get their info via:
npm ERR!     npm owner ls ursa
npm ERR! There is likely additional logging output above.

npm ERR! System Windows_NT 6.1.7601
npm ERR! command "C:\\Program Files\\nodejs\\\\node.exe" "C:\\Program Files\\nodejs\\node_modules\\npm\\bin\\npm-cli.js" "install" "-g" "spark-cli"
npm ERR! cwd c:\Spark\CLI
npm ERR! node -v v0.10.24
npm ERR! npm -v 1.3.21
npm ERR! code ELIFECYCLE
npm ERR!
npm ERR! Additional logging details can be found in:
npm ERR!     c:\Spark\CLI\npm-debug.log
npm ERR! not ok code 0

c:\Spark\CLI>

I even updated node to v0.10.26 and I’m getting the same error. Google search is not popping up the answer super fast like usual :wink:

FYI: npm has come bundled with Node.js for a while now, so you don’t need to download it separately…

EDIT: Got it figured out over here: https://community.spark.io/t/tutorial-spark-cli-on-windows/3112

Will!! haha, nice… I keep thinking you are going to bust out singing. Great looking video! Also caught a little bug in the video editing… the Blink an LED example flashes output D0, but as shown on the video the on-board D7 LED is flashing.

Cool! Does this mean we can do over-the-air updates from a local build environment (Eclipse)?


Yeah! That’s exactly one of the cool features the command line tool was built for!
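Concretely, assuming your local build (Eclipse, make, etc.) produces a core-firmware.bin, an over-the-air update would be a one-liner along these lines. The device ID and build path here are placeholders, not real values:

```shell
# Flash a locally built binary over the air to a Core (placeholder ID and path)
spark cloud flash 0123456789ABCDEFGHI build/core-firmware.bin
```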


Hey @zach: This Sprint brings a lot of great stuff! Congrats to all!

I did want to bring to your attention that your shiny new cert does not cover your spark.github.io domain and so I get:

Thanks again!

Use http://spark.github.com. The browser probably redirected from another https site and kept the original https prefix. That site is hosted on github.com (see http://pages.github.com/), so it’s github.com’s fault! If you click on Technical Details, you will see the certificate is only valid for the names *.github.com and github.com. So GitHub would have to buy a certificate for *.github.io! :slight_smile: Those certificates are not cheap at all.

@bko thanks for pointing that out, how did you come across that bug? did you follow a link from somewhere?

Oops think I found it; was the link in my own post :slight_smile: Fixed, I think


I was wondering who would be the first to notice :smile:

We modified the "blink-an-led.ino" example so it should still work without modification. We added the D7 LED flash along with the D0 flash, which has the added benefit of helping people debug backwards LEDs in that example.


Well, am I missing something, or are they getting the Spark Core before those of us who pre-ordered it in your store a long time ago?

@simoher Maker Shed purchased their Cores from us quite a while ago, before current pre-orderers. Plus, we don’t charge credit cards until Cores are shipping, so if anyone saw the Maker Shed sale and wanted to cancel their order and place it with Maker Shed instead, they definitely can!


So here’s the idiot again…

I have an Ubuntu 13.10 PC running. I updated everything and installed Node.js and npm with apt-get install nodejs.
I had some issues installing the NPM GIT, so I went with the new version of Node.js by adding the repository. After that, still a no-go…

Is there someone who can post a tutorial on how to get this npm spark running on Ubuntu 13.10, starting from scratch?

@korneel Don't think you need NPM GIT?

If Node.js is running, then it’s a one-liner:

npm install -g spark-cli

Sorry, I meant installing the Spark CLI using npm install -g spark-cli.

When I did that, it complained about an old version of Node.js, so I added the repository.
Anyway, so I then did this in my terminal:

sudo apt-get install python-software-properties
sudo apt-add-repository ppa:chris-lea/node.js
sudo apt-get update
sudo apt-get upgrade
sudo apt-get install nodejs

Now if I do node -v it says v0.10.26.

Running npm install -g spark-cli tells me to run it as root.
Fine, so I run sudo npm install -g spark-cli

and of course now it works.

OK, no clue why it did not work before… I just redid everything after a sudo apt-get remove nodejs just to show you the exact error… this sucks…
Well, it doesn’t, since it works now. Hey ho, happy times.
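For anyone else starting from scratch on Ubuntu 13.10, the sequence that ended up working above boils down to the following sketch. It assumes sudo access, and that the Chris Lea PPA provides a newer Node.js (with npm bundled) than the stock Ubuntu package:

```shell
# Add the PPA so apt can fetch a recent Node.js (npm comes bundled with it)
sudo apt-get install python-software-properties
sudo apt-add-repository ppa:chris-lea/node.js
sudo apt-get update
sudo apt-get install nodejs

# Verify the version, then install the Spark CLI globally (needs root)
node -v
sudo npm install -g spark-cli
```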


Can you see if serial works? I sent in a fix but @dave has not merged it yet.

Thanks!

Does spark-cli work on Mac yet?
I tried installing it and got these errors:

2295 info preinstall cryptiles@0.2.2
2296 verbose readDependencies using package.json deps
2297 error Error: ENOENT, chmod '/usr/local/lib/node_modules/spark-cli/node_modules/request/node_modules/hawk/node_modules/sntp/lib/index.js'
2298 error If you need help, you may report this entire log,
2298 error including the npm and node versions, at:
2298 error http://github.com/npm/npm/issues
2299 error System Darwin 13.1.0
2300 error command "node" "/usr/local/bin/npm" "install" "-g" "spark-cli"
2301 error cwd /Users/jacob/Documents/sparkcore
2302 error node -v v0.10.26
2303 error npm -v 1.4.3
2304 error path /usr/local/lib/node_modules/spark-cli/node_modules/request/node_modules/hawk/node_modules/sntp/lib/index.js
2305 error fstream_path /usr/local/lib/node_modules/spark-cli/node_modules/request/node_modules/hawk/node_modules/sntp/lib/index.js
2306 error fstream_type File
2307 error fstream_class FileWriter
2308 error fstream_finish_call chmod
2309 error code ENOENT
2310 error errno 34
2311 error fstream_stack /usr/local/lib/node_modules/npm/node_modules/fstream/lib/writer.js:305:19
2311 error fstream_stack Object.oncomplete (fs.js:107:15)
2312 verbose exit [ 34, true ]

@jfenwick I think it does, ’cos @zach is on a Mac!

Have you installed node.js yet?

Yes. This is an error from running the node command.

What command did you issue?

Once Node.js is installed correctly, run npm install -g spark-cli on the command line.