Spark CLI on Raspberry Pi?

Has anyone gotten the CLI installed and running on a Raspberry Pi?

I’ve never tried it, but it should work. The only thing is you’ll need to look up how to get Node.js installed on it :smiley:

Another 2nd for “it should work”. I tried to test it out, but all 3 of my RPis won’t boot for whatever reason. They’ve all been sitting dormant for a couple of months and worked fine a while ago. I haven’t logged into them or touched them since, and none of them work now!


I’m trying to get it going. If the CLI really needs this PPA from Chris Lea, then I have to wait for a Debian version. I have no idea what a PPA does, but it’s needed for the CLI on Ubuntu. I think Ubuntu and Debian are related, and a Debian version is coming.

A PPA (Personal Package Archive) is essentially just another apt repository hosted on Launchpad (I’m not 100% sure on all the details off the top of my head). Ubuntu is based on Debian, much like CentOS and Fedora are related to Red Hat. For the most part, you can treat Ubuntu almost exactly like Debian.
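Just for comparison (this is the Ubuntu-only route and won’t apply to Raspbian/Debian, so treat it purely as an illustration), adding the Chris Lea PPA on an Ubuntu machine would look something like this:

```
# Ubuntu only: register the Chris Lea PPA and install Node.js from it
sudo add-apt-repository ppa:chris-lea/node.js
sudo apt-get update
sudo apt-get install nodejs
```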

If you are referring to the PPA that provides Node.js, you can work around that and compile it from source instead. It takes a while on the Pi, but it’s not difficult to do. Here’s how to do it from the command line:

```
git clone https://github.com/joyent/node.git
cd node
git checkout v0.10.29-release
./configure
make
sudo make install
```

Those are from memory, but I think that should be about it.  Since it's not part of a regular `apt-get install`, it won't be automatically updated like other packages, but I don't find it too terrible to occasionally check the latest version on [NodeJS.org](http://www.nodejs.org), check out the version branch (in the form of `vX.YY.ZZ-release`), re-compile, and re-install.  I'm a bit of a nerd like that, though!  I love some fresh, pure Node.JS!
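From memory again, the occasional update pass looks roughly like this; the `vX.YY.ZZ-release` branch below is a placeholder, so substitute whatever nodejs.org lists as current:

```
# inside the existing clone from earlier
cd node
git fetch origin
git checkout vX.YY.ZZ-release   # substitute the current release branch
./configure
make
sudo make install
```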

@Dave, I have the Spark CLI installed on my RPi, but the question is: where do I find the `spark.config.json` file?

Heya @peekay123,

You can find the config file in your user’s `.spark` folder, so:

```
cat ~/.spark/spark.config.json
```
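The contents are plain JSON, roughly along these lines (the keys shown are my recollection of what the CLI stores and may differ by version; the token is obviously a placeholder):

```
{
  "access_token": "1234567890abcdef1234567890abcdef",
  "apiUrl": "https://api.spark.io"
}
```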

I opened an issue here: https://github.com/spark/spark-cli/issues/71

I’m also planning on adding some CLI support for easily switching API servers. :slight_smile:

Thanks,
David


@Dave, no such file!!! Well, no such file until I did a login to the Spark Cloud, THEN it created the file. :smile:
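For anyone else who hits this: the file only appears after your first login, so something like this does the trick (if your CLI version uses a different command, `spark help` will list it):

```
# logging in creates ~/.spark/spark.config.json
spark cloud login
cat ~/.spark/spark.config.json
```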


Hey, just a heads up that I’ve added config switching support to the Spark CLI!

Make sure your copy is up to date with `npm update -g spark-cli`

Then you can switch around with:

```
spark config local apiUrl http://your-ip-address:8080

# switch to local
spark config local

# switch back
spark config spark
```
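If you want to double-check what got written, the profiles live as JSON files in `~/.spark` next to the default config (I’m writing the file name below from memory, so it may differ slightly):

```
ls ~/.spark
cat ~/.spark/local.config.json
```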

Thanks!
David


Figured I’d throw this here in case anyone else has this problem: after getting the cloud working on my Pi, I had some trouble getting it going as an upstart service. I finally got this conf script to work:

```
#!upstart
description "Spark Core Server"
start on started mountall
stop on shutdown
respawn
respawn limit 99 5
chdir /spark/spark-server/js
exec node main.js >> /var/log/sparkserver.log 2>&1
```

The problem was that I was trying to do `exec node /spark/spark-server/js/main.js` without the `chdir`, and it rejected each connection with handshake problems.
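Assuming you save the script above as /etc/init/sparkserver.conf (the name is just what I picked), the standard upstart commands handle installing, starting, and checking on it:

```
# install the job and start it
sudo cp sparkserver.conf /etc/init/sparkserver.conf
sudo start sparkserver

# check it's running and watch the log
sudo status sparkserver
tail -f /var/log/sparkserver.log
```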


Hi @snapdan,

Thank you for posting your upstart script! Any chance you would want to add that script to the spark-server repo? ( https://github.com/spark/spark-server/pulls ). The only caveat is signing the CLA to help keep the project open source ( https://docs.google.com/a/spark.io/forms/d/1_2P-vRKGUFg5bmpcKLHO_qNZWGi5HKYnfrrkd-sbZoA/viewform ). :slight_smile:

Thanks!
David