Local Cloud on Raspberry Pi

Hello community!

I have started documenting the steps I have taken to set up a Raspberry Pi with the Local :cloud: from scratch!

I'm not really an experienced Linux user, so if you have any suggestions on what should be added for configuration, let me know!

The public repo is here: https://github.com/kennethlimcp/book-rpi-lc

The tutorial will appear here: http://kennethlimcp.gitbooks.io/spark-local-cloud-on-raspberry-pi/content/

Thanks and have a great weekend! :wink:


I’m almost done with the next push!

Just short of one Node module that is not installing properly!

@BDub, how can I run the server in a “separate” terminal? It will spill out console logs, but I still want to have my command line for other usage :wink:

Just open another command prompt? I’m not exactly sure what you are doing… is this on Windows or the RasPi? I have a RasPi in a drawer somewhere… waiting to do something cool with it :wink:

1 Like

I’m using the SSH client on Mac OS to log in to the RPi.

I figured out that I can do that, but it would be cooler to just type a command and launch a new terminal :smiley:

Are you going to dig it out?!

I can push another version first which covers all the installation.

Working on the last part for spark-server installation now :wink:

1 Like

I would love to but I’m a bit tied up with other Spark things at the moment. I’m currently looking into this OTA issue.


The tutorial is now live :smiley:

Have fun!


WOW! Just in time for this? No spec upgrades… just more I/Os, etc. :wink:

I’m excited that stuff like HDMI and video out has been removed, which makes it more like commodity server hardware :smiley:


@kennethlimcp, You can use screen (sudo apt-get install screen) to launch a separate terminal process on the Raspberry Pi to let something run in the background. There are a ton of options for it, but the basics are:

  1. On a command line, type screen and press enter.
  2. Press spacebar or enter to continue past the warning/disclaimer screen
  3. You should now be presented with an empty terminal, which now means you are in the new terminal process. Type whatever command you want to have running in the background.
  4. Press ctrl+a then d to exit the screen while leaving it running in the background.
  5. Type screen -r to resume that screen session.
  6. Press ctrl+d to log out of the screen session when you’re done.

It is especially useful for the Raspberry Pi since it’s really, really slow. I’m using it now to compile Node.js on a Raspberry Pi (to set up the local cloud and Phant). Hopefully I can get another port in our data center, put the RPi on it, and have a public IP (instead of this LAN/firewall nonsense).


YOU ROCK @wgbartley :slight_smile:

Also, I’m on the list for the HummingBoard (solid-run.org) and will receive one the moment they ship! You might want to consider that in the future since it’s more scalable and runs vanilla Linux.

Will do up a tutorial for that the moment I have it.

I have an Odroid U3 at home that I use for my 3D printer. The RPi couldn’t keep up and would cause prints to lock up for several seconds at a time. I want to get my hands on as many computer-on-a-board products as I can (CuBox, Parallella, Udoo, etc) and put them through their paces. I’d also like to have the extra cash to spend on them as well!

1 Like

Hi guys!

Awesome stuff! Just installed it on a local Pi, and was able to start the server.

One question, though: how can I get the Spark Cores to connect to the local server instead of the online one?

I read that the dynamic DNS tutorial is a work in progress, but could somebody help me please?

Kind regards,

1 Like

Hey @Noten,

You can follow the guide here: https://community.spark.io/t/tutorial-local-cloud-1st-time-instructions/5589 :slight_smile:


hi @kennethlimcp

Thanks for the link!

I did all the steps, but the core is just blinking white rapidly, and it is not appearing in the terminal window.

At this step:

7.) Go to the core_keys directory to place the core public key inside:

    cd core_keys
    (put the core in DFU mode)
    spark keys save INPUT_CORE_ID_HERE

NOTE: make sure you use the CORE ID when saving the keys!

Reset the core manually by hitting the RST button.

It was a bit confusing for me because the core_keys folder is on the server, but the spark keys save… command was run on my local machine. It saved a file, and I manually copied the generated file to the server’s core_keys folder. Maybe that should be done another way; I’m not sure, but at the moment it is not working :frowning:

If anyone has a solution, please let me know; it would be ace to run the cores from a local environment.

Thank you, Noten

I’m sorry :smiley:

The instructions were written when I ran the Local :cloud: on my laptop, which explains why.

You can save the core key and copy it to the directory on the RPi.

Which platform are you on for your laptop?

hey @kennethlimcp !

Using a mac.

Heya @kennethlimcp,

The command spark keys save INPUT_CORE_ID_HERE won’t work unless you convert that DER key into a public PEM-format key.
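For reference, openssl can do that conversion: it extracts the public half of a DER-encoded RSA private key and writes it out as PEM, which is the format the server expects (the filename your_core_id.der is just a placeholder):

```shell
# extract the public key from a DER-encoded RSA private key
# and write it out in PEM format for the server's core_keys directory
openssl rsa -in your_core_id.der -inform DER -pubout -out your_core_id.pub.pem
```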


hi @Dave!

Do you have a command for this on Mac? I could test it immediately.

Hi @Noten,

Sure! The easiest way is to load new keys onto your core, so try:

# point the core at the local cloud's public server key
spark keys server /path_to_server/spark-server/js/default_key.pub.pem mine
# generate a fresh keypair for the core
spark keys new your_core_id
# load the new private key onto the core (core in DFU mode)
spark keys load your_core_id.der
# copy the core's public key into the server's core_keys directory
cp your_core_id.pub.pem /path_to_server/spark-server/js/core_keys


@Dave, which are you referring to?

I use spark keys save core_id to grab my core keys and dump them in the core_keys dir.

Hi @kennethlimcp,

spark keys save will copy the raw private key, in DER format and the wrong size, off the core, while the server needs the public core key in PEM format, no?
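A quick way to see which format a key file is actually in is to have openssl try to parse it; these are illustrative commands with placeholder filenames:

```shell
# a PEM public key should parse with -pubin and report its size
openssl rsa -pubin -in your_core_id.pub.pem -noout -text | head -n 1

# a raw DER private key needs -inform DER instead
openssl rsa -inform DER -in your_core_id.der -noout -text | head -n 1
```

If the file is the wrong type or format, openssl will refuse to load it, which makes the mismatch easy to spot.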


1 Like