Local Cloud on Raspberry Pi

I had been doing this since before the release and it worked. Something is not right then :slight_smile:

OR I might be remembering wrongly, with so many keys generated :stuck_out_tongue:

Yeah, that command shouldn’t work. :slight_smile:

The code that loads the key is here; it will only pick up a ‘.pub.pem’ file in that directory with the core ID as the filename, and it’s parsed with ursa.createPublicKey, which expects a public-key PEM file.


Hi guys!

I ran the command @Dave wrote, but it is still just blinking like crazy in white.

Can I see an error log somehow, or does this kind of LED error “message” have a meaning?

Kind regards,

here is the video of the core:

and this is the output of the server:

    Loading user xxx
    connect.multipart() will be removed in connect 3.0
    visit https://github.com/senchalabs/connect/wiki/Connect-3.0 for alternatives
    connect.limit() will be removed in connect 3.0
    Starting server, listening on 8080
    static class init!
    found xxxxxxxxxxxxxxxx
    Loading server key from default_key.pem
    set server key
    server public key is:  -----BEGIN PUBLIC KEY-----
    -----END PUBLIC KEY-----

    Your server IP address is:
    server started { host: 'localhost', port: 5683 }

Hi @Noten,

Is your core and server on the same network? Did you write the server key with your server IP address?


Hi Dave!

Yes, same network.
I wrote the server key from the Mac (the Core is connected to the Mac, while the Raspberry Pi is running separately). The server URL was set in the .spark/spark.config.json file.

Are these ok?

Hi @Noten,

Make sure you build the server key using the IP of your raspberry pi, and not your mac:

    spark keys server default_key.pub.pem ip.of.raspberry.pi
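The whole flow can be sketched like this — 192.168.1.50 is a placeholder for the Pi’s address, and since these commands touch real hardware (the Core must be in DFU mode), the sketch only prints what it would run; drop the echo wrapper to execute for real:

```shell
# Assumptions: the server's public key lives in spark-server/js on the Pi,
# spark-cli is installed on this machine, and the Core is plugged in via USB.
PI_IP="192.168.1.50"          # placeholder -- use your Pi's actual address
KEY="default_key.pub.pem"

run() { echo "+ $*"; }        # dry run; replace with: run() { "$@"; } to execute

run scp "pi@${PI_IP}:spark-server/js/${KEY}" .   # fetch the server's public key
run spark keys server "${KEY}" "${PI_IP}"        # write key + server IP to the Core
```

The key point is the second argument: without it, the Core keeps trying to reach the default cloud address instead of the Pi.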



Hey Dave!

Okay, so I downloaded the default_key.pub.pem file from the server, connected the Spark Core to the Mac in DFU mode, and ran this:

    spark keys server default_key.pub.pem

Then I restarted the Core; now fast cyan (×2) and fast red (×1) are blinking. :frowning:

I’m monitoring the server as well, but nothing appears below the server started { host: ‘localhost’, port: 5683 } line. I’m clueless :frowning:

Hi @Noten,

How is your Raspberry Pi connected to your network? Are you doing it via your Mac’s internet sharing?

Hi Dave!

Both the Mac and the Raspberry Pi are connected to the router via cable, and the Core (over Wi-Fi) is set up with that same router’s credentials.

Hi @Noten,

Hmm… Well, if you’re not seeing a connection attempt in your server output, then the Core isn’t hitting the right IP address for some reason. Even if the key were wrong, you’d at least see a connection attempt.


When I run the command, should I run

    spark keys server default_key.pub.pem

or

    spark keys server default_key.pem

In one place the tutorial says the .pub file is required; elsewhere I also read default_key.pem only. Which one is correct?

Hi Noten,

I think it wants the .pub.pem file. Is your copy of the CLI totally up to date? Can you upgrade it with npm update -g spark-cli?


I did the same process again, but now only the green LED is blinking.

Also, is it enough that I just use the default_key.pub.pem file from the spark-server/js folder?

I can’t think of anything else that could go wrong :frowning:

Okay guys, it is working now.

I did the following (in this order):

  • Factory reset of the Core
  • Connected it to the cloud again
  • Flashed the local setup to the Spark Core again (BUT I used the default_key.pub.der file instead of the .pem)
  • Did the CC3000 update and Tinker update (as it didn’t connect to Wi-Fi at all)
  • Done

Not sure which step solved the issue (or whether it was a combination), but now it is working flawlessly.

Thanks for everybody’s help, good stuff :smiley:


Now just one last thing: any idea how to start node main.js when the Pi boots? I would like the local cloud to start with the Pi (so that if it reboots, the Cores won’t go offline).

Is it possible? Because this would be ace!



Hi @Noten,

I think you want an init.d script of some kind. Here’s a helpful tutorial site:


And I’ll open an issue to include a stock startup script for the server here:



That would be ace, @Dave!

I tried crontab and added this line: “@restart …/node /home/pi/…/main.js”, but it generated quite a few errors and I can’t fix them :frowning:

Looking forward to your script; let me know if you need testing.
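For the record, crontab has no @restart directive — the boot-time one is @reboot — and cron runs with a minimal PATH, so absolute paths are needed. A sketch, assuming node lives at /usr/local/bin/node and the server checkout at /home/pi/spark-server (adjust both to your install):

```shell
# Added via `crontab -e` on the Pi; all paths here are assumptions.
# Logs go to a file so boot-time errors are visible afterwards.
@reboot cd /home/pi/spark-server/js && /usr/local/bin/node main.js >> /home/pi/spark-server.log 2>&1
```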



Hey @kennethlimcp, thanks for the post, and @Dave for the walk-through. However, I am facing a problem with the public key: my core_keys folder is empty. I am running the cloud server on a Raspberry Pi, and I have the Spark CLI installed on an Ubuntu machine, which I am using to flash the Spark Core. My core ID is 55ff6b065075555332071787. On the server side it says:

    Your server IP address is:
    server started { host: ‘localhost’, port: 5683 }
    Connection from:, connId: 1
    Expected to find public key for core 55ff6b065075555332071787 at /home/pi/spark-server/js/core_keys/55ff6b065075555332071787.pub.pem.

Then the connection times out.

Here are the steps I followed:
=>> On the Raspberry Pi:
1. Completed the server installation from http://kennethlimcp.gitbooks.io/spark-local-cloud-on-raspberry-pi/install_local_cloud_repo/README.html.

=>> On the Linux machine:
1. Edited the spark.config.json file and added “apiUrl”: “” (the IP of the Raspberry Pi).
2. Created an account using $ spark setup and quit before providing any Wi-Fi credentials. Account created.
3. Copied the ‘default_key.pub.pem’ file from the server (Raspberry Pi) onto the Linux machine and ran ‘spark keys server default_key.pub.pem’. Flashed.
4. Next, spark keys save <core ID>. However, it threw the error “Error saving key… Error: Command failed: invalid dfuse address: 0x00002000:1024”. So I ran dfu-util explicitly, “sudo dfu-util -d 1d50:607f -a 1 -s 0x00002000 -U 55ff6b065075555332071787.pub.pem”, dropping the 1024. It dumped the key to the 55ff6b065075555332071787.pub.pem file.

I copied 55ff6b065075555332071787.pub.pem to the Raspberry Pi, into /spark-server/js/ and then into the /spark-server/js/core_keys/ directory. But it is not working either. It throws the following error:

    Your server IP address is:
    server started { host: ‘localhost’, port: 5683 }
    Connection from:, connId: 1
    Caught exception: Error: Not a public key.{}

Can you help me out with this, or give the steps for when the Spark CLI and the local cloud are on different systems?
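A likely cause of the “Not a public key” error: as far as I know, the raw dfu-util dump at 0x00002000 is the Core’s private key in DER format, not a public-key PEM, so copying it straight into core_keys/ won’t parse. Extracting the public half with openssl should give the server what it expects. A sketch — the demo_* filenames are stand-ins, with a freshly generated key playing the role of the dfu-util dump:

```shell
# Stand-in for the raw dfu-util dump (a DER-encoded RSA private key):
openssl genrsa -out demo_private.pem 1024 2>/dev/null
openssl rsa -in demo_private.pem -outform DER -out demo.der 2>/dev/null

# The conversion step itself -- DER private key in, public-key PEM out.
# For a real Core, demo.der would be the dumped <coreId>.der file and the
# output name would be <coreId>.pub.pem in core_keys/:
openssl rsa -in demo.der -inform DER -pubout -out demo.pub.pem 2>/dev/null
```

The resulting file should start with `-----BEGIN PUBLIC KEY-----`, which is what ursa.createPublicKey on the server side can parse.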


Is your Core in DFU mode when you run spark keys save core_id?
