Local Cloud on Raspberry Pi

I have been doing this since before the pre-release and it works. Something is not right then :slight_smile:

OR I might have remembered wrongly, with so many keys generated :stuck_out_tongue:

Yeah, that command shouldn’t work. :slight_smile:

The code that loads the key is here; it will only pick up a 'pub.pem' file in that directory with the core ID as the filename, and it's parsed with ursa.createPublicKey, which expects a public PEM file.
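If you want to sanity-check that a key file is what the server expects, openssl can parse it the same way; a quick sketch (the core ID below is a placeholder):

    # core_keys/<core_id>.pub.pem must be a PEM-encoded RSA public key;
    # this prints the key details, or fails loudly if the file is DER
    # or a private key.
    openssl rsa -pubin -in core_keys/<core_id>.pub.pem -noout -text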

1 Like

hi Guys!

I ran the command @Dave wrote, but it is still just blinking like crazy in white.

Can I somehow see an error log, or is there any meaning to this type of error "message" from the LED?

Kind regards,
Noten

here is the video of the core:

and this is the output of the server:

    Loading user xxx
    connect.multipart() will be removed in connect 3.0
    visit https://github.com/senchalabs/connect/wiki/Connect-3.0 for alternatives
    connect.limit() will be removed in connect 3.0
    Starting server, listening on 8080
    static class init!
    found xxxxxxxxxxxxxxxx
    Loading server key from default_key.pem
    set server key
    server public key is:  -----BEGIN PUBLIC KEY-----
    xxxxxxxxxxx
    -----END PUBLIC KEY-----

    Your server IP address is: 192.168.2.10
    server started { host: 'localhost', port: 5683 }

Hi @Noten,

Are your core and server on the same network? Did you write the server key with your server IP address?

Thanks,
David

hi Dave!

Yes, same network.
I wrote the server key from my Mac (the core is connected to the Mac, while the Raspberry Pi is running separately). The server URL http://192.168.2.10:8080 was set in the .spark/spark.config.json file.
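For reference, the relevant part of that file looks like this (apiUrl is the only key I changed):

    {
      "apiUrl": "http://192.168.2.10:8080"
    }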

Are these ok?

Hi @Noten,

Make sure you build the server key using the IP of your Raspberry Pi, and not your Mac:

spark keys server default_key.pub.pem ip.of.raspberry.pi
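The Core needs to be in DFU mode for this (hold MODE, tap RESET, keep holding MODE until the LED flashes yellow); the CLI then writes the key and IP to the Core, so in your case something like:

    # Core in DFU mode, default_key.pub.pem downloaded from the Pi:
    spark keys server default_key.pub.pem 192.168.2.10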

:slight_smile:

Thanks,
David

hey Dave!

Okay, so I downloaded the default_key.pub.pem file from the server, connected the Spark Core to the Mac in DFU mode, and ran this:

spark keys server default_key.pub.pem 192.168.2.10

Then I restarted the core; now fast cyan (x2) and fast red (x1) are blinking. :frowning:

I’m monitoring the server as well, but nothing appears below the server started { host: 'localhost', port: 5683 } line. I’m clueless :frowning:

Hi @Noten,

How is your Raspberry Pi connected to your network? Are you doing it via your Mac’s internet sharing?

hi Dave!

Both the Mac and the Raspberry Pi are connected via cable to the router, and the same router’s credentials are set on the core (Wi-Fi).

Hi @Noten,

Hmm… Well, if you’re not seeing a connection attempt in your server output, then the Core isn’t hitting the right IP address for some reason. Even if the key were wrong, you’d at least see a connection attempt.

Thanks,
David

When I run the command, should I run spark keys server default_key.pub.pem 192.168.2.10

or

spark keys server default_key.pem 192.168.2.10

In one place the tutorial says the .pub file is required; elsewhere I also read default_key.pem only. Which one is correct?

Hi Noten,

I think it wants the pub pem file. Is your copy of the CLI totally up to date? Can you upgrade it with npm update -g spark-cli?
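If you only have the server's private default_key.pem on hand, the public half can be regenerated from it; a minimal sketch, assuming it is a standard RSA private key:

    # Derive the public PEM from the server's private key; the output
    # file is the one `spark keys server` wants:
    openssl rsa -in default_key.pem -pubout -out default_key.pub.pem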

Thanks,
David

I did the same process again, but now only the green LED is blinking.

Also, is it enough that I just use the default_key.pub.pem file from the spark-server/js folder?

I can’t think of anything else that could go wrong :frowning:

Okay guys, it is working now.

I did the following (in this order):

  • factory reset of the core
  • connected to the cloud again
  • flashed the local setup again to the Spark Core (BUT I used the default_key.pub.der file instead of the pem)
  • did the CC3000 update and Tinker update, as it didn’t connect to Wi-Fi at all (commands below)
  • done
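For reference, those two updates are just CLI flashes over USB (from memory, so double-check — the Core needs to be in DFU mode for each step):

    # CC3000 "deep update" patch, then restore the Tinker firmware:
    spark flash --usb cc3000
    spark flash --usb tinker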

Not sure which step solved the issue (or whether it was a combination of them), but now it is working flawlessly.

Thanks for everybody’s help, good stuff :smiley:

Noten

Now just one last thing: any idea how to start node main.js when the Pi boots? I would like to start the local cloud when the Pi starts, so that if it reboots the cores won’t go offline.

Is it possible? Coz this would be ace!

Noten

1 Like

Hi @Noten,

I think you want an init.d script of some kind. Here’s a helpful tutorial site:

http://raspberrywebserver.com/serveradmin/run-a-script-on-start-up.html
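In the meantime, something along these lines should work as a starting point; a rough sketch, assuming node is on the PATH and the server lives at /home/pi/spark-server/js (adjust the paths to your install):

    #!/bin/sh
    ### BEGIN INIT INFO
    # Provides:          spark-server
    # Required-Start:    $network $remote_fs
    # Required-Stop:     $network $remote_fs
    # Default-Start:     2 3 4 5
    # Default-Stop:      0 1 6
    # Short-Description: Start the local Spark cloud at boot
    ### END INIT INFO

    # Paths below are assumptions -- adjust them to your install.
    case "$1" in
      start)
        # Run from the js directory so the server finds its key files.
        cd /home/pi/spark-server/js || exit 1
        su pi -c 'nohup node main.js >> /home/pi/spark-server.log 2>&1 &'
        ;;
      stop)
        pkill -f 'node main.js'
        ;;
      *)
        echo "Usage: /etc/init.d/spark-server {start|stop}"
        exit 1
        ;;
    esac
    exit 0

Save it as /etc/init.d/spark-server, make it executable, and register it with sudo update-rc.d spark-server defaults.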

And I’ll open an issue to include a stock startup script for the server here:

Thanks,
David

2 Likes

That would be ace @Dave!

I tried with crontab, added this line: "@restart …/node /home/pi/…/main.js", but it generated quite a lot of errors and I can’t fix it :frowning:
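(Note for anyone trying the crontab route: the boot-time directive cron supports is @reboot, not @restart, and cron jobs need absolute paths; a sketch, with the node and spark-server paths as assumptions:)

    # crontab -e entry; check your paths with `which node` and your actual
    # spark-server location. cd first so the server finds its key files.
    @reboot cd /home/pi/spark-server/js && /usr/local/bin/node main.js >> /home/pi/spark-server.log 2>&1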

Looking forward to your script; let me know if you need testing.

Noten

1 Like

Hey @kennethlimcp, thanks for the post, and @Dave for the walk-through. However, I am facing a problem with the public key: my core_keys folder is empty. I am running the cloud server on a Raspberry Pi (192.168.0.106), and I have the Spark CLI installed on an Ubuntu machine (192.168.0.105), which I am using to flash the Spark Core (192.168.0.102). My core ID is 55ff6b065075555332071787. On the server side it says:

    Your server IP address is: 192.168.0.106
    server started { host: 'localhost', port: 5683 }
    Connection from: 192.168.0.102, connId: 1
    Expected to find public key for core 55ff6b065075555332071787 at /home/pi/spark-server/js/core_keys/55ff6b065075555332071787.pub.pem .

Then the connection times out.
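(For reference, copying a key file into the Pi's core_keys directory, assuming SSH is enabled on the Pi, is just:)

    # Put the core's public key where the server looks for it:
    scp 55ff6b065075555332071787.pub.pem pi@192.168.0.106:/home/pi/spark-server/js/core_keys/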

Here are the steps I followed:
=>> In the Raspberry Pi,
1. Completed the server installation from http://kennethlimcp.gitbooks.io/spark-local-cloud-on-raspberry-pi/install_local_cloud_repo/README.html.

=>> In the Linux machine,
1. Edited the spark.config.json file. Added "apiUrl": "http://192.168.0.106:8080" (the IP of the Raspberry Pi).
2. Created an account using $ spark setup and quit before providing any Wifi credentials. Account created.
3. Copied the 'default_key.pub.pem' file from the server (Rasp-Pi) onto the Linux machine and ran 'spark keys server default_key.pub.pem 192.168.0.106'. Flashed.
4. Next, 'spark keys save <Core ID>'. However, it threw me an error: "Error saving key... Error: Command failed: invalid dfuse address: 0x00002000:1024". So I ran dfu-util explicitly, "sudo dfu-util -d 1d50:607f -a 1 -s 0x00002000 -U 55ff6b065075555332071787.pub.pem", removing the :1024. It uploaded the keys to the 55ff6b065075555332071787.pub.pem file.

I copied 55ff6b065075555332071787.pub.pem to the Raspberry Pi, into the /spark-server/js/core_keys/ directory. But it is not working either; it throws the following error:

    Your server IP address is: 192.168.0.106
    server started { host: 'localhost', port: 5683 }
    Connection from: 192.168.0.102, connId: 1
    Caught exception: Error: Not a public key.{}


Can you help me out with this, or give the steps for when the Spark CLI and the local cloud are on different systems?

Thanks.
Gaurav

Is your core in DFU mode when you perform spark keys save core_id?
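(Also, a guess at the "Not a public key" error: dfu-util -U dumps raw DER bytes from the device, so renaming the dump to .pub.pem doesn't make it a PEM public key. A conversion sketch with openssl — that the data at 0x00002000 is the core's DER-encoded private key is my assumption about the flash layout:)

    # The file dumped by dfu-util is really DER; convert it to the PEM
    # public key the server expects, then copy the result into
    # core_keys/ as 55ff6b065075555332071787.pub.pem:
    openssl rsa -in 55ff6b065075555332071787.pub.pem -inform DER -pubout \
        -out converted.pub.pem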

1 Like