I have been doing this since before pre-release and it worked, so something is not right then.
Or I might have remembered wrongly, with so many keys generated.
Yeah, that command shouldn't work.
The code that loads the key is here; it will only pick up a 'pub.pem' file in that directory with the core id as the filename, and it's parsed with ursa.createPublicKey, which expects a public key in PEM format.
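If you want to sanity-check a key file before the server tries to load it, openssl can confirm it really is a PEM public key. This is a generic check, not something spark-server itself runs, and your_core_id is a placeholder:
openssl rsa -pubin -in your_core_id.pub.pem -text -noout
If that prints a modulus and exponent, ursa should be able to parse it too.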
hi Guys!
I ran the command @Dave wrote, but it is still just blinking like crazy in white.
Can I somehow see an error log, or is there any meaning to this type of error 'message' from the LED?
Kind regards,
Noten
here is the video of the core:
and this is the output of the server:
Loading user xxx
connect.multipart() will be removed in connect 3.0
visit https://github.com/senchalabs/connect/wiki/Connect-3.0 for alternatives
connect.limit() will be removed in connect 3.0
Starting server, listening on 8080
static class init!
found xxxxxxxxxxxxxxxx
Loading server key from default_key.pem
set server key
server public key is: -----BEGIN PUBLIC KEY-----
xxxxxxxxxxx
-----END PUBLIC KEY-----
Your server IP address is: 192.168.2.10
server started { host: 'localhost', port: 5683 }
Hi @Noten,
Are your core and server on the same network? Did you write the server key with your server IP address?
Thanks,
David
hi Dave!
Yes, same network.
I wrote the server key from the Mac (the core is connected to the Mac, while the Raspberry Pi is running separately). The server url http://192.168.2.10:8080 was set in the .spark/spark.config.json file.
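For reference, the relevant part of my .spark/spark.config.json is roughly this (just the apiUrl entry):
cat ~/.spark/spark.config.json
{
  "apiUrl": "http://192.168.2.10:8080"
}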
Are these ok?
Hi @Noten,
Make sure you build the server key using the IP of your Raspberry Pi, and not your Mac:
spark keys server default_key.pub.pem ip.of.raspberry.pi
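The core needs to be in DFU mode when you run this; if you want to confirm it's being seen, it should show up with the USB id 1d50:607f when you list DFU devices:
dfu-util -l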
Thanks,
David
hey Dave!
Okay, so I downloaded the default_key.pub.pem file from the server, connected the Spark Core to the Mac in DFU mode, and ran this:
spark keys server default_key.pub.pem 192.168.2.10
Then I restarted the core; now fast cyan (x2) and fast red (x1) are blinking.
I'm monitoring the server as well, but below the server started { host: 'localhost', port: 5683 } line nothing appears. I'm clueless.
Hi @Noten,
How is your Raspberry Pi connected to your network? Are you doing it via your Mac's internet sharing?
hi Dave!
Both the Mac and the Raspberry Pi are connected via cable to the router, and the same router's credentials are set on the core (WiFi).
Hi @Noten,
Hmm… Well, if you're not seeing a connection attempt in your server output, then the Core isn't hitting the right IP address for some reason. Even if the key were wrong, you'd at least see a connection attempt.
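One quick thing to rule out: from another machine on the LAN you can check that the server's TCP port is actually reachable, e.g. with netcat (a generic reachability test, nothing Spark-specific):
nc -vz 192.168.2.10 5683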
Thanks,
David
When I run the command, should I run spark keys server default_key.pub.pem 192.168.2.10
or
spark keys server default_key.pem 192.168.2.10
Somewhere the tutorial says the .pub file is required; elsewhere I also read default_key.pem only. Which one is correct?
Hi Noten,
I think it wants the pub pem file. Is your copy of the CLI totally up to date? Can you upgrade it with npm update -g spark-cli
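You can see which version npm has installed globally before and after the upgrade (plain npm, nothing CLI-specific):
npm list -g spark-cli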
Thanks,
David
I did the same process again, but now only the green LED is blinking.
Also, is it enough that I just use the default_key.pub.pem file from the spark-server/js folder?
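(If only the private default_key.pem were in the folder, I assume the public half could be re-derived from it with openssl, something like:
openssl rsa -in default_key.pem -pubout -out default_key.pub.pem
but I haven't tried that.)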
I can't think of anything else that could go wrong.
Okay guys, it is working now.
I did the following (in this order):
Not sure which step solved the issue (or whether it was a combination of several), but now it is working flawlessly.
Thanks for everybody's help, good stuff!
Noten
Now just one last thing: any idea how to start node main.js on the Pi when it boots? I would like the local cloud to start when the Pi starts (just in case it reboots, the cores won't go offline).
Is it possible? Coz this would be ace!
Noten
Hi @Noten,
I think you want an init.d script of some kind. Here's a helpful tutorial site:
http://raspberrywebserver.com/serveradmin/run-a-script-on-start-up.html
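A bare-bones init.d-style script might look something like this; the paths and node location are assumptions, and it's an untested sketch rather than an official script:

#!/bin/sh
# /etc/init.d/spark-server -- start the local cloud at boot (sketch)
case "$1" in
  start)
    # assumes the repo lives in /home/pi/spark-server and node is on the PATH
    cd /home/pi/spark-server/js && node main.js >> /var/log/spark-server.log 2>&1 &
    ;;
  stop)
    # crude: kills any process whose command line matches "node main.js"
    pkill -f "node main.js"
    ;;
esac

On Debian-style systems you'd still need to make it executable and register it, e.g. chmod +x /etc/init.d/spark-server and update-rc.d spark-server defaults.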
And Iāll open an issue to include a stock startup script for the server here:
Thanks,
David
That would be ace @Dave!
I tried with crontab and added this line: '@restart …/node /home/pi/…/main.js', but it generated quite a few errors and I can't fix it.
Looking forward to your script; let me know if you need testing.
Noten
Then the connection times out.
Here are the steps I followed:
=>> On the Raspberry Pi,
1. Completed the server installation from http://kennethlimcp.gitbooks.io/spark-local-cloud-on-raspberry-pi/install_local_cloud_repo/README.html.
=>> On the Linux machine,
1. Edited the spark.config.json file. Added "apiUrl": "http://192.168.0.106:8080" (the IP of the Raspberry Pi).
2. Created an account using $ spark setup and quit before providing any Wifi credentials. Account created.
3. Copied the default_key.pub.pem file from the server (the Raspberry Pi) onto the Linux machine and ran spark keys server default_key.pub.pem 192.168.0.106. Flashed.
4. Next, spark keys save <Core ID>. However, it threw an error: 'Error saving key… Error: Command failed: invalid dfuse address: 0x00002000:1024'. So I ran dfu-util explicitly, sudo dfu-util -d 1d50:607f -a 1 -s 0x00002000 -U 55ff6b065075555332071787.pub.pem, removing the :1024. It uploaded the keys to the 55ff6b065075555332071787.pub.pem file.
Your server IP address is: 192.168.0.106
server started { host: 'localhost', port: 5683 }
Connection from: 192.168.0.102, connId: 1
Caught exception: Error: Not a public key.{}
Can you help me out with this, or give the steps for when the Spark CLI and the local cloud are on different systems?
Thanks.
Gaurav
Is your core in DFU mode when you perform spark keys save core_id?
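Also, if you pulled the key off with dfu-util directly like that, keep in mind that -U dumps the raw contents of the flash region, not a PEM file, so naming it .pub.pem isn't enough by itself. Assuming that region holds the core's private key in DER format (which would explain the 'Not a public key' error), you could try deriving the public PEM the server expects with openssl:
openssl rsa -inform DER -in 55ff6b065075555332071787.pub.pem -pubout -out 55ff6b065075555332071787_converted.pub.pem
You may also need to trim trailing padding bytes from the dump first; I haven't verified this against your exact setup.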