Local Cloud keys error

I followed the tutorial but I can't get my core to connect to the local server.
It ends up blinking cyan at the final step.

Has anyone had luck with that?

@Gentmat, local core claiming does not work yet. You missed the part in the instructions about hitting Ctrl-C and skipping the Wi-Fi credentials part (part 3). Follow the exact steps and you should have no problems. :smile:


I am using a Mac. When I ran spark setup and logged in, it asked for Wi-Fi credentials and I entered them. Please look at the video: I put in my Wi-Fi SSID and my password.

If you mean that after it asks for the SSID I should press Ctrl-C and keep entering commands from there: I did that and I got a red flash.

@Gentmat,

When you get to the part of spark setup where it asks to set up Wi-Fi, hit Ctrl-C to abort that part. Then follow the instructions from there.
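Roughly, the order of operations should look like this. The command names, the profile name, and the example IP/port below are from memory of the tutorial, so double-check them against your version of spark-cli and your server's actual LAN address:

```shell
# Point spark-cli at your local server instead of the Spark cloud.
# (10.0.1.100:8080 is an example address; use your Mac's LAN IP.)
spark config local apiUrl http://10.0.1.100:8080
spark config local

# Create a local account; when it asks for Wi-Fi credentials, hit Ctrl-C.
spark setup

# With the core in DFU mode (flashing yellow), write your server's
# public key and IP address to the core.
spark keys server default_key.pub.pem 10.0.1.100
```

Wi-Fi credentials then get onto the core separately (for example via the Spark Core app), since we skipped that step in spark setup.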

I did so! Now the Spark is blinking blue. (How would it know the Wi-Fi credentials? We skipped that step with Ctrl-C!)

I’m realizing we need this to make the setup experience better; I’ll stub this in.

Thanks!
David

So I've been trying for 7 hours to connect and couldn't. How did others do it??

@Gentmat,

The issue I see is the core not being able to reach the server running on your MacBook Pro.

Can you share more about your Wi-Fi network? There seems to be some “wall” preventing the core from reaching your MBP.


WPA2 Personal … it's a Time Capsule (Apple) router.
I opened port 5683 for both TCP and UDP, so there shouldn't be a firewall problem.

It has 2.4 and 5 GHz … I am definitely using the 2.4 GHz band on the Mac.

What else can I do?! I have 2 cores and I tried with both many times.
Has anyone tried this tutorial and gotten it working?

Hi @Gentmat,

The biggest hurdle here is just making sure your cores have your server key with the correct IP address, that your server has your core public keys in the ‘core_keys’ folder, and that each filename matches its core id. If your core is blinking red, either it isn’t hitting your server at all, or the keys are wrong. Can you post the output from your server after it starts up and your core is trying to connect?

Thanks,
David

@Dave

The server just writes:
Your server IP address is: 10.0.1.100
server started { host: 'localhost', port: 5683 }

That's it.
If I open another terminal and run
spark list
this shows up on the server:
ListDevices { userID: 'z4huFYQZlu6yH0PLzQipMRHd1fR1ioqj' }
ListDevices… waiting for connected state to settle  { userID: 'z4huFYQZlu6yH0PLzQipMRHd1fR1ioqj' }
10.0.1.100 - - [Fri, 11 Jul 2014 07:08:04 GMT] "GET /v1/devices?access_token=5d6907530e47a1a23bae7e105367ae347d049e37 HTTP/1.1" 200 2 "-" "-"

And the terminal where I ran spark list shows:
Checking with the cloud…
Retrieving cores… (this might take a few seconds)
No cores found.
Gentmats-MacBook-Pro:~ gentmat$

Note: someone told me that when I run spark setup and create the email login, I should press Ctrl-C and not enter the Wi-Fi credentials. So after the whole process is done, I enter the Wi-Fi credentials with the Spark Core app.

The core is blinking cyan (NOT breathing cyan).

@Gentmat,

add me on Google Chat and we will troubleshoot live. :wink:

Yay, thanks, but I don't have your Gmail. BTW, I'll install TeamViewer too in case you prefer that :wink: Thanks!

Hey :spark: community!

Just wondering: has anyone gotten the local cloud running with an Apple router?

@Gentmat,

can you try this? Another user with the same problem on a Windows machine found this solution:

“when I press Ctrl-C while in the console, then the core connected to the router.”

Can you try this? (On a Mac it's still Ctrl-C in the terminal.)

NAT Port Mapping Protocol was always on (and to be sure, I set the Mac as the DMZ/default host).

“when I press Ctrl-C while in the console, then the core connected to the router” — when is that? Can you be specific, please?

I will rent a dedicated Mac server from an online company. I'm sure they will have open ports, and I will try it out. Do you guys think it's a good idea?

@Gentmat, if you have a Raspberry Pi, you can use that too. No need to rent :wink:

Heya @Gentmat,

Ahh, okay, thanks for posting the output. I think the issue is that spark keys save is a little misleading: you can't use it to get your core's public key for the local cloud. The key saved on your core is a private key in DER format, so you'd need to convert that into a public PEM. The command line also doesn't know how big the key is, so it pulls 1024 bytes just to be safe. It was mostly meant as a way to back up your key in case something went wrong.
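For anyone hitting this later, the conversion looks roughly like this. The filenames and the core id below are made up, and the sketch generates a throwaway key just to stand in for the DER file a real core would produce:

```shell
# Generate a stand-in for the key `spark keys save` would dump
# (on a real core this file comes from the device over DFU, not openssl):
openssl genrsa -out core.pem 1024
openssl rsa -in core.pem -outform DER -out core.der

# Extract the public key as a PEM file. If your openssl version complains
# about the trailing padding bytes that `spark keys save` appends, truncate
# the .der file to the actual key length first.
openssl rsa -in core.der -inform DER -pubout -out core.pub.pem

# The local server expects this file in core_keys/, named after the core id
# (the id below is a made-up example):
mkdir -p core_keys
cp core.pub.pem core_keys/55ff68064989495329092587.pub.pem
```

With the public PEM named after the core id sitting in the server's core_keys folder, the handshake should be able to find the right key when that core connects.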

Thanks,
David