Can't do handshake when Spark Core connects to my local spark-server?

Today I tried to install spark-server on my local Mac, following the instructions from the spark-server installation guide.

  1. First, I successfully loaded the local server's public key and IP address onto the core.
  2. In the second step, the guide asked me to change spark.config, which I could not find, so I tried the spark config commands:
    $ spark config local-spark
    $ spark config local apiUrl http://localhost:8080

I honestly don't know what this step is for.

  3. In the third step, I ran spark identify to get my core ID; this worked fine.
  4. In the last step, I ran spark setup to connect the core to Wi-Fi, expecting it to also connect to the local server automatically, but the core is flashing cyan rapidly, which means it can't connect to the local server.

So I went to the server logs and found these error messages:

    Connection from: 192.168.43.223, connId: 30
    1: Core disconnected: read_coreid timed out { coreID: 'unknown', cache_key: '_29' }
    Session ended for _29
    Handshake failed: read_coreid timed out { ip: '192.168.43.223', cache_key: '_29', coreID: null }

and

    Connection from: 192.168.43.223, connId: 1
    Expected to find public key for core 51ff70065082554910410887 at /Users/abc/spark.io/spark-server/core_keys/51ff70065082554910410887.pub.pem

The reason for the failure is that the core_keys folder is empty, with no keys at all. Did I do a step wrong or miss one? How do I move the core's public key over to the server side?

My second question: now that my Spark Core stores the local server's public key and IP address in flash, how can I restore it if I want it to work with the Spark Cloud again?

You might want to get more familiar with Spark-CLI to understand what is going on.

  • spark config local apiUrl http://localhost:8080

This creates a new profile, making it easy to switch between different :cloud: instances.
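
For example (a sketch; the profile name is arbitrary, and "spark" is the default profile pointing at the Spark :cloud:):

    $ spark config local apiUrl http://localhost:8080   # create/update a profile named "local"
    $ spark config local                                 # switch to that profile
    $ spark config spark                                 # switch back to the Spark Cloud profile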

  • spark config identify

This prints out the settings of the current profile for verification, so that you know you are pointing at the :cloud: you want to work with.
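
Illustrative output (the exact format depends on your CLI version):

    $ spark config identify
    Current profile: local
    Using API: http://localhost:8080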

  • Expected to find public key

You will need to place the core's public key in that folder; see the sketch below.
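
One way to do that manually looks roughly like this (a sketch; assumes dfu-util is installed, and uses the core ID and server path from your logs):

    # put the core into DFU mode (hold MODE, tap RESET, release MODE
    # when the LED flashes yellow), then dump its keys:
    $ spark keys save 51ff70065082554910410887

    # copy the extracted public key into the server's core_keys folder:
    $ cp 51ff70065082554910410887.pub.pem /Users/abc/spark.io/spark-server/core_keys/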

Might want to check this out: https://community.spark.io/t/tutorial-local-cloud-1st-time-instructions-20-march-15/5589
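
For your second question: switching back to the Spark :cloud: means writing Spark's server public key onto the core again. Roughly (a sketch; assumes the cloud_public.der key that ships with Spark-CLI, and the core in DFU mode):

    $ spark keys server cloud_public.der   # restore the Spark Cloud server key on the core
    $ spark config spark                   # point the CLI back at the Spark Cloud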


Thanks, @kennethlimcp, it works great.

One question: if I switch to the local cloud, does that mean over-the-air programming is disabled? Are there other ways to compile and program the core in a local environment?

OTA works on the local :cloud:, just not compiling.

Try this:

    $ spark config spark; spark compile .; spark config local_profile; spark flash CORE_name *.bin

This switches to the :spark: profile, compiles using the :spark: build farm, switches over to your own :cloud:, and flashes via OTA.
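
The same chain, one step at a time (replace local_profile and CORE_name with your own profile and core name; the .bin filename depends on what spark compile writes out):

    $ spark config spark           # point the CLI at the Spark build farm
    $ spark compile .              # compile the current directory in the cloud
    $ spark config local_profile   # switch back to your local :cloud: profile
    $ spark flash CORE_name *.bin  # OTA-flash the downloaded binary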

If you want a purely local compile workflow, you will need to set up a toolchain to do so.
