HOW TO: Spark Local Cloud on Google Cloud Compute

Hi,
I’ve posted this question in another thread and so far haven’t received any help, so I’m posting it under the General category hoping to get some help.

I’m trying to set up the Spark cloud on my Google cloud compute instance. I’ve been following this tutorial to set up the local cloud, but I’m having problems getting it to work in my Google cloud environment.

I nearly got it working, but I’m stuck at this error:

Your server IP address is: 10.x.x.x
server started { host: 'localhost', port: 5683 }
42.x.x.x - - [Wed, 01 Apr 2015 02:46:56 GMT] "POST /v1/provisioning/53xxxxxxxxxxxxxxxxx HTTP/1.1" 400 109 "-" "-"

Here is what I’ve tried on my Google cloud server so far:

$ mkdir spark-core
$ cd spark-core
$ wget https://launchpad.net/gcc-arm-embedded/4.8/4.8-2014-q2-update/+download/gcc-arm-none-eabi-4_8-2014q2-20140609-linux.tar.bz2
$ tar xvjpf gcc-arm-none-eabi-4_8-2014q2-20140609-linux.tar.bz2
$ export PATH=$PATH:$HOME/src/spark-core/gcc-arm-none-eabi-4_8-2014q2/bin
$ git clone https://github.com/spark/core-firmware.git
$ git clone https://github.com/spark/core-common-lib.git
$ git clone https://github.com/spark/core-communication-lib.git
$ git clone https://github.com/spark/spark-server.git
$ npm install -g spark-cli    # needs either root or sudo
$ cd spark-server
$ npm install
$ node main.js
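
(Side note: since node main.js ties up the SSH session, the server can also be run in the background; this isn’t from the tutorial, just a standard shell trick:

$ nohup node main.js > spark-server.log 2>&1 &
$ tail -f spark-server.log    # watch the server output
)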

On my local PC, which has dfu-util installed, I did the following:

$ git clone https://github.com/spark/spark-server.git
$ cd spark-server
$ node main.js

While my local server (on my PC) was running, I did the following in a new terminal window:

$ spark config googleCloud apiUrl http://mydomainname.com:8080
$ spark config googleCloud
$ cd spark-server
$ spark keys server default_key.pub.pem mydomainname.com

I then did this on my local PC (the machine with dfu-util):

$ cd core_keys
$ sudo spark keys doctor 53fxxxxxxxxxxxxxxxxxxx

I end up with the following error message when I try to upload the keys to the Spark (the core is in DFU mode). The full DFU output was too long, so I’m pasting just the error portion:

File downloaded successfully
Transitioning to dfuMANIFEST state
Error during download get_status
Saved!
attempting to add a new public key for core 53xxxxxxxxxxxxxxxx
*********************************
      Please login - it appears your access_token may have expired
*********************************
submitPublicKey got error:  invalid_grant
Make sure your core is in DFU mode (blinking yellow), and that your computer is online.
Error - invalid_grant

On my Google cloud server, I got the following error:

Your server IP address is: 10.x.x.x
server started { host: 'localhost', port: 5683 }
42.x.x.x - - [Wed, 01 Apr 2015 02:46:56 GMT] "POST /v1/provisioning/53xxxxxxxxxxxxxxxxx HTTP/1.1" 400 109 "-" "-"

What am I doing wrong? Any help would be highly appreciated.

I’m not sure if spark keys doctor works for the spark-server.

Try placing the core public key into the core_keys directory and restart the server.
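
Something along these lines, assuming you already have the core public key as CORE_ID.pub.pem (the name and paths are placeholders for your setup):

$ cp CORE_ID.pub.pem spark-server/core_keys/
$ cd spark-server
$ node main.js    # restart the server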

If you are using the tutorial I wrote, try to follow it closely, step by step.

Also, be sure that Spark-cli is pointing to the local :cloud: profile. Check using spark config identify

Try not to cross-post. I usually reply to questions here promptly but missed that thread. You can use @kennethlimcp to ping me in future posts.

@kennethlimcp, thanks for the reply.

Try placing the core public key into the core_keys directory and restart the server.

Tried. Didn’t work.

If you are using the tutorial I wrote, try to follow it closely, step by step.

Yes. I’ve followed it very closely. Please see the steps that I’ve posted in this thread.

Also, be sure that Spark-cli is pointing to the local cloud profile. Check using spark config identify

Yes. The spark-cli is pointing to the local cloud profile.

Did you swap the keys on your core to the Google cloud compute key? If so, it should at least reach breathing cyan, or rapidly flash red/yellow due to bad core keys if the key happened to be wrong.

Try looking at the log to see if the core managed to connect.

@kennethlimcp, yes, I did swap the keys. And yes, I did reach the stage where the core lit up cyan and blinked fast, then went to blinking red. When this happens, I see the following in the log on my Google cloud:

42.x.x.x - - [Wed, 01 Apr 2015 02:46:56 GMT] "POST /v1/provisioning/53xxxxxxxxxxxxxxxxx HTTP/1.1" 400 109 "-" "-"

Looks good. Can you perform a spark keys save CORE_ID in DFU mode and upload the CORE_ID.pub.pem file to the server and restart it again?
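
Roughly like this (CORE_ID and the server-side path are placeholders, adjust to your setup):

$ spark keys save CORE_ID    # with the core in DFU mode
$ scp CORE_ID.pub.pem user@mydomainname.com:~/spark-server/core_keys/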

That doesn’t look like output from the core. It looks more like you sending an API request…

If the core connected correctly, it should look like:

Connection from: 192.168.1.159, connId: 1
on ready { coreID: '48ff6a065067555008342387',
  ip: '192.168.1.159',
  product_id: 65535,
  firmware_version: 65535,
  cache_key: undefined }
Core online!

Make sure you did this: spark keys server your_cloud.der DOMAIN_NAME/IP

@kennethlimcp, this is what I did:

On my Google cloud,

spark config google apiUrl http://mygoogledomain.com:8080
spark config google
spark config identify

This confirms that spark-cli is pointing to mygoogledomain.
Then on my Google cloud,

node main.js

This started the spark server on my Google cloud and generated the default pem files. I copied them to my local PC, which was also configured with:

spark config google apiUrl http://mygoogledomain.com:8080
spark config google
spark config identify

Then on my local PC, I started the local spark server using:

node main.js

At this point, the local server (on my PC) has the following in place:
spark-server/: default pem keys (copied from the Google cloud server)
spark-server/core_keys/: default pem keys (copied from the Google cloud server)

On my local PC, I navigated to spark-server/core_keys, put my core in DFU mode, and ran the following commands:

# generate a new keypair for the core
openssl genrsa -out core.pem 1024
openssl rsa -in core.pem -pubout -out core_public.pem
openssl rsa -in core.pem -outform DER -out core_private.der
# flash the private key to the core (in DFU mode)
dfu-util -d 1d50:607f -a 1 -s 0x00002000 -v -D core_private.der
sudo spark keys doctor sparkID

I then copied all the key files from spark-server/core_keys on my local PC to spark-server/core_keys on the Google compute instance and restarted the server.
Once I did that and ran spark setup, the core would try to connect, blink cyan for a while, then turn red. I see the following error on the Google compute server:

42.x.x.x - - [Wed, 01 Apr 2015 02:46:56 GMT] "POST /v1/provisioning/53xxxxxxxxxxxxxxxxx HTTP/1.1" 400 109 "-" "-"

I’m confused. Do you want to use your local spark server or the Google compute cloud?

Let me repeat one more time that spark keys doctor will not work. This is what I mean about following closely… I mentioned this above:

1.) Follow the instructions and install spark-server in the Google compute cloud
2.) Copy the default_key.pub.pem from the Google compute cloud and save it on your laptop
3.) Using spark keys server default_key.pub.pem domain_name, flash this to the core in DFU mode
4.) Using spark keys save core_id, extract the core public key and save it to the core_keys directory in the Google compute cloud
5.) Fire up the server and connect the core to the internet.
6.) Done (see the command sketch right after this list)
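
Concretely, steps 2 to 4 on your laptop look roughly like this (the scp user/paths and CORE_ID are placeholders, adjust to wherever your spark-server lives):

# step 2: fetch the server public key from the Google compute cloud
$ scp user@mydomainname.com:~/spark-server/default_key.pub.pem .
# step 3: with the core in DFU mode, point it at your server
$ spark keys server default_key.pub.pem mydomainname.com
# step 4: still in DFU mode, extract the core public key and push it up
$ spark keys save CORE_ID
$ scp CORE_ID.pub.pem user@mydomainname.com:~/spark-server/core_keys/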

@kennethlimcp, sorry for the confusion. I was lost too. My intention is to use the Google compute cloud. Since I cannot run dfu-util from the Google cloud, I used my local PC for that. I guess that portion caused the confusion, and hence I posted my concerns here to see what I am doing wrong.

At step 3,

3.) Using spark keys server default_key.pub.pem domain_name, flash this to the core in DFU mode

I get the following error:

Please specify a server key in DER format.

You need DFU-util and Spark-cli on the machine that the core is connected to via USB. There is no way you can connect a core using a physical USB cable to Google cloud compute…

Sorry, please use the .der file instead.

@kennethlimcp,

You need DFU-util and Spark-cli on the machine that the core is connected to via USB. There is no way you can connect a core using a physical USB cable to Google cloud compute.

Yes, I know that. I have dfu-util on my laptop, so that’s not a problem.

Sorry, please use the .der file instead.

There is no .der file. Upon starting the server on the Google compute cloud, I only get the default_key.pem and default_key.pub.pem files. Should I generate one on my Google compute cloud?
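
If it comes to that, I suppose I could convert the public key to DER myself with openssl, something like this (just a guess on my part, assuming it’s a plain RSA public key):

openssl rsa -in default_key.pub.pem -pubin -outform DER -out default_key.pub.der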

When I perform the following on my local machine, I can connect to the Spark Cloud server perfectly. However, when I try to replicate the same for my Google compute instance, it fails. The following was done on my local PC for testing purposes; I couldn’t get the same steps to work when I replaced the Spark cloud certificate with my Google compute generated certificate.

# generate a new keypair for the core
openssl genrsa -out core.pem 1024
openssl rsa -in core.pem -pubout -out core_public.pem
openssl rsa -in core.pem -outform DER -out core_private.der
# flash the private key to the core (in DFU mode)
dfu-util -d 1d50:607f -a 1 -s 0x00002000 -v -D core_private.der

# grab the Spark cloud public key and flash it
wget https://s3.amazonaws.com/spark-website/cloud_public.der
dfu-util -d 1d50:607f -a 1 -s 0x00001000 -v -D cloud_public.der

# register the new core public key with the cloud
spark login
spark keys send your_core_id core_public.pem

My original instructions say spark keys server default_key.pub.pem IP_ADDRESS, so I think the .pub.pem file should be there.

Can you copy it to your machine and run spark keys server default_key.pub.pem mydomainname.com?


@kennethlimcp, done! The cyan now blinks rapidly and then switches to red on the core. It repeats this over and over.

@kennethlimcp, should I open up any ports on my Google compute cloud? I’ve already opened ports 80 and 8080 for TCP. I have a static IP address like 133.xxx.xxx.xxx, and I’ve set up DNS records to point mydomain.com to it. When I start the spark server, it shows a different IP address, 10.xxx.xxx.xxx.

With the Spark CLI, I used spark keys server default_key.pub.pem mydomain.com. Is this the correct way?

My spark config apiUrl is set to http://mydomain.com:8080 too.

You will need port 5683 for the CoAP protocol. So I presume your Spark-cli is able to log in to your :cloud:?
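
If you manage the firewall with the gcloud CLI, a rule along these lines should open it (the rule name is just an example, adjust to your network setup):

$ gcloud compute firewall-rules create allow-coap --allow udp:5683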

Also, did the core manage to reach breathing cyan?

@kennethlimcp, I am just now opening UDP port 5683 on my Google compute instance. Before that, I managed to handshake from my local PC to my spark Google cloud server by creating a new user. However, breathing cyan is not yet achieved: I see fast blinking cyan (the state before breathing cyan), then a red light. Let me try with the port open and see if that helps. I’m nearly ready to give up on this :cry:. Thanks for helping out. Will update shortly after testing.

@kennethlimcp, still the same problem even after opening UDP port 5683 on my Google compute server. Tell me this: should I or should I not create the following profile on my Google compute cloud:

spark config google apiUrl http://mydomain.com:8080

When I create this profile on my Google compute cloud and try spark config identify, it shows my domain address as the API, but it also shows Access token: null. On my local PC, it just shows the API address, with no issue about the access token.

@sriram155,

You might want to start from scratch.

Let me emphasize again that Spark-cli and DFU-util are not required on the server.

Here’s the setup:

Server --> Spark-server

Laptop --> Spark-cli, DFU-util
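
In other words, the only thing that ever runs on the server is:

$ cd spark-server
$ node main.js

Everything involving Spark-cli or DFU-util happens on the laptop.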

@kennethlimcp, this is my 7th time starting from scratch, without success :smile:. Anyway, thanks for your help. I will try one last time, and if it doesn’t work, I will set my Spark aside and move on.

Any particular reason you need your own :cloud:? Try setting it up on your own laptop instead. It’s much easier to debug…