Tutorial: Local Cloud 1st Time instructions [01 Oct 15]

I installed it but have not installed Visual C++ 2010 Express, as the school internet doesn't allow me =.=

Hi,
I'm trying to set up a local cloud on Google Compute Engine. I'm having problems with dfu-util: I'm unable to run dfu-util from the Google cloud machine to create and transfer the certificates to the Spark. Has anyone tried setting up a local cloud server somewhere other than localhost and/or on a Raspberry Pi? If so, was it successful? Please share some instructions on how to set up the local cloud service on cloud servers like Amazon EC2, Google Compute Engine, or Microsoft Azure.

You don't need dfu-util to run on the cloud infrastructure. Install it on your own machine instead.
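If it helps, a minimal sketch of getting dfu-util onto your own machine (assuming Ubuntu/Debian or OS X with Homebrew):

$ sudo apt-get install dfu-util   # Ubuntu/Debian
$ brew install dfu-util           # OS X
$ dfu-util -l                     # with the core in DFU mode (blinking yellow), it should show up here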

@Jeffery why? Are you in China?

https://github-windows.s3.amazonaws.com/GitHubSetup.exe

ermmm no?

How do I install it? Where do I cd to?

So how do I generate the keys then? Do I generate them locally, pointing at the Google cloud IP, and then transfer the files via FTP to spark-core/spark-server/core_keys on my Google cloud server?
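One approach that should work (a sketch on my part; the host name is a placeholder, and I'm using scp rather than FTP): keep dfu-util and the CLI on your PC, read the core's public key off the core locally, then copy it into core_keys on the cloud box:

$ spark keys save 53xxxxxxxxxxxxxxxxx   # core in DFU mode; should produce 53xxxxxxxxxxxxxxxxx.pub.pem
$ scp 53xxxxxxxxxxxxxxxxx.pub.pem you@your-cloud-host:~/spark-core/spark-server/core_keys/

As far as I can tell, the server looks keys up by file name, so the file in core_keys needs to be named after the core ID.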

I nearly got it working, but I'm stuck at this error:

Your server IP address is: 10.x.x.x
server started { host: 'localhost', port: 5683 }
42.x.x.x - - [Wed, 01 Apr 2015 02:46:56 GMT] "POST /v1/provisioning/53xxxxxxxxxxxxxxxxx HTTP/1.1" 400 109 "-" "-"

Here is what I’ve tried on my Google cloud server so far:

$ mkdir spark-core
$ cd spark-core
$ wget https://launchpad.net/gcc-arm-embedded/4.8/4.8-2014-q2-update/+download/gcc-arm-none-eabi-4_8-2014q2-20140609-linux.tar.bz2
$ tar xvjpf gcc-arm-none-eabi-4_8-2014q2-20140609-linux.tar.bz2
$ export PATH=$PATH:$HOME/src/spark-core/gcc-arm-none-eabi-4_8-2014q2/bin
$ git clone https://github.com/spark/core-firmware.git
$ git clone https://github.com/spark/core-common-lib.git
$ git clone https://github.com/spark/core-communication-lib.git
$ git clone https://github.com/spark/spark-server.git
$ sudo npm install -g spark-cli   # global install needs root or sudo
$ cd spark-server
$ npm install
$ node main.js
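One thing worth checking on Google Compute Engine (an assumption on my part, since the steps above don't cover it): the instance firewall has to allow the CoAP port 5683 and the API port 8080, e.g.:

$ gcloud compute firewall-rules create spark-server --allow tcp:5683,tcp:8080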

On my local PC (with dfu-util installed), I did the following:

$ git clone https://github.com/spark/spark-server.git
$ cd spark-server
$ node main.js

While my local server (on my PC) was running, I did the following in a new terminal window:

$ spark config googleCloud apiUrl http://mydomainname.com:8080
$ spark config googleCloud
$ cd spark-server
$ spark keys server default_key.pub.pem mydomainname.com
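To double-check where the CLI is pointed, the profile settings live under ~/.spark as far as I know:

$ cat ~/.spark/googleCloud.config.json   # should show the apiUrl set above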

I then did this on my local machine (the PC with dfu-util):

$ cd core_keys
$ sudo spark keys doctor 53fxxxxxxxxxxxxxxxxxxx

I end up with the following error message when I try to upload the keys to the Spark (the Spark is in DFU mode). The full DFU output was too long, so I'm pasting just the error portion of it:

File downloaded successfully
Transitioning to dfuMANIFEST state
Error during download get_status
Saved!
attempting to add a new public key for core 53xxxxxxxxxxxxxxxx
*********************************
      Please login - it appears your access_token may have expired
*********************************
submitPublicKey got error:  invalid_grant
Make sure your core is in DFU mode (blinking yellow), and that your computer is online.
Error - invalid_grant
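(My guess: the invalid_grant means my CLI access token was issued by the Particle cloud rather than by my own server. If so, creating a user on the local cloud first, with the googleCloud profile selected, might be the missing step:)

$ spark config googleCloud
$ spark setup   # prompts for an email/password and creates an account, and a token, on the local cloud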

On my Google cloud server, I got the following error:

Your server IP address is: 10.x.x.x
server started { host: 'localhost', port: 5683 }
42.x.x.x - - [Wed, 01 Apr 2015 02:46:56 GMT] "POST /v1/provisioning/53xxxxxxxxxxxxxxxxx HTTP/1.1" 400 109 "-" "-"

What am I doing wrong? Any help would be highly appreciated.

After I ran this command:

$ particle keys server default_key.pub.pem 192.168.1.23

I get this from the spark-server output:

Connection from: ::ffff:192.168.1.29, connId: 1
1: Core disconnected: plaintext was the wrong size: 214 { coreID: 'unknown', cache_key: '_0' }
Session ended for _0

Any ideas why this is happening? Thanks
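(In case it helps anyone searching later: I've seen this error when the server has no matching public key for the core in core_keys, so decryption of the handshake fails. A sketch of re-syncing it, reusing the redacted core ID from above:)

$ cd spark-server/core_keys
$ spark keys save 53xxxxxxxxxxxxxxxxx   # core in DFU mode; should leave 53xxxxxxxxxxxxxxxxx.pub.pem here
$ cd .. && node main.js                 # restart the server so it picks the key up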

UPDATE: I found this link via Google. Is this related?

Yup…

Two helpful bits of information I learned after installing spark-server (local cloud) on Mac OS X. First, the version of Node.js needs to be 0.10.x (as of July 2015). When I first installed Node, I grabbed the latest, which was 0.12.7 and is currently not supported.

Also, if you get crypto errors, make sure to delete the node_modules folder and re-run npm install. The server started just fine after that.
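For anyone else hitting this, a sketch of pinning Node with nvm (assuming nvm is already installed):

$ nvm install 0.10
$ nvm use 0.10
$ rm -rf node_modules && npm install
$ node main.js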

Hope that helps.

I am trying to revert back to the Particle cloud, but I am facing a strange issue.
As mentioned in the OP, I downloaded the cloud public key file and ran

$ spark keys server your_local_cloud_public_key.der IP-ADDRESS

Now whenever I try to particle setup and log in my device: after flashing cyan really quickly, the color turns yellow and it never gets to the point where it 'breathes' cyan.

I also tried re-installing particle-cli after removing it completely, and also factory resetting my Core. Any ideas?
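(For reference, the sequence I understand to be the usual way back to the Particle cloud; cloud_public.der is the Particle cloud public key from the docs, and particle should be the default CLI profile, if I recall right:)

$ particle config particle               # switch back to the default profile
$ particle keys server cloud_public.der  # core in DFU mode; no IP override this time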

EDIT: For anyone facing the same problem as me, have a look here. It's not local cloud related.

I am doing everything the way you explained, but I get the following response in the server terminal:

Your server IP address is: 192.168.1.38
server started { host: 'localhost', port: 5683 }
192.168.1.38 - - [Thu, 01 Oct 2015 22:05:07 GMT] "POST /v1/devices HTTP/1.1" 400 109 "-" "-"
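(A guess, based on the earlier posts: a 400 on POST /v1/devices usually means the request carried no access token this server recognizes. If you haven't created a user against the local profile yet, something like the following may help; the profile name is just an example:)

$ spark config local apiUrl http://192.168.1.38:8080
$ spark config local
$ spark setup   # create an account on the local cloud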

Hi @kennethlimcp… Thanks for the amazing tutorial… I did manage to get my local cloud up and running, and so far everything seems to be working fine… The only issue I'm facing is that when, from the Particle CLI, I log out and say yes to revoking the current authentication token, I get an error on the server as below:

TypeError: Object function (options) {
this.options = options;
} has no method 'basicAuth'
at Object.AccessTokenViews.destroy (/local/spark-server/lib/AccessTokenViews.js:59:38)
at callbacks (/local/spark-server/node_modules/express/lib/router/index.js:164:37)
at param (/local/spark-server/node_modules/express/lib/router/index.js:138:11)
at param (/local/spark-server/node_modules/express/lib/router/index.js:135:11)
at pass (/local/spark-server/node_modules/express/lib/router/index.js:145:5)
at Router._dispatch (/local/spark-server/node_modules/express/lib/router/index.js:173:5)
at Object.router (/local/spark-server/node_modules/express/lib/router/index.js:33:10)
at next (/local/spark-server/node_modules/express/node_modules/connect/lib/proto.js:193:15)
at next (/local/spark-server/node_modules/express/node_modules/connect/lib/proto.js:195:9)
at Object.handle (/local/spark-server/node_modules/node-oauth2-server/lib/oauth2server.js:104:11)

I tried searching for answers but couldn't find any… Any ideas on what might be causing this?

Thanks in advance…

Hmmm, I recall the local cloud doesn't have the concept of multiple users, so logout and login aren't really applicable. But that might not be the actual case…

@kennethlimcp… One of my team members helped me out… The issue was in the way the destroy function was calling the basicAuth function within AccessTokenViews.js.

As soon as you change the line var credentials = AccessTokenViews.basicAuth(req) to var credentials = this.basicAuth(req) in AccessTokenViews.js and restart the server, it all works fine…
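For anyone who wants to apply the same fix quickly (assuming a Linux checkout with GNU sed; on OS X, use sed -i '' instead):

$ cd spark-server
$ sed -i "s/AccessTokenViews.basicAuth(req)/this.basicAuth(req)/" lib/AccessTokenViews.js
$ node main.js   # restart the server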

And so far it looks like the local cloud does support multiple users, but only further testing will confirm this… I'll keep you posted with the results of my multi-user testing…


Can you submit a PR to benefit the community at large? :slight_smile:

Hi @kennethlimcp… I might sound stupid as I'm new at this… What's a PR?

Basically, you submit the fix to the original GitHub repo at: https://github.com/spark/spark-server
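The usual flow, for reference (fork spark/spark-server on GitHub first; the user and branch names below are placeholders):

$ git clone https://github.com/<your-username>/spark-server.git
$ cd spark-server
$ git checkout -b fix-access-token-basicauth
(edit lib/AccessTokenViews.js as described above)
$ git commit -am "Call this.basicAuth(req) in AccessTokenViews.destroy"
$ git push origin fix-access-token-basicauth

Then open a pull request against spark/spark-server from the GitHub web UI.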

Maybe I will do it for this issue. :wink:

Thanks @kennethlimcp