So after my great adventures of getting node and npm installed and running on my Debian Wheezy armhf build… and then the CLI… and then spark-server… I’m now up to spark-server setup and have hit a couple of snags.
node main.js fires up fine. I put a core in DFU mode, open another terminal window, fire up the CLI, and per the docs enter
spark keys server default_key.pub.pem 192.168.1.76
The return I get however, is
Please specify a server key in DER format.
On the one hand this makes sense, since I wouldn’t expect the CLI to know what default_key means, but I’m just following the docs. Then I thought maybe I needed to point my CLI config file at the spark-server first (counter to what the docs say), but that brought up the second problem.
~/.spark/ is an empty folder, thus there is no spark.config.json to edit.
The openssl command from @JackANSI is exactly what is recommended in the notes here:
The spark keys server command is supposed to do this conversion for you, but you can do it yourself. It may be failing because of a still-hidden installation problem, so converting the key manually is worth a try.
I’ve been having varying degrees of success with the different steps. The key management steps have seemed to work fine thus far, but the one thing I’m having a problem with is setting up an account. After setting up my profile and changing to that profile and running spark setup, this is the series of error messages I get…
And yes, spark-server is currently running
ubuntu@udoobuntu: ~/spark-server/js $ spark setup
========================================
Setup your account
Could I please have an email address? what@ever.com
and a password? ********
Trying to login...
login error: { [Error: connect ECONNREFUSED]
code: 'ECONNREFUSED',
errno: 'ECONNREFUSED',
syscall: 'connect' }
Login failed, Lets create a new account!
confirm password ********
creating user: what@ever.com
createUser got undefined
login error: { [Error: connect ECONNREFUSED]
code: 'ECONNREFUSED',
errno: 'ECONNREFUSED',
syscall: 'connect' }
Error setting up your core: Login Failed: Error: connect ECONNREFUSED
Well, as is usually the case, a very simple solution: it turns out that including port 8080 in my apiUrl is required to complete account setup. The tutorial and the docs differ on this point. Should have caught it sooner.
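For anyone hitting the same ECONNREFUSED with an empty ~/.spark/ folder, this is the shape of the profile file that fixed it for me; a sketch only — the IP matches the one earlier in this thread, and 8080 is spark-server’s default listen port, so adjust both to your setup:

```shell
# Create the CLI profile file by hand if ~/.spark/ is empty.
# The key point is the explicit :8080 port in apiUrl.
mkdir -p ~/.spark
cat > ~/.spark/spark.config.json <<'EOF'
{
  "apiUrl": "http://192.168.1.76:8080"
}
EOF
```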
@techbutler looks like your server is all set; what’s your plan on the dev side of things using your local cloud? It looks like Spark Dev and the web IDE both don’t talk to the local cloud; is there a decent IDE that DOES work with the local cloud (other than coding in the CLI environment)?
Thanks @techbutler; yes, I did look over the NetBeans solution. It does answer the “compile locally with an IDE” question, but the matter of pushing code over-the-air appears elusive. Is it not possible to point NetBeans at a local cloud server so that new code is delivered over-the-air using the local cloud server? I know NetBeans can be set up to do “spark flash your_core binary.bin” with a GUI button, but that means one MUST install the CLI in addition to the guide you referenced; could there be a native plug-in to deliver the binary to a local cloud URL straight out of NetBeans?
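For what it’s worth, the “deliver the binary to a local cloud URL” step may not need the CLI at all: it can boil down to one HTTP call that any IDE could wire to a button. A sketch only, assuming the local server mirrors the hosted cloud’s PUT /v1/devices/:id firmware endpoint; the function name, device ID, token, and filenames below are all placeholders:

```shell
# flash_ota: push a pre-compiled binary to a core through the cloud REST API.
# Assumes the local spark-server exposes the same "PUT /v1/devices/:id"
# endpoint as the hosted cloud (unverified against every server version).
flash_ota() {
  # usage: flash_ota <device-id> <binary>; needs API_URL and ACCESS_TOKEN set
  curl -s -X PUT "$API_URL/v1/devices/$1?access_token=$ACCESS_TOKEN" \
       -F "file=@$2"
}

# e.g.: API_URL=http://192.168.1.76:8080 ACCESS_TOKEN=yourtoken \
#       flash_ota 0123456789abcdef binary.bin
```

If that works against spark-server, a NetBeans run configuration could invoke it (or the raw curl line) directly, with no CLI install required.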