Spark-Server Setup

So after my great adventure of getting Node and npm installed and running on my Debian Wheezy armhf build… and then the CLI… and then spark-server, I'm now up to spark-server setup and have hit a couple of snags.

node main.js fires up fine. Put a core in DFU mode. Open another terminal window… fire up the CLI and, per the docs, enter

spark keys server

The return I get however, is

Please specify a server key in DER format.

On the one hand this makes sense, since I wouldn't expect the CLI to know what default_key means, but I'm just following the docs. Then I thought maybe I needed to point my CLI config file at the spark-server first (counter to what the docs say), but this brought up the second problem.

~/.spark/ is an empty folder, thus there is no spark.config.json to edit.
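For anyone else who ends up creating the file by hand, here is a minimal sketch of what I'd expect ~/.spark/spark.config.json to look like — the IP address and port below are placeholders for your own server, not values from the docs:

```json
{
  "apiUrl": "http://192.168.1.10:8080"
}
```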

Thanks as always.

I’ve had that same error before but I’m at a loss remembering how I fixed it.

I know you can manually create the .der with (on Linux/Mac… not sure about Windows though):

openssl rsa -in <public_key.pem> -pubin -pubout -outform DER -out <public_key.der>
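As a self-contained demonstration of that conversion (the demo_key names are throwaway placeholders — substitute the actual key files from your spark-server directory):

```shell
# Placeholder filenames throughout -- use your real server key files.
# Generate a throwaway RSA key pair just to have something to convert.
openssl genrsa -out demo_key.pem 2048
# Extract the public key in PEM format.
openssl rsa -in demo_key.pem -pubout -out demo_key.pub.pem
# Convert the public PEM key to DER format.
openssl rsa -in demo_key.pub.pem -pubin -pubout -outform DER -out demo_key.pub.der
```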

Hi @techbutler

The openssl command from @JackANSI is exactly what is recommended in the notes here:

The spark keys server command is supposed to do this conversion for you. It could be failing due to a still-hidden installation problem, but you can convert the key yourself and try again.

The file is found in the main directory containing main.js.

Did you cd to that directory, or copy the .pem file to your working directory, before running the command?
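In other words, something like this (the IP address is a placeholder for your server's, and the path assumes a default spark-server checkout):

```shell
cd ~/spark-server/js                               # directory containing main.js and the key file
spark keys server default_key.pub.pem 192.168.1.10 # converts to DER and flashes it to the core
```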


You got me @kennethlimcp that was problem #1… I was not in /spark-server/js when I ran spark keys from the cli. Thanks.

Now as for issue #2… since there was no spark.config.json in ~/.spark/, I went ahead and created it with the contents given in the docs. Was that ok?

Sounds good! But the better option would be to use the spark config command to create a new profile for your own :cloud:.

It makes things more manageable somehow ;).

See the docs on how to create a new profile or simply reply here if you need more assistance.
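Roughly like this — the profile name and IP below are just examples, so substitute your own:

```shell
spark config mylocal apiUrl http://192.168.1.10:8080  # create a profile pointing at your server
spark config mylocal                                  # switch to that profile
spark config identify                                 # confirm the active profile and apiUrl
```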

Have fun!


I’ve been having varying degrees of success with the different steps. The key-management steps have seemed to work fine thus far, but the one thing I’m having a problem with is setting up an account. After setting up my profile, switching to it, and running spark setup, this is the series of error messages I get…

And yes, spark-server is currently running :slight_smile:

ubuntu@udoobuntu: ~/spark-server/js $ spark setup

Setup your account

Could I please have an email address?
and a password?  ********

Trying to login...
login error:  { [Error: connect ECONNREFUSED]
  errno: 'ECONNREFUSED',
  syscall: 'connect' }
Login failed, Lets create a new account!
confirm password  ********
creating user:
createUser got  undefined
login error:  { [Error: connect ECONNREFUSED]
  errno: 'ECONNREFUSED',
  syscall: 'connect' }
Error setting up your core: Login Failed: Error: connect ECONNREFUSED

How did you get Spark-cli to point to your own spark-server?

spark config mylocal

then did a spark config identify to confirm that’s where it was supposed to be and that it had the right IP

Is the apiUrl using http or https? Maybe you can watch the server console log, see what the error messages are, and let us know so we can help further. :wink:

Maybe restarting the server is a good move as well.

Well, as is usually the case, a very simple solution… it turns out that including port 8080 in my apiUrl is required to complete account setup. The tutorial and the docs differ on this point. Should have caught it sooner.
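That is, the apiUrl in ~/.spark/spark.config.json needs the port on it (the IP here is a placeholder for your server's):

```json
{
  "apiUrl": "http://192.168.1.10:8080"
}
```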

Thanks again.

@techbutler looks like your server is all set; what’s your plan on the dev side of things using your local cloud? It looks like Spark Dev and the web IDE both don’t talk to the local cloud; is there a decent IDE that DOES work with the local cloud (other than coding in the CLI environment)?

@bubba198 Have you checked out this thread…

Thanks @techbutler; yes, I did look over the NetBeans solution. It does answer the “compile locally with an IDE” question, but the matter of pushing code over-the-air appears elusive. Is it not possible to point NetBeans at a local cloud server so that new code is delivered over-the-air using the local cloud server? I know NetBeans can be set up to do “spark flash your_core binary.bin” with a GUI button, but that means one MUST install the CLI in addition to following the guide you referenced; could there be a native plug-in to deliver the binary to a local cloud URL straight out of NetBeans?
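Since the cloud API is just HTTP underneath, I imagine something like the following curl call could be wired to a NetBeans button without installing the CLI — the host, port, device ID, token, and binary name below are all placeholders, so this is only a sketch of the idea:

```shell
# Placeholders throughout: server address, device ID, access token, and binary.
curl -X PUT "http://192.168.1.10:8080/v1/devices/0123456789abcdef01234567" \
     -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
     -F "file=@firmware.bin" \
     -F "file_type=binary"
```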