Local cloud broke after Cassandra installation

Hi Team,

I have had my local cloud working on Ubuntu 14.1 for the last couple of months, but it suddenly broke after I installed Cassandra on the server. Cassandra did install Java as a prerequisite.

I stopped the Cassandra service, but my cores are still not connecting to the cloud.

The local cloud says it is listening on 8080, but somehow the cores are not connecting. I have 4 cores, and all of them are blinking cyan rather than breathing cyan (stable).

Please let me know if you have any suggestions for this.


It turned out to be Node 0.12: spark-server does not support it and has a known issue with it. But even though I reverted Node.js to 0.10.30, my cores are still not connecting to the local cloud (my DigitalOcean server).

They are still flashing cyan. I have tried everything I can think of, but no success!

Need help!


Hi @satendra4u,

Just as a sanity check, can you remove the node_modules folders from the installation of the server / protocol module, and re-run the npm install command? I’m curious if the building of the dependencies in Node 0.12 is the issue. If you uninstalled / reinstalled node, you might also need to remove / re-install the dependencies.
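A minimal sketch of that clean reinstall, assuming spark-server was cloned to ~/spark-server (adjust the path for your setup):

```shell
# Path is an assumption -- point this at wherever spark-server is installed.
SERVER_DIR="$HOME/spark-server"
cd "$SERVER_DIR" || exit 1   # stop here if the path is wrong
rm -rf node_modules          # drop dependencies built under Node 0.12
npm install                  # rebuild everything against the current Node
node --version               # sanity check: should print v0.10.x
```

If the protocol module lives in its own folder, repeat the same removal and reinstall there as well.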


Thanks Dave. Let me do it right now.

Hi Dave,

I followed those steps, but no success; still the same issue. The other clue is that the particle CLI (local machine to cloud) is working, but the core handshake is not happening.

Is there any difference in spark-server behavior between Node 0.10.30 and 0.10.36? I do not think so.


Make sure you update the core keys using the new server's public key. Be sure to use the latest particle-cli as well.

Command reference here: https://github.com/spark/particle-cli#particle-keys-server-ip_address
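Roughly, the flow looks like this (the key filename and IP address are placeholders; default_key.pub.pem is the public key spark-server generates on first run, if I recall correctly):

```shell
# Put the core into DFU mode (blinking yellow) first, then point it at
# your local cloud. Replace the IP with your droplet's public address.
particle keys server default_key.pub.pem 203.0.113.10
```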

Hi Ken,

I tried, but same thing. The core is still not connecting to the local cloud.

I installed spark-server twice with Node.js 0.10.30, on the DigitalOcean server as well as on my Mac.


Look at the logs when you run node main.js. What is the output?

Hi Ken,

This is what I get when I run node main.js:

Moreover, when I start it, it does not show any log output; but when I run netstat I do see that spark-server is listening on port 8080. It just does not handshake with any external request.
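One thing worth checking in that netstat output: 8080 is only spark-server's HTTP API port; the cores themselves handshake over CoAP, which spark-server serves on TCP 5683 by default (assuming a stock config). A quick check for both listeners:

```shell
# 8080 = HTTP API, 5683 = CoAP port the cores actually connect to
# (5683 is the spark-server default, assuming an unmodified config).
netstat -tln | grep -E ':(8080|5683)' || echo "neither port is listening"
```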

Please let me know.


It didn't look like the server was running when I tried to access it.

The port might be blocked somehow.

It appears to me that the port is blocked from outside to inside, because if I run

particle list, it does create a log:

Please see below.

So that's not a local server issue; you will have to resolve your server settings instead.
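One way to check that from the droplet itself (a sketch, assuming Ubuntu with iptables/ufw; the port numbers are spark-server's defaults):

```shell
# Look for DROP/REJECT rules that could be filtering 5683 or 8080:
sudo iptables -L INPUT -n
# If Ubuntu's ufw front end is enabled, check and open the ports:
sudo ufw status
# Then, from your local machine (not the droplet), test reachability:
#   nc -vz <droplet-ip> 5683
```

If the droplet is behind a cloud firewall as well, the ports need to be opened there too, not just on the host.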