WebSocket handshake not working

Hi all!

I am trying to connect my Spark Core to a local Node.js WebSocket server.
Both the Spark Core and the server are connected to the same router, but when I use the Spark Core sketch example for WebSocket and call my server with client.connect(server, 8001); I always get "Connection Failed" in the serial monitor.

Is there a way to connect the Spark Core WebSocket example to my local WebSocket server, or do I need to set up a local Spark Cloud in order to do that?

Can you provide a link to the library and your code?

My code : http://pastebin.com/Cv6Jk4md

The library : https://github.com/ekbduffy/spark_websockets

@kennethlimcp any suggestions ?

I guess you changed this line: char server[] = "any server";?

@kennethlimcp Yes, before posting the code here, I changed it!

Have you tested the WebSocket server that you set up using other means, to ensure that you are able to connect and it works?

Yeah, I tested it locally with a WebSocket client in JavaScript and it’s working.
@kennethlimcp Are you sure the Spark Core can access my server locally from my network?

Yes, definitely. We use stuff like TCP and it works fine on a local server.

Can you paste the actual code here using the format:


@bko should be able to help once you post the code :slight_smile:

Hi @mitko29

As @kennethlimcp said, this line could be a problem. Was it previously something like:

char server[] = "";

Because that cannot work on the Spark Core--the DNS client will not resolve a dotted IP address as a host name. If that is the problem, you should use IPAddress(10,0,0,8) instead. There is another option using a remote DNS service, but this is the easiest way.


@bko Correct me if I am wrong, but I have to use it like this:

IPAddress server(8,8,8,8); But how do I pass the server as a parameter to the Spark Core WebSocket library? If you check inside the connect method, it takes const char hostname[].

Sorry, I don’t know this library--it looks like you can use a hostname or a byte-array address. IPAddress is overloaded to supply a byte array as needed, for compatibility with this older way of writing IP addresses.

 // from the library
 void connect(const char hostname[], int port = 80, const char protocol[] = NULL, const char path[] = "/");
 void connect(const byte host[], int port = 80, const char protocol[] = NULL, const char path[] = "/");

You can do this:

  uint8_t server[] = {8,8,8,8};
  client.connect(server, port);

or you can do this:

  client.connect(IPAddress(8,8,8,8), port);

Well, actually, inside the library’s .cpp file there is no implementation for

void connect(const byte host[], int port = 80, const char protocol[] = NULL, const char path[] = "/");

Also, as far as I can see, the reconnect method in the .cpp has a condition to determine whether it was given _ip or _hostname.

@bko Any other suggestions ?

I guess you need help from @ekbduffy to fix his library then.

If it were me, I would fork it and fix it for my needs. You can always send him a pull request.

Recently another user pointed out a DNS hack service called xip.io that reflects host lookups back to your local subnet. You could try that with a hostname like “” which resolves to

I am not very sure what I need to change, because I am not familiar with the library and the way it works.
@bko I found out that the Spark Core IDE is using an older version of the WebSocket library than the one on GitHub! -> Fixed that
Now I have found that even the test example is not working.

Ok, so I managed to make it work and connect to my local WebSocket server, but I ran into several problems:
1. If I don’t wait for any serial input to continue my program, the Spark almost never connects to the server.
2. If it’s connected and starts sending messages every 3 seconds, after the 4th message the Spark unit starts to blink the red SOS signal.
3. If I pause the messages for an interval longer than 10 seconds, the Spark Core unit restarts.

@bko Any light on that ?

Hi @mitko29

Several folks are reporting that waiting in setup() for a serial port character (which is required anyway when using a PC to monitor the USB serial messages) helps them. I don’t know why this is.

Your other two problems are classic symptoms of not managing the TCP connection properly, either in your code or in the library. On the Spark Core there are two sets of buffers for received data: one in the TCP client and another in the TI CC3000 WiFi chip. Not reading all the data out of the CC3000 causes the kind of problems you are reporting, since when your core goes to use its TCP connection to the cloud, all the resources in the WiFi chip are tied up.

I would check that the library or your code reads all the data out of the connection. Merely calling flush() is not sufficient.
