I’m running into issues with the TCPClient. I have some code that monitors a switch; when the switch is pressed, I connect to a local server, create an HTTP request, read the response, and then close the connection.
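In outline, the code looks something like this (a minimal sketch; the IP address, port, and request path are placeholders):

```cpp
TCPClient client;
IPAddress server(192, 168, 1, 100);   // placeholder: the local server's IP

void sendRequest() {
    if (client.connect(server, 80)) {             // this is the call that seems to hang
        client.println("GET /status HTTP/1.0");   // placeholder request
        client.println("Connection: close");
        client.println();
        // ... read the full response here, then:
        client.stop();                            // close the connection
    }
}
```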
With the default setup this works for the first 10-20 times, but after a while the core hangs (probably in connect()).
I’ve seen others having this issue as well.
I’ve found a workaround, though, which is calling

```cpp
SYSTEM_MODE(MANUAL);
WiFi.on();
```

but not connecting to the Spark cloud. As long as I don’t connect to the Spark cloud, everything works fine. As soon as I connect, I get the hangs again.
For now I just don’t connect to the cloud, and I’ve put in a button combination that connects to the cloud on demand so I can still flash via the WebUI when I need to.
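Something like this (a sketch; maybeConnectCloud() is a hypothetical helper called from loop(), and Spark.connect() / Spark.connected() / Spark.process() are the MANUAL-mode cloud calls):

```cpp
// Hypothetical helper: only join the cloud when the button combination is seen.
void maybeConnectCloud(bool comboPressed) {
    if (comboPressed && !Spark.connected()) {
        Spark.connect();    // opt back in so OTA flashing via the WebUI works
    }
    if (Spark.connected()) {
        Spark.process();    // in MANUAL mode the cloud must be serviced by hand
    }
}
```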
I have the Spark Core joining the I2C bus as a slave. The button presses are actually monitored by an Arduino I2C master that sends them to the Spark Core, where I’m using Wire.onReceive(receiveEvent); to receive the events.
receiveEvent does not initiate the TCPClient calls, though; I just store the information and make the TCPClient calls from loop(), roughly like the sketch below.
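In other words, the structure is roughly this (a sketch; the slave address and payload handling are simplified placeholders):

```cpp
volatile bool buttonPressed = false;

// Wire.onReceive() handlers run in interrupt context, so keep this short:
// consume the bytes, set a flag, and let loop() do the network work.
void receiveEvent(int byteCount) {
    while (Wire.available()) {
        (void)Wire.read();      // payload handling simplified away
    }
    buttonPressed = true;
}

void setup() {
    Wire.begin(4);              // join the I2C bus as a slave (address 4 is a placeholder)
    Wire.onReceive(receiveEvent);
}

void loop() {
    if (buttonPressed) {
        buttonPressed = false;
        sendRequest();          // the TCPClient sequence sketched earlier
    }
}
```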
You need to make sure you are handling all of the data returned by the web server, by waiting for data and then reading it until it is all gone, as in the sketch below. You cannot depend on client.flush() to handle this for you, since that only clears the client-side buffer while the TI part is still buffering packets for you.
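Something along these lines (a sketch, assuming the client object from the post above; the 5-second timeout is an arbitrary choice):

```cpp
// Drain the whole HTTP response before closing, with a timeout so a
// stalled server can't hang loop() forever.
void readResponse() {
    unsigned long lastData = millis();
    while (client.connected() && (millis() - lastData < 5000)) {
        while (client.available()) {
            char c = client.read();   // consume every byte the TI part has buffered
            // ... process c if you care about the body ...
            lastData = millis();
        }
    }
    client.stop();   // close only after the response is fully consumed
}
```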
If I am right about what is going on, you will find that turning off the cloud extends the time before failure but does not completely eliminate it. By turning off the cloud, you are giving the TI part more TCP packet buffers to use.
TCPClient can be very reliable, but it requires some care and feeding.