RESOLVED: TCP packets retransmitting and failing after approximately 25 seconds

Hello,

I am using the Spark Core to communicate over a TCP socket on a local Wi-Fi network. I’m sending voltage values from 3 analog pins to my TCP server 5 times per second; the values are placed in a byte array and then written to the TCP socket. I included the spark_disable_cloud.h header and had my setup working fine until yesterday.

Yesterday I was testing my configuration and noticed that the data transmitted from the Spark Core stopped arriving after about 25 seconds of successful transmission. I used Wireshark to view the TCP packets sent from the Spark Core to my computer and saw normal packet traffic until roughly the 25-second mark. At this point I see the following:

It seems like the Spark Core is retransmitting because it never registers an ACK, even though the ACK for each TCP packet is visible in Wireshark with a correct, matching sequence number. As seen in the image, the packet retransmission continues for a while and stops after the TCP server has sent about 50 duplicate ACKs; all packet activity stops at that point.

It’s worth noting that every time I tried to send data, the TCP packets failed at roughly the same point in time. I’m sending 5 packets a second, and transmission fails after around 125 packets have been sent successfully.

Has anyone experienced anything similar with the Spark Core? This seems to be a pretty serious issue that prevents my entire system from functioning correctly. I was looking at the Cyan flash of death thread and noticed some similarities, even though I am using a local setup. Could this be a driver issue with the CC3000?

It doesn’t make much sense that my connection was working fine before.

Hi @tsteltzer,

This looks like the familiar ACK storm we saw after an ARP failure on the CC3000. Can you try applying the CC3000 patch below and see if that clears up the issue for you? It’s not yet the official version, but I think it will help.

Download this:
https://github.com/spark/cc3000-patch-programmer/blob/nobuttons/build/cc3000-patch-programmer.bin?raw=true

Run this:
dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D cc3000-patch-programmer.bin

You’ll have to do a factory reset, or re-flash your code over DFU when done.

Thanks,
David

Thanks for the suggestion @Dave!

I tried this but unfortunately the behavior was exactly the same. :frowning:

You patched with the latest patch, the special version I posted?

edit: it looks like the URL was hidden; that version is the special one

No, I used the version found on Spark’s github page. I didn’t notice the special version.

Will try again!


@Dave Sadly, still no luck…

Still seeing the same behavior as before.

Is there any info I could give you that could help with figuring this out?

Hmm…

As far as I know, the latest CC3000 patch should do a better job of managing the internal buffers while the core is sending data on a very busy network. However, in very high ARP-traffic situations, the CC3000 can still lock up. I’m curious whether smaller packets, or sending slightly less frequently, would change the failure timing, i.e. maybe 4 times a second fails after 60 seconds, maybe 3 times a second never fails, etc. Any chance you have a minimal failure case in code we can try?

Thanks!
David

@Dave Good suggestion. I tried sending 2 packets a second instead of 5, and the results were pretty interesting. The same packet-retransmission failure occurred after exactly the same amount of time from when the first packet was sent. So it still failed at 24 seconds, but fewer than half as many packets as before were sent successfully (only about 50 instead of 120 over the 24 seconds). Does this seem to point to anything?
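
A quick back-of-the-envelope check of those numbers (just a sketch using the approximate figures reported above; `secondsToFailure` is my own name): if the failure were count-bound, both rates would die at the same packet count, but since the elapsed time is what stays constant, the packet counts scale with the rate.

```javascript
// Rough sanity check: if the failure is time-bound (~24 s) rather than
// count-bound, packets sent before failure should scale with the send rate.
// Figures are approximate, taken from the observations in this thread.
const secondsToFailure = 24;

for (const rate of [5, 2]) {
  const packets = rate * secondsToFailure;
  console.log(rate + ' packets/s -> ~' + packets + ' packets before failure');
}
// 5 * 24 = 120 (observed ~120), 2 * 24 = 48 (observed ~50)
```

Both predictions line up with what Wireshark showed, which is what suggests a timer-driven cause rather than buffer exhaustion.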

I have attached my Spark Core firmware as well as my related Node.js TCP/UDP server app. Let me know if you have any questions regarding any of the code.

Thanks so much for all your help, really appreciate it!

Spark Core Firmware

//Magnetic Sensor Spark Core App

//Include header to disable cloud connectivity by default
#include "application.h"
#include "spark_disable_cloud.h"


TCPClient tcpclient;                    //Define TCP client object
UDP Udp;                                //Define UDP object
IPAddress serverIP;                     //Define Server IP address of IPAddress data type
int UDPport = 8000;						//UDP port number
int TCPport = 9000;						//TCP port number
bool receivedIP = false;                //Boolean to indicate whether the IP has been received yet
unsigned long interval = 500;           //Time interval between each transmit event (ms)
unsigned long previousMillis = 0;       //Time of previous transmit event
byte array[32];                         //Byte array to be transmitted to server over TCP connection
String coreID = String(1);              //Spark Core ID number
String null = " ";                      //Space character used as a field separator
String xlabel = "x";                    //X axis label
String ylabel = "y";                    //Y axis label
String zlabel = "z";                    //Z axis label


//Function to transmit Magnetic Field data from Spark Core pins
void transmit()
{ 
  //If device is connected to TCP server, read and transmit data
  if (tcpclient.connected()) 
  {
		  
	  //Assign current milliseconds passed since Spark Core turned on to currentMillis
	  unsigned long currentMillis = millis();
	  
	  //If time passed since last transmit is greater than or equal to interval, read and send data
	  if( currentMillis - previousMillis >= interval) {

		  //Assign current time to previous to reset interval counter
		  previousMillis = currentMillis;

		  //Read and scale pin voltage values
		  String xval = String((analogRead(A0)*3.3)/4095);
		  String yval = String((analogRead(A1)*3.3)/4095);
		  String zval = String((analogRead(A2)*3.3)/4095);
		  
			//Insert voltage readings into byte array
			for(int i=0;i<7;i++)
			  {   array[i+5] = xval[i]; }
				  
			for(int i=0;i<7;i++)
			  {   array[i+15] = yval[i]; }
				  
			for(int i=0;i<7;i++)
			  {   array[i+25] = zval[i]; }
			
			//Print status to Serial connection
			Serial.println("Sending TCP packet");
		    
		    //Transmit byte array
		    tcpclient.write(array,32);
	  }
    }
  else	//If not connected to client, delay then attempt to reconnect
  {
          
    	//Blink LED slowly to indicate trying to connect to TCP server (delay 4 seconds)
    	for(int i=0;i<2;i++)
    	{
    		digitalWrite(D7, HIGH);   // Turn ON the LED
    		delay(1000);               // Wait for 1000mS = 1 second
    		digitalWrite(D7, LOW);    // Turn OFF the LED
    		delay(1000);               // Wait for 1 second
    	}
    	
    	//Print status to Serial connection
    	Serial.println("Attempting to connect to: ");
    	Serial.println(serverIP);
    	
    	//Connect to TCP server at serverIP on port TCPport
        tcpclient.connect(serverIP, TCPport);
       }
	
}


//Function to listen for UDP packets with server IP broadcast by UDP server app
void UDPListen()
{
	//Print status to Serial connection
	Serial.println("UDP test executing");
	
	//Blink LED fast to indicate waiting for UDP packet with IP (delay 2 seconds)
	for(int i=0;i<10;i++)
	{
		digitalWrite(D7, HIGH);   // Turn ON the LED
		delay(100);               // Wait for 100 mS = .1 second
		digitalWrite(D7, LOW);    // Turn OFF the LED
		delay(100);               // Wait for .1 second
	}
	
	//Look for existence of a UDP packet in the Spark Core buffer
	Udp.parsePacket();
	
	//If the remotePort of the received UDP packet is the same as the application
	//UDP port, assign the IP of the remote connection to serverIP and set
	//the receivedIP boolean to true
	if(Udp.remotePort() == UDPport)
	{
		serverIP = Udp.remoteIP();
		receivedIP = true;
	}
	else    //Else, close UDP connection and reopen it to continue listening
	{
		Udp.stop();
		delay(100);
		Udp.begin(UDPport);
	}
	
}


//Initial setup (code run only once at startup)
void setup() 
{
	
	Serial.begin(9600);			//Begin Serial connection to send status of Spark Core
	Udp.begin(UDPport);			//Listen for incoming UDP packets from UDP server
	
    //Initialize Spark Core pins 
    pinMode(A0, INPUT);
    pinMode(A1, INPUT);
    pinMode(A2, INPUT);
    pinMode(D7, OUTPUT);
    
    //Assign byte array labels and spaces
    array[0] = null[0];
    array[1] = coreID[0];
    array[2] = null[0];
    array[3] = xlabel[0];
    array[4] = null[0];
    array[12] = null[0];
    array[13] = ylabel[0];
    array[14] = null[0];
    array[22] = null[0];
    array[23] = zlabel[0];
    array[24] = null[0];
}


//Code to infinitely loop after setup has run
void loop() 
{	
  //If device is connected to Wi-Fi, listen for UDP packets then connect to TCP server
  if (WiFi.status() == WIFI_ON) {
		  
	//If server IP address was received, transmit data, else listen for UDP packet
	if(receivedIP)
	{
		transmit();
	}
	else 
	{
		UDPListen();
	}
  }
}

Node.js TCP/ UDP application

//-----------------------------------------------------------------------------------------------------------------------

// UDP server that broadcasts UDP packet to network every 5 seconds

//Declare required variables
var dgram = require('dgram');					//Include datagram library
var testMessage = 'testmessage';				//Define testMessage variable (packet payload)
var broadcastAddress = '192.168.1.255';			//Define network broadcast IP address
var UDPPort = 8000;								//Define UDP port

//Create UDP socket
var udp_socket = dgram.createSocket('udp4');	

//Bind Socket on UDPport and set to broadcast packets
udp_socket.bind(UDPPort, '0.0.0.0', function() {
    udp_socket.setBroadcast(true);
});

//Send UDP packet every 5 seconds
var udpInterval = setInterval(function () {
    udp_socket.send(new Buffer(testMessage),
        0,
        testMessage.length,
        UDPPort,
        broadcastAddress
    );
}, 5000);

//-----------------------------------------------------------------------------------------------------------------------

//TCP server to connect and communicate with Spark Core TCP client

//Declare required variables
var net = require('net');   	//Include net library
var fs = require('fs');     	//Include filestream library
var os = require('os');      	//Include OS library
var tcpPORT = 9000;            	//Define TCP server port
var crlf = new Buffer(2);		//Create buffer to generate new line
crlf[0] = 0xD; //CR - Carriage return character
crlf[1] = 0xA; //LF - Line feed character

//-----------------------------------------------------------------------------------

//Get server IP address
var interfaces = os.networkInterfaces();
var serverip = [];
for (var k in interfaces) {
    for (var k2 in interfaces[k]) {
        var address = interfaces[k][k2];
        if (address.family == 'IPv4' && !address.internal) {
            serverip.push(address.address)
        }
    }
}

//-----------------------------------------------------------------------------------

//Get date and time for timestamp
function getDateTime() {

    var date = new Date();

    var hour = date.getHours();
    hour = (hour < 10 ? "0" : "") + hour;

    var min = date.getMinutes();
    min = (min < 10 ? "0" : "") + min;

    var sec = date.getSeconds();
    sec = (sec < 10 ? "0" : "") + sec;

    var mil = date.getMilliseconds();
    mil = (mil < 100 ? (mil < 10 ? "00" : "0") : "") + mil;   //Pad milliseconds to three digits

    var year = date.getFullYear();

    var month = date.getMonth() + 1;
    month = (month < 10 ? "0" : "") + month;

    var day = date.getDate();
    day = (day < 10 ? "0" : "") + day;

    return year + ":" + month + ":" + day + ":" + hour + ":" + min + ":" + sec + "." + mil;

}


//-----------------------------------------------------------------------------------

//Tell user what IP address and port the TCP server is listening on
console.log("TCP Server listening at " + serverip[0] + " : " + tcpPORT);

//Define server
var server = net.createServer(function (socket) {

    var remoteaddress = socket.remoteAddress,
        remoteport = socket.remotePort;
    
	//Tell user when a device has connected to TCP server
    console.log("CONNECTED: " + socket.remoteAddress + ":" + socket.remotePort);
    
    //Read and store data when data event is triggered
    socket.on('data', function (data) {
        data = data.toString();	
        fs.appendFile("magdatacore1.txt", data + " at: " + getDateTime() + crlf, function (err) {
            if (err) console.log("File write error: " + err);
        });
    });

}).listen(tcpPORT);
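
As an aside (this helper is not part of the posted server — `parseFrame` and its offsets are a sketch inferred from the firmware’s `setup()`/`transmit()` above), the 32-byte frame has fixed field positions, so the server could decode the individual fields instead of logging the raw string:

```javascript
// Hypothetical helper: decode the 32-byte frame built by the firmware.
// Offsets mirror the array[] indices in the sketch:
// [1] = core ID, [5..11] = x value, [15..21] = y value, [25..31] = z value.
function parseFrame(buf) {
  const field = (start) => buf.toString('ascii', start, start + 7).replace(/\0/g, '').trim();
  return {
    coreID: String.fromCharCode(buf[1]),
    x: field(5),
    y: field(15),
    z: field(25),
  };
}

// Example: a frame carrying core "1" with x=1.23, y=0.00, z=3.30
const frame = Buffer.alloc(32, ' ');
frame.write('1', 1);
frame.write('x', 3);  frame.write('1.23', 5);
frame.write('y', 13); frame.write('0.00', 15);
frame.write('z', 23); frame.write('3.30', 25);
console.log(parseFrame(frame)); // { coreID: '1', x: '1.23', y: '0.00', z: '3.30' }
```

This could be called from the `socket.on('data', ...)` handler if structured values are ever needed instead of the raw text log.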

//-----------------------------------------------------------------------------------------------------------------------

Hi @tsteltzer

Thanks for posting the code–that helps a lot!

With both UDP and TCP connections and a bunch of Arduino String objects, you could be close to running out of memory.

Do you get a red flashing LED when your TCP problem happens?

Does your code ever recover? You say all activity stops, but what is the main LED on the Core doing?

Maybe a good test would be to temporarily remove the UDP parts and just open a TCP connection and start transmitting (you would need to have your node.js code ready). You could also try temporarily removing the Arduino Strings and just sending zeros or raw ADC values.

Thanks for the post @bko!

That’s a good point, I will try your suggestion later today!

To answer your question: no, I never see a flashing red LED or any indicator from the Spark Core that something is wrong. In fact, I have a serial message set to print every time a TCP packet is transmitted, and that message is still being printed even after the TCP packets stop sending. The Spark Core LED remains a breathing green throughout, indicating an active connection to the Wi-Fi network but not the cloud.

Seems like @bko provided the suggestion that let me figure it out! I removed the UDP client component of my code, and the Core transmitted TCP packets flawlessly, continuously. Since I still wanted to discover my server’s IP address dynamically, I spent a bit of time working out what exactly was wrong with the UDP portion of my setup…

I figured out that my code started the UDP client in the setup function and never closed it after I was finished with it. To solve this, I moved the Udp.begin(UDPport) call to the beginning of my UDPListen function and added Udp.stop() at the end of the same function, so the UDP client only listens for packets within the scope of UDPListen.

Somehow, the UDP client remaining open caused major issues with my TCP communication. I have pasted my revised firmware below for reference if anyone has similar problems.

Revised Spark Core Firmware

//Magnetic Sensor Spark Core App

//Include header to disable cloud connectivity by default
#include "application.h"
#include "spark_disable_cloud.h"


TCPClient tcpclient;                    //Define TCP client object
UDP Udp;                                //Define UDP object
IPAddress serverIP;     //Define Server IP address of IPAddress data type
int UDPport = 8000;						//UDP port number
int TCPport = 9000;						//TCP port number
bool receivedIP = false;                //Boolean to indicate whether the IP has been received yet
unsigned int interval = 200;           			//Time interval between each transmit event (ms)
unsigned long previousMillis = 0;       //Time of previous transmit event
unsigned long currentMillis = 0;		//Current time
byte array[32];                         //Byte array to be transmitted to server over TCP connection
char coreID = '1';              		//Spark Core ID number
char null = ' ';                      	//Space character used as a field separator
char xlabel = 'X';                    	//X axis label
char ylabel = 'Y';                    	//Y axis label
char zlabel = 'Z';                    	//Z axis label
String xval = "";						//String to hold x value
String yval = "";						//String to hold y value
String zval = "";						//String to hold z value


//Function to transmit Magnetic Field data from Spark Core pins
void transmit()
{ 
  //If device is connected to TCP server, read and transmit data
  if (tcpclient.connected()) 
  {
		  
	  //Assign current milliseconds passed since Spark Core turned on to currentMillis
	  currentMillis = millis();
	  
	  //If time passed since last transmit is greater than or equal to interval, read and send data
	  if( currentMillis - previousMillis >= interval) {

		  //Assign current time to previous to reset interval counter
		  previousMillis = currentMillis;

		  //Read and scale pin voltage values
		  xval = String((analogRead(A0)*3.3)/4095);
		  yval = String((analogRead(A1)*3.3)/4095);
		  zval = String((analogRead(A2)*3.3)/4095);
		  
			//Insert voltage readings into byte array
			for(int i=0;i<7;i++)
			  {   array[i+5] = xval[i]; }
				  
			for(int i=0;i<7;i++)
			  {   array[i+15] = yval[i]; }
				  
			for(int i=0;i<7;i++)
			  {   array[i+25] = zval[i]; }
			
			//Print status to Serial connection
			Serial.println("Sending TCP packet");
		    
		    //Transmit byte array
		    tcpclient.write(array,32);
	  }
    }
  else	//If not connected to client, delay then attempt to reconnect
  {
          
    	//Blink LED slowly to indicate trying to connect to TCP server (delay 4 seconds)
    	for(int i=0;i<2;i++)
    	{
    		digitalWrite(D7, HIGH);   // Turn ON the LED
    		delay(1000);               // Wait for 1000mS = 1 second
    		digitalWrite(D7, LOW);    // Turn OFF the LED
    		delay(1000);               // Wait for 1 second
    	}
    	
    	//Print status to Serial connection
    	Serial.println("Attempting to connect to: ");
    	Serial.println(serverIP);
    	
    	//Connect to TCP server at serverIP on port TCPport
        tcpclient.connect(serverIP, TCPport);
       }
	
}


//Function to listen for UDP packets with server IP broadcast by UDP server app
void UDPListen()
{
	Udp.begin(UDPport);			//Listen for incoming UDP packets from UDP server
	
	//Print status to Serial connection
	Serial.println("UDP test executing");
	
	//Blink LED fast to indicate waiting for UDP packet with IP (delay 2 seconds)
	for(int i=0;i<10;i++)
	{
		digitalWrite(D7, HIGH);   // Turn ON the LED
		delay(100);               // Wait for 100 mS = .1 second
		digitalWrite(D7, LOW);    // Turn OFF the LED
		delay(100);               // Wait for .1 second
	}
	
	//Look for existence of a UDP packet in the Spark Core buffer
	Udp.parsePacket();
	
	//If the remotePort of the received UDP packet is the same as the application
	//UDP port, assign the IP of the remote connection to serverIP and set
	//the receivedIP boolean to true
	if(Udp.remotePort() == UDPport)
	{
		serverIP = Udp.remoteIP();
		receivedIP = true;
	}
	else    //Else, wait briefly before listening again (the socket is closed below either way)
	{
		delay(100);
	}
	
	Udp.stop();		//Close UDP connection
}


//Initial setup (code run only once at startup)
void setup() 
{
	
	Serial.begin(9600);			//Begin Serial connection to send status of Spark Core
	
    //Initialize Spark Core pins 
    pinMode(A0, INPUT);
    pinMode(A1, INPUT);
    pinMode(A2, INPUT);
    pinMode(D7, OUTPUT);
    
    //Assign byte array labels and spaces
    array[0] = null;
    array[1] = coreID;
    array[2] = null;
    array[3] = xlabel;
    array[4] = null;
    array[12] = null;
    array[13] = ylabel;
    array[14] = null;
    array[22] = null;
    array[23] = zlabel;
    array[24] = null;
}


//Code to infinitely loop after setup has run
void loop() 
{	
  //If device is connected to Wi-Fi, listen for UDP packets then connect to TCP server
  if (WiFi.status() == WIFI_ON) {
		  
	//If server IP address was received, transmit data, else listen for UDP packet
	if(receivedIP)
	{
		transmit();
	}
	else 
	{
		UDPListen();
	}
  }
}

Hello @Dave,

I was also having failures with the ACK storm, and the patch made a very significant improvement. I am interested in knowing which branch of cc3000-patch-programmer has these fixes implemented, and whether there have been other improvements in the development branch that might be of interest. Also, is there a projected date to roll out an official patch?

Thanks,
Sammy


Hi @s1234p1234,

We’re very close to having an official version of the CC3000 patch tested and ready to roll out. Unless something really unexpected comes up, I think the plan is for it to be released on or before Friday of this week. :slight_smile:

Thanks,
David
