TCPClient HTTP Request Too Slow

I created an HTTP Request with some code from @Azdle in another post. I’m looping through and just reading a small robots.txt file off of a web site. When I pull the file in my browser the response is in a couple hundred milliseconds, but when I pull the file from my Spark board, it averages 1.8 seconds.

Here is the code. Feel free to try with your favorite robots.txt file and your favorite “someurl”. Is there a way to make the connection faster? Can we open a socket and keep it open so multiple requests can be made? Are others seeing the same speed problem?

What I’m really trying to do is push data to a server with a POST and it was crazy slow (10-20 seconds). I thought this simple example would show the issue to others.

TCPClient client;
char server[] = "";  // fill in your favorite host

const unsigned long interval = 5*1000;  // delay between updates, in milliseconds
unsigned long failCounter = 0;
unsigned long totalFailCounter = 0;
unsigned long totalCounter = 0;
unsigned long totalElapsed = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  unsigned long startTime = millis();
  totalCounter++;
  Serial.println(" -- Stats -- ");
  Serial.print("    Attempts: ");
  Serial.println(totalCounter);
  Serial.print("    Failures: ");
  Serial.println(totalFailCounter);

  if (client.connect(server, 80)) {
    failCounter = 0;
    Serial.println(" -- Connected -- ");

    client.println("GET /robots.txt HTTP/1.1");
    client.print("Host: ");
    client.println(server);
    client.println("User-Agent: Spark/? Test/0");
    client.println("Connection: Close");
    client.println();

    // Read the whole response, then close the connection
    while (client.connected()) {
      while (client.available()) {
        char c = client.read();
        Serial.print(c);
      }
    }
    client.stop();

    unsigned long elapsed = millis() - startTime;
    totalElapsed += elapsed;
    Serial.print("Elapsed Time:");
    Serial.println(elapsed);
    Serial.print("Average Elapsed Time:");
    Serial.println(totalElapsed / (totalCounter - totalFailCounter));
  } else {
    failCounter++;
    totalFailCounter++;
    Serial.println(" -- Connection Failed -- ");
    Serial.print("    Count: ");
    Serial.println(failCounter);
  }

  delay(interval);
}

Mine was kinda fast when I was playing with TCPClient this morning…

How about trying a local server?

Hi @ron

The code above could be doing a lot of printing at 9600 baud, which is only 960 characters per second, depending on the size of the file. That is about 1 ms per character, so 1.8 seconds is only a bit over 1700 characters. Could the serial port be slowing you down?

Maybe you could measure the time without the printing to serial?

If you need to print to the serial port, you could try a higher baud rate.

My experience reading yahoo weather web pages with the core is that it is not slow.

@bko I wanted to say that but I wasn’t that sure, haha!

Cos I printed earlier and it seemed OK… but maybe if I’d logged the time taken it’s not so OK :smile:

I turned off the printing of the response (it was only a 3-line robots.txt file) and it had no impact on the time. It still averages about 1.9 seconds per request.

I added some more metrics and the result was interesting. I separately tracked the write portion, the wait for response portion, and then the reading the response portion. The third piece was far and away the slowest, even if the response was fairly small. Here are tests against my personal website (3-line robots.txt file) and Google’s much larger robots.txt file:

Transmit Elapsed Time:298
Waiting for Response Time:188
Roundtrip Elapsed Time:2233

Transmit Elapsed Time:199
Waiting for Response Time:195
Roundtrip Elapsed Time:1882

I’d appreciate it if others could try the code and replicate (or refute) my results. That would rule out my Wi-Fi connection, provider, etc. as the cause.

The only real difference I see between what you are doing and what I have is that I have an accept line in the GET request:

        client.println("Accept: text/html, text/plain");

Can you say what hosts you have tried?

Here are my results:

connect: 108
wait: 867
recv: 1795
stop: 104
Elapsed Time:2876

connected: 131
wait: 911
recv: 2070
stop: 127
Elapsed Time:3240

connected: 53
wait: 365
recv: 1931
stop: 49
Elapsed Time:2399

Receiving takes a big chunk of the time, and interestingly, client.stop() takes longer than I expected.

Thank you for the sanity check. “” wasn’t a real URL. :smile:

Now the question is how to reduce that time. It certainly limits how much data can be captured, especially since the processor can’t do anything else during much of that waiting.

I am pulling 3990 bytes of payload with a GET from my test app in ~2 seconds.