[Solved] Client publish problems

Hi,
I have a simple working example of calling a URL and passing some GET parameters over. Now I’ve tried to combine it with some SHT1x sensor readouts, but something is not working: nothing gets sent. The server appears to connect, but I get a “sent and closed-bytes: 0” and no GET vars arrive.

Might this be a timing issue, or did I make some other mistake? (The temperature/humidity readout works fine, and the sGET variable prints correctly on my serial monitor.)

#include "application.h"   
#include "SHT1x.h"          // This #include statement was automatically added by the Spark IDE.

extern char *dtostrf (double val, signed char width, unsigned char prec, char *sout);

//Delay timer to create a non-delay() x-second sampling timer 
unsigned long wait = millis();
const unsigned long waittime = 5000L;


// Specify data and clock connections and instantiate SHT1x object
#define dataPin  D0
#define clockPin D1
SHT1x sht1x(dataPin, clockPin);


// Publishing server
char server[] = "requestb.in";
char url[] = "1mlvy9c1?YES=1";
//char url[];
TCPClient client;


void setup() {
   Serial.begin(9600); // Open serial connection to report values to host
}
 
void loop() {

    // SHT1x    
    float SoilTemp_c;
    float SoilHumidity;

    if (millis() > wait) {
        wait = millis() + waittime;

        // Read temperature/humidity from the soil sensor
        SoilTemp_c = sht1x.readTemperatureC();
        SoilHumidity = sht1x.readHumidity();
        
        
        String sTemp = String(SoilTemp_c);
        String sHum = String(SoilHumidity);
        String sGET = "?SH=" + sHum + "&ST=" +  sTemp;
        
        int retval = getrequest(sGET);
        Serial.print("Returns ");
        Serial.println(retval);

     
        // Print the values to the serial port
        Serial.print("Temperature: ");
        Serial.print(SoilTemp_c, DEC);
        Serial.print("C / ");
        Serial.print(SoilHumidity);
        Serial.println("%");
        delay(5000);
    }
 
}


// Publishing function
//int getrequest(String url){
int getrequest(String sGET){
client.connect(server, 80);

if (client.connected()) {
        Serial.println("Connected to server.");
        client.print("GET ");
        client.print(url);
        client.print(sGET);
        client.println(" HTTP/1.1");
        client.print("Host: ");
        client.println(server);
        client.println("Connection: close");
        client.println();

        unsigned int count = 0;
        unsigned long lastTime = millis();
        while( client.available()==0 && millis()-lastTime<3000) { // three-second timeout
        }  // do nothing
        
        lastTime = millis();
        
        while( client.available() && millis()-lastTime<3000 ) {  // three seconds
          client.read();  //flush data
          count++;
        }
        
        client.flush();  //for safety

        delay(400);
        client.stop();
        Serial.print("sent and closed-bytes: ");
        Serial.println(count);
        return 1;
     }
    else {
        client.flush();
        client.stop();
        Serial.println("Not connected");
        return 0;
    }
    
}
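A side note from editing (a sketch, not code from the thread): the `if (millis() > wait)` pattern breaks once millis() rolls over after about 49 days; unsigned subtraction is rollover-safe. In plain C++ terms, with `now` standing in for millis():

```cpp
#include <cassert>
#include <cstdint>

// Rollover-safe elapsed-time check: unsigned subtraction wraps correctly,
// so the comparison stays valid even after the tick counter overflows.
bool intervalElapsed(uint32_t now, uint32_t last, uint32_t interval) {
    return (now - last) >= interval;
}
```

On the Core this would look like `if (millis() - lastSample >= waittime) { lastSample = millis(); ... }`.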

You may need a / at the start of your url; that’s the only thing I can see… it should make a difference.
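To illustrate (an editor’s sketch; `requestLine` is a hypothetical helper, not code from the thread): in HTTP/1.1 the request target must be an absolute path, so it has to start with /.

```cpp
#include <cassert>
#include <string>

// Hypothetical helper showing how the request line is assembled.
// HTTP/1.1 expects an absolute path as the request target.
std::string requestLine(const std::string& path, const std::string& params) {
    return "GET " + path + params + " HTTP/1.1";
}
```

requestLine("/1mlvy9c1", "?YES=1") produces "GET /1mlvy9c1?YES=1 HTTP/1.1"; without the leading slash the line reads "GET 1mlvy9c1?YES=1 HTTP/1.1", which many servers reject as malformed.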

That is logical and a bit odd at the same time, but it worked :slight_smile: Odd because the original example code worked without the leading /, and I don’t see it being added anywhere later…

Thanks.


post removed… it seems to work at the moment…

No, the problem persists after lots of trying, and I have no idea why. It has something to do with this line:

client.print("?SH=1");

As soon as I try to attach a GET parameter with another client.print, the Spark Core dies on me with a red SOS and a Hard Fault, which usually means a factory reset for me.
Is it a timing problem, or did I do something fundamentally wrong?

Code:

#include "application.h"    // mandatory!

// This #include statement was automatically added by the Spark IDE.
#include "SHT1x.h"

//Delay timer to create a non-delay() x-second sampling timer
unsigned long wait = millis();
const unsigned long waittime = 5000L;

// Specify data and clock connections and instantiate SHT1x object
#define dataPin  D0
#define clockPin D1
SHT1x sht1x(dataPin, clockPin);

float fSoilTemp_C;
float fSoilHumidity;

// Publish setup
char server[] = "requestb.in";
char url[]    = "/13waz4h1";
char token[]  = "23443556778953780522765322578901"; //2f56d618f4a7d345ea53dc364c4eeb43
TCPClient client;


void setup() {
   Serial.begin(9600); // Open serial connection to report values to host
   Serial.println("Starting up");
}
 
 
void loop() {

  if (millis() > wait) {                    // Delayed non delay() code
    wait = millis() + waittime;

    // Read values from the sensor
    fSoilTemp_C   = sht1x.readTemperatureC();
    fSoilHumidity = sht1x.readHumidity();
 
    // Print the values to the serial port

    Serial.print("Temperature: ");
    Serial.print(fSoilTemp_C);
    Serial.print("C / ");
    Serial.print("Humidity: ");
    Serial.print(fSoilHumidity);
    Serial.println("%");

    // Create URL parameters
    //String sGET = "?SH=" + String(fSoilHumidity) + "&ST=" + String(fSoilTemp_C) + "&TK=" + String(token);

    // Call publish function
    int retval = getrequest();

  }
}


int getrequest(){
  client.connect(server, 80);

  if (client.connected()) {
    Serial.println("Connected to server.");
    client.print("GET ");
    client.print(url);
    
    // GET parameter
    client.print("?SH=1");
    
    client.println(" HTTP/1.1");
    client.print("Host: ");
    client.println(server);

    client.println("Connection: close");
    client.println();

    unsigned int count = 0;
    unsigned long lastTime = millis();
    
    while( client.available()==0 && millis()-lastTime<10000) { //ten second timeout
    }  //do nothing

    lastTime = millis();

    while( client.available() && millis()-lastTime<10000 ) {  //ten seconds
      client.read();  //flush data
      count++;
    }
    
    client.flush();  //for safety

    delay(400);
    client.stop();
    Serial.print("sent and closed-bytes: ");
    Serial.println(count);
    return 1;
  }
  else {
    client.flush();
    client.stop();
    Serial.println("Not connected");
    return 0;
  }
}
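A further workaround worth trying (an editor’s sketch, not code from the thread; std::snprintf stands in for the String handling on the Core): assemble the entire request into one buffer and hand it to the client with a single print call, instead of many small client.print() fragments.

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Build the complete GET request (query parameters SH/ST as in the thread)
// into one buffer before anything touches the socket.
std::string buildRequest(const char* host, const char* path,
                         double hum, double temp) {
    char buf[256];
    std::snprintf(buf, sizeof(buf),
                  "GET %s?SH=%.2f&ST=%.2f HTTP/1.1\r\n"
                  "Host: %s\r\n"
                  "Connection: close\r\n\r\n",
                  path, hum, temp, host);
    return std::string(buf);
}
```

On the Core, the resulting string could then be sent with one client.print() call before reading the response.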

Hey @exposure, sorry you are having trouble. I tried to replicate your issue but was unsuccessful (your code worked for me; I did have to comment out all the SHT1x stuff, however). My only suggestion right now is to add

SPARK_WLAN_Loop();

to your first while() loop (instead of “do nothing”)

For some reason it’s working now. 2 things have changed since it wasn’t:

  1. I’m on a different internet connection now (both are fast so I’m not sure if that’s an issue)
  2. I included your suggestion with the while loop

Now I’m facing another issue: can’t I call a local IP address instead of a host name? I tried to call 192.168.0.10, where my local Apache/PHP is running, but the requests never arrive.

Did you just change the server from “requestb.in” to “192.168.0.10”? That won’t work, since DNS will not look that up as a hostname. You can use IPAddress instead.

IPAddress server(192,168,0,10);

What about any changes to the "Host: " line in your HTTP request?
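For reference (an editor’s sketch; `hostHeader` is a hypothetical helper, not code from the thread): when the server is addressed by raw IP, the Host header carries the dotted-quad string instead of a hostname.

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Format the Host header for a server addressed by raw IPv4 address.
std::string hostHeader(int a, int b, int c, int d) {
    char buf[32];
    std::snprintf(buf, sizeof(buf), "Host: %d.%d.%d.%d", a, b, c, d);
    return std::string(buf);
}
```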

@harrisonhjones, as far as I recall, SPARK_WLAN_Loop() should not be called directly from user code, since it might get removed/renamed and seems to be undocumented.
Instead you should use Spark.process(), which in turn makes use of SPARK_WLAN_Loop() or its future successor.

@ScruffR: I could be wrong, but currently SPARK_WLAN_Loop() does not call Spark.process(). See https://github.com/spark/firmware/issues/347 Last time I looked at the official firmware repo, Spark.process() sets some flags to tell the Core to maintain the cloud connection. It doesn’t, yet, maintain the cloud connection on its own.

I think this is an open issue that has not been resolved yet.

Right now, you must call SPARK_WLAN_Loop() but I think the future design will be that you will not.

So you are both right! :slight_smile:


I haven’t checked myself yet, but I understood the matter the other way round:
Spark.process() calls SPARK_WLAN_Loop() and as such does what you intended with your suggestion, plus some extra stuff.
But I’ll have to check that I’m not just imagining things ;-).

On the other hand I’ve been using this code in setup() in every one of my sketches so far and it always seemed to do what I expected it would do.

#ifdef SERIAL_MONITOR
  Serial.begin(115200);
  while (!Serial.available())
    Spark.process();
#endif

From this I just assumed I remembered and used it correctly.

From the official firmware repo:

void SparkClass::process(void)
{
#ifdef SPARK_WLAN_ENABLE
  if (SPARK_CLOUD_SOCKETED && !Spark_Communication_Loop())
  {
    SPARK_FLASH_UPDATE = 0;
    SPARK_CLOUD_CONNECTED = 0;
    SPARK_CLOUD_SOCKETED = 0;
  }
#endif
}

You should be correct, but until the issue I posted is resolved you are not, yet, correct.


Thanks for the clarification :+1:

Edit:
I had a quick look, and the reason why Spark.process() works for my use seems to come from the fact that the line

  if (SPARK_CLOUD_SOCKETED && !Spark_Communication_Loop())

implicitly calls SparkProtocol.event_loop() which does some cloud housekeeping.


Interesting. I was wondering why it worked for you as well. I’m glad you did some research. Cool stuff!

Hallelujah! It’s working. Thanks!

@bko: Yes, I did exactly what you assumed. I thought it would take either IPs or host names…

Off to new frontiers, plugging in more functionality, but the “bones” are there. The first application will probably be watering salad: check soil humidity, open/close a watering valve, monitor humidity & temperature…
