Getting random wrong boolean values for a simple double comparison

I have this code (simplified):

int relayState = 0;
double temperature = 0;
double temperatureThreshold = 22.0;

void setup() {
    Spark.variable("temperature", &temperature, DOUBLE);
    Spark.variable("relayState", &relayState, INT);
}

void loop() {
    delay(2000);
    temperature = sensor.getTemperature();
    relayState = temperature < temperatureThreshold;
}

I am getting wrong values for that relayState boolean. I’m logging the data both to the serial port and to a Google spreadsheet, and it looks like this:

date                    temp    relayState

11/11/2014 8:37:37      22.6    0
11/11/2014 8:47:11      22.3    0
11/11/2014 8:57:11      22.1    0
11/11/2014 9:07:11      21.9    1
11/11/2014 9:17:12      21.7    1
11/11/2014 9:27:11      21.5    0
11/11/2014 9:37:11      21.4    1
11/11/2014 9:47:11      21.2    1
11/11/2014 9:57:18      21.1    1
11/11/2014 10:07:11     21      0
11/11/2014 10:17:11     20.8    1
11/11/2014 10:27:11     20.7    0
11/11/2014 10:37:11     20.7    1
11/11/2014 10:47:11     20.6    0
11/11/2014 10:57:12     20.5    0
11/11/2014 11:07:12     20.4    1
11/11/2014 11:17:12     20.3    0
11/11/2014 11:27:27     20.2    1

How can there be temperature values below 22.0 that result in a false relayState value?

I don’t know the answer to your question, but I’d be interested to see what would happen if you did this instead:

relayState = (int)(0.0 > (temperature - temperatureThreshold));

It’s possible that what’s happening is the result of int-to-double and double-to-int conversion. In fact, I’m sure that’s the issue.

I thought about that kind of issue as well, but in the code shown there are no int/double conversions. temperature, temperatureThreshold, and the return value of sensor.getTemperature() are all double. There is no conversion to or from int.

The result type of < is bool in C++, which converts cleanly to exactly 0 or 1 when assigned to an int, so there is no lossy conversion there either.
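
For what it’s worth, here is a minimal standalone check of that claim, compiled as plain C++ outside the Spark toolchain (the values are taken from the log above):

#include <stdio.h>

int main(void) {
    double temperature = 21.9;
    double temperatureThreshold = 22.0;

    // The comparison yields a bool; assigning it to an int stores exactly
    // 0 or 1, with no floating-point rounding involved.
    int relayState = temperature < temperatureThreshold;
    printf("relayState = %d\n", relayState);   // prints: relayState = 1
    return 0;
}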

That’s a good point… I forgot we were doing a logical check before doing the logical-to-int conversion. Silly me.

Have you tried what I suggested? Just curious what happens.

I’d have to read up some more on how automatic conversion is done; perhaps some kind of conversion is still happening. It would also be interesting to try declaring relayState as a double and see what happens.
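
For reference, that experiment would look something like this (an untested sketch; everything else in the program stays the same):

double relayState = 0;   // was: int relayState = 0;

void setup() {
    // register as DOUBLE so the cloud type matches the new variable type
    Spark.variable("relayState", &relayState, DOUBLE);
}

void loop() {
    temperature = sensor.getTemperature();
    relayState = (temperature < temperatureThreshold) ? 1.0 : 0.0;
}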

I haven't tried your way of doing the comparison yet, I will do that next and report back how that goes. Thanks!

Strange! Could you please post the entire code?

Sure, below is the entire code of the main file. The sensor library code is here:

I also published that as a public library here on spark.io under the name “RHT03-HUMIDITY-TEMPERATURE-SENSOR”.

// This #include statement was automatically added by the Spark IDE.
#include "rht03-humidity-temperature-sensor.h"

double humidity = 0;
double temperature = 0;

int sensorPin = D0;
int relayPin = D1;
int relayState = 0;
double temperatureThreshold = 23.0;

RHT03HumidityTemperatureSensor sensor(sensorPin);

void setup() {
    Serial.begin(9600);
    Spark.variable("humidity", &humidity, DOUBLE);
    Spark.variable("temperature", &temperature, DOUBLE);
    Spark.variable("relayState", &relayState, INT);
    pinMode(relayPin, OUTPUT);
    digitalWrite(relayPin, LOW);
}

void loop() {
    delay(2000);

    sensor.update();
    temperature = sensor.getTemperature();
    humidity = sensor.getHumidity();

    relayState = temperature < temperatureThreshold;
    digitalWrite(relayPin, relayState);

    Serial.println("t/rh/rs");
    Serial.println(temperature);
    Serial.println(humidity);
    Serial.println(relayState);
}

@harrisonhjones’ variant does seem to work correctly. I haven’t seen a wrong relayState value yet. I’d be curious to hear from the Spark team if this is correct behavior.

I would be very tempted to try it without the Arduino/Spark preprocessor, assuming you can write some function prototypes to make it compile as a straight C/C++ program.

#pragma SPARK_NO_PREPROCESSOR
#include "application.h"

I spoke too soon, even @harrisonhjones’ version produces erratic values (threshold is 23.0 here):

11/12/2014 17:37:12     22.5    1
11/12/2014 17:47:12     22.3    1
11/12/2014 17:57:12     22.2    0
11/12/2014 18:07:12     23.7    0
11/12/2014 18:17:11     22.8    1
11/12/2014 18:27:13     22.4    1
11/12/2014 18:37:12     22.3    1
11/12/2014 18:47:11     22.3    1
11/12/2014 18:57:11     22.8    0
11/12/2014 19:07:12     23.2    0
11/12/2014 19:17:11     22.9    1
11/12/2014 19:27:11     22.8    1

Interesting… I tried!

But seriously, have you tried @bko’s suggestion? The Spark preprocessor can be ruthless sometimes. My go-to “I’ve tried everything, what do I do now?” troubleshooting step is to disable the preprocessor.

I will try that next. Is #pragma SPARK_NO_PREPROCESSOR documented somewhere? I wasn’t aware of that. My code seemed to compile fine when I added it, so I’ll let it run for a while like this now.

You could try converting the condition to a boolean if/else, like so:

bool relayState = (temperature < temperatureThreshold) ? true : false;

or

int relayState = (temperature < temperatureThreshold) ? 1 : 0;

Here’s my thought: get rid of the uncertainty of your actual temperature reading and just make sure the code works right first. I created an array of doubles with the values you reported in your first results post, then a matching array of the expected results of the expression value < control. I tried all of the different methods suggested here, starting with the standard if/else logic, and looped everything over and over to make sure the actual double comparison did not change from iteration to iteration due to some weird non-atomic operation in the code.

The result is that every way I try it works fine, consistently. I even tried adding weird random extra precision to the values in the array, which your temperature readings probably have? Maybe not; I seem to recall Serial.print() truncates float/double precision… I thought the default was 2 decimal places.
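
As a quick way to check that last point, here is a standalone sketch (the temperature value is made up for illustration):

#include "application.h"

void setup() {
    Serial.begin(9600);
}

void loop() {
    double temperature = 22.4998;    // made-up value for illustration
    Serial.println(temperature);     // prints 22.50 -- two decimal places by default
    Serial.println(temperature, 6);  // prints 22.499800 -- reveals the extra precision
    delay(5000);
}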

# Code

#include "application.h"

double dControl = 23.0;
double dArray[12] =
{ 22.5, 22.3, 22.2, 23.7, 22.8, 22.4, 22.3, 22.3, 22.8, 23.2, 22.9, 22.8 };
double expectedResults[12] =
{    1,    1,    1,    0,    1,    1,    1,    1,    1,    0,    1,    1 };

void setup()
{
  Serial.begin(9600);      // sets the serial port to 9600
  pinMode(D7,OUTPUT);
}

void loop()
{
    Serial.println("\n============ DOUBLE TEST 1 ============");
    for (int x=0; x<12; x++) {
        Serial.print("Expected: ");
        Serial.print(expectedResults[x]);
        if (dArray[x] < dControl) {
            Serial.println("\tActual: 1");
        }
        else {
            Serial.println("\tActual: 0");
        }
    }

    int result2 = 0;
    int result3 = 0;
    int result4 = 0;

    Serial.println("\n============ DOUBLE TEST 2 ============");
    for (int x=0; x<12; x++) {
        Serial.print("Expected: ");
        Serial.print(expectedResults[x]);
        result2 = dArray[x] < dControl;
        Serial.print("\tActual: ");
        Serial.println(result2);
    }

    Serial.println("\n============ DOUBLE TEST 3 ============");
    for (int x=0; x<12; x++) {
        Serial.print("Expected: ");
        Serial.print(expectedResults[x]);
        result3 = (dArray[x] < dControl) ? 1 : 0;
        Serial.print("\tActual: ");
        Serial.println(result3);
    }

    Serial.println("\n============ DOUBLE TEST 4 ============");
    for (int x=0; x<12; x++) {
        Serial.print("Expected: ");
        Serial.print(expectedResults[x]);
        result4 = (dArray[x] < dControl) ? true : false;
        Serial.print("\tActual: ");
        Serial.println(result4);
    }

    digitalWrite(D7, !digitalRead(D7));  // toggle blue LED
    delay(1000);
}


# Results

============ DOUBLE TEST 1 ============
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 0.00  Actual: 0
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 0.00  Actual: 0
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1

============ DOUBLE TEST 2 ============
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 0.00  Actual: 0
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 0.00  Actual: 0
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1

============ DOUBLE TEST 3 ============
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 0.00  Actual: 0
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 0.00  Actual: 0
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1

============ DOUBLE TEST 4 ============
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 0.00  Actual: 0
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1
Expected: 0.00  Actual: 0
Expected: 1.00  Actual: 1
Expected: 1.00  Actual: 1

Thanks a lot for the thorough testing. I think I’ll look a bit more thoroughly at the sensor output data.

But for the line you want to replace to not be working would require an error in the compiler. I’ve yet to spot compiler errors; it’s gcc, so it’s unlikely to be at fault, or at least to have such a basic fault.

I think the library you’re using may be at fault. It uses the “<<” left-shift operator, and I think the humidity value may be trampling on the temperature value. I suspect the original library assumes a 16-bit int rather than a 32-bit one?

Note that the C standard guarantees a short is at least 16 bits and a long is at least 32 bits, but an int can be either 16 or 32 bits. On a 16-bit processor an int (or unsigned int) is usually 16 bits, and on a 32-bit processor it is usually 32 bits. I think some of the variables defined as int in your library need to be redefined as short (or a fixed-width type such as int16_t) so as (1) to remain portable and (2) to work on the Spark as well as on the Arduino.
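
To make the hazard concrete, here is a hypothetical illustration (not the actual library code) of how left-shifting sensor bytes into a plain int behaves differently on the two platforms:

#include <stdint.h>
#include <stdio.h>

int main(void) {
    // Hypothetical byte stream: a 16-bit humidity word followed by a
    // 16-bit temperature word, shifted into a single accumulator.
    uint8_t bytes[4] = { 0x02, 0x87, 0x00, 0xDC };
    int accumulator = 0;

    for (int i = 0; i < 4; i++)
        accumulator = (accumulator << 8) | bytes[i];

    // With a 16-bit int (Arduino AVR) the humidity bits are shifted out and
    // only 0x00DC remains; with a 32-bit int (Spark Core) the result is
    // 0x028700DC, so the humidity bits are still there to trample on any
    // later code that assumed they were gone.
    printf("accumulator = 0x%X\n", (unsigned)accumulator);

    // Portable fix: use a fixed-width type and mask explicitly.
    uint16_t temperatureRaw = (uint16_t)(accumulator & 0xFFFF);
    printf("temperatureRaw = 0x%X\n", temperatureRaw);
    return 0;
}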

Sigh!

While it might be in the docs somewhere, it’s essentially “documented” on the forums… not ideal, but I don’t know if they can add an “If you’ve tried everything and it’s still not working, try this” section.

#pragma SPARK_NO_PREPROCESSOR is a band-aid for a bigger problem: the preprocessor itself. Hopefully it will be fixed at some point.

As a general principle, I thought it was accepted by the powers that be that all the info we need to use the Spark Core to its fullest intended extent should be available in the proper docs. All acknowledge this isn’t the case, but some seem to regret it not at all. At the risk of once again being told I am being snarky: there is information in the heads of some insiders which could easily be documented, should be documented, and is not being documented. Having to resort to the forum is not ideal, as you yourself say.

As to this specific issue, there is just no way anyone could realistically be expected to know of the existence of the #pragma SPARK_NO_PREPROCESSOR directive. That it exists is not as useful to me (or anyone else) as you might think: what are the circumstances and consequences of its use? Or is it another of the several things we need to try in various combinations (2^several) to get code working? Or must I read the source code?

Having to become intimately familiar with the source code of the firmware of the Spark Core ought not to be a prerequisite to using the thing, and those who hold a contradictory POV do, in my (snarky) view, sabotage the project.

The ambition of the Spark.io organisation seems (rightfully) to me to be to conquer the Internet of Things world. A handful of well-intentioned source-code experts doling out helpful advice, but seemingly not documenting what they know, is not scalable: the forum is not easily indexed or searched, there is good, bad, and contradictory advice, and there is no way to tell the difference.

Is this another “shoot the messenger” posting from he with the objectionable tone?