Spark stops responding and temperature goes haywire

Hi,

I built a remote temperature and uptime monitor using a Spark Core that is accessed through a PHP web page. It also has a locally attached alphanumeric LED display to show the temperature.

The code, posted below, includes two Spark functions to remotely enable/disable the LED display and two Spark variables to publish the temperature and uptime.

Everything works smoothly; however, after running for somewhere between 2 and 4 days, the system stops functioning. A reflash brings it back to normal, but I don’t think re-flashing should be required.

The problem manifests itself as follows:

  1. The temperature reads 492 and never changes. (Clearly this is NOT the actual temperature.)
  2. Uptime is displayed normally
  3. The LED freezes in its last state - if it was off, it stays off; if it was on, it stays on and shows a temperature that is NOT 492.
  4. The system stops responding to LED on/off requests, even though the cloud JSON response indicates that the request has been received and processed.

As I have pondered the situation, it feels like there is some kind of code issue and perhaps a memory leak. Is there something that jumps out in the code below that could be causing this?

// This #include statement was automatically added by the Spark IDE.
#include "Adafruit_GFX.h"

// This #include statement was automatically added by the Spark IDE.
#include "Adafruit_LEDBackpack.h"

// This #include statement was automatically added by the Spark IDE.
#include "Adafruit_MPL3115A2/Adafruit_MPL3115A2.h"

Adafruit_MPL3115A2 baro = Adafruit_MPL3115A2();
Adafruit_AlphaNum4 alpha4 = Adafruit_AlphaNum4();

float tempF;
int tempFi;
boolean ledonoff = true;
boolean tempupdate = true;
unsigned long previousmillis; 
const long interval = 20000; //update every 20 seconds
char publishString[25];


void setup() {
  alpha4.begin(0x70);

  Spark.variable("Current_T", &tempFi, INT);
  Spark.variable("Uptime",&publishString, STRING);
  Spark.function("ledon", ledon);
  Spark.function("ledoff", ledoff);
  while(! baro.begin()) {
    delay(1000);
  }
  previousmillis = millis() - interval - 1000;
}

void loop() {
    unsigned long currentmillis = millis();
    if (currentmillis - previousmillis > interval){
      previousmillis = currentmillis;
      tempF = baro.getTemperature()*9/5+32;
      tempFi = (int) tempF;
      tempupdate = true;
      sparkPub(currentmillis);
      
    }
    if (ledonoff){
        if (tempupdate){
            displayascii(32,tempFi%100/10+48,tempFi%10+48,32);
            tempupdate = false;
        }
    }
    else{
      displayascii(32,32,32,32);  
    }
      
}

void displayascii(int first, int second, int third, int fourth){
    alpha4.writeDigitAscii(0, first);
    alpha4.writeDigitAscii(1, second);
    alpha4.writeDigitAscii(2, third);
    alpha4.writeDigitAscii(3, fourth);
    alpha4.writeDisplay();
}

void sparkPub(unsigned long now) {
    unsigned nowSec = now/1000UL;
    unsigned sec = nowSec%60;
    unsigned min = (nowSec%3600)/60;
    unsigned hours = (nowSec%86400)/3600;
    unsigned days = (nowSec/86400);
    sprintf(publishString,"%u days, %u:%u:%u",days,hours,min,sec);
}

int ledon(String args2){
    ledonoff = true;
    tempupdate = true;
    return 10;
}

int ledoff(String args2){
    ledonoff = false;
    return 20;
}

Thank you in advance for any thoughts!

The first thing I often see is this:

Spark.variable("Uptime",&publishString, STRING);

Don’t put an ampersand (&) in front of publishString; as a char array it already decays to an address.
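
In other words, register it without the ampersand:

Spark.variable("Uptime", publishString, STRING);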

But I’ll take a further look.
Edit: nothing major to find anymore, since @peekay123 got greedy and answered it all :+1: :wink:


@JL_678, on a quick look:

  1. Are you confident that publishString[] NEVER exceeds 25 chars, including the terminating ‘\0’? Overwriting that array will cause major chaos (see the sketch after this list).
  2. Also, have you considered using the Spark Time functions to get “real” time?
  3. You may want to declare “interval” as unsigned long
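
A minimal sketch of how items 1 and 3 could look, keeping your variable names (the snprintf guard, the zero-padded %02u formats, and the mins rename are my assumptions, not something from your post):

const unsigned long interval = 20000UL;   // unsigned long, the same type millis() returns

void sparkPub(unsigned long now) {
    unsigned long nowSec = now/1000UL;
    unsigned sec = nowSec%60;
    unsigned mins = (nowSec%3600)/60;     // renamed from "min" to stay clear of the min() macro
    unsigned hours = (nowSec%86400)/3600;
    unsigned days = (nowSec/86400);
    // snprintf stops at sizeof(publishString), so it can never overrun the 25-byte buffer
    snprintf(publishString, sizeof(publishString), "%u days, %u:%02u:%02u", days, hours, mins, sec);
}

For item 2, the Time class (Time.hour(), Time.minute(), Time.second()) gives cloud-synced wall-clock time once the Core is connected.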

You may need to add some Serial.print debug messages to see if your barometer and display calls return in a timely fashion. The fact that the temperature is stuck may indicate an issue there.
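
For example, something along these lines around the sensor read would show whether it stalls (this assumes a Serial.begin(9600) call in setup()):

unsigned long t0 = millis();
tempF = baro.getTemperature()*9/5+32;
Serial.print("getTemperature() took ");
Serial.print(millis() - t0);
Serial.println(" ms");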

What intrigues me is the fact that you do a reflash to recover instead of just a reset. So somehow, the code space is being affected. Check the items above and let me know how it goes :smiley:


AND @ScruffR beats me to that punch!!! Ok, I missed that one… good eye dude! :wink:


Hi,

Thank you both for the feedback! A couple of comments:

First, re-flashing rather than pressing the reset button was my choice, so I never actually tried the reset button. I can try that next time.

Regarding the array, it used to be 40 characters and I had the same issue. I shrank it to 25 because I thought it might be more memory efficient.

I will implement all of the changes suggested and report back.

Thank you again.


Hi,

I have a quick and positive update. After implementing all of your suggested code tweaks, the Spark has been running for 6 days without a problem. (Of course, now I am jinxing myself! :smile: ) I am cautiously optimistic that these enhancements may have addressed the issue.

Thank you!

JL
