Core flashes red after running for an hour

Hello All,

I have uploaded a video that shows the red flashes occurring after my program has been running for about an hour, and I need some help identifying what the Core is trying to tell me.
The program is lengthy and compiled locally, so I cannot post it here. I did not catch the beginning of the red flashing; it started with green flashes before I could start the recording. It looks like the Core restarts itself and then immediately runs into the same problem. After a second self-restart it runs fine again.

Markus

Hi @MarkusL

You are getting a red SOS followed by 8 flashes, which indicates an out-of-heap-memory condition.

Does your code do dynamic memory allocation? Possibly by Arduino String objects?

1 Like

Hi @bko, thanks for the analysis. I am not using any String objects. What is also interesting is that sometimes, after powering up the Spark, it goes into the red flashing immediately after the cyan flashing, and the auto-reboot always ends up in red flashing again. Only removing it from USB power gets it out of this endless loop.
Are there other debugging methods to retrieve a call stack dump or some other useful info?

After tinkering around for a good part of Sunday I am stuck. I either get a successful run of my code or 8 red flashes; it seems random. Since Serial communication over USB is not available after a restart of the Core (it seems to be available only after a successful cloud connect), I tried Serial2. But as soon as I use that I get 1 red flash (hardware fault).
Not really sure what to do other than packaging up all my code and posting it here.

Hi @MarkusL It’s going to be very difficult to understand what’s going on with your program without looking at the code. As I understand it, the firmware uses dynamic memory allocation, so it may be that your static allocations are not leaving enough RAM for the firmware to allocate from the heap properly. This is only a guess, but I thought you might not be aware that the firmware uses the heap.

1 Like

Hi @mtnscott and @bko, I have uploaded all my code in a zip here. It was made to compile with the Spark Dev app, not the web IDE. It is a complete archive; if you extract it into a new folder it should compile with no errors.

Hi @MarkusL

I just had a quick look and one thing jumped out at me in the ups_controller files. You are using a 9-byte char array to hold the printed form of some floats (single precision), but a single float can require more characters than that. Are you sure you have enough room? This could explain why it works for a while with some values but fails after some time. Does your code take care not to walk past the array bounds?

Here is an example of the largest positive float, which takes 12 bytes to print correctly:

3.402823e+38

3 Likes

Hi @MarkusL
In your function create2ndRow, none of your strncat calls prevent an overrun of your buffer.
strncat(longRow, "some string", LEN_LONGROW)

The function strncat(dest, src, max) appends up to max characters from src onto the end of dest; it limits the source, not the total size of the destination. Passing max = LEN_LONGROW is therefore not doing anything, because in your case strlen(src) is always < LEN_LONGROW, while dest can still grow past its buffer.

You may want to use the strlcat equivalent function, which limits the total size of the destination instead. With strlcat, your use of LEN_LONGROW would be correct.

3 Likes

Thanks for the first assessment. Did you actually flash the code onto a Spark?

@bko: all float strings are truncated to no more than 2 digits after the decimal point, so that does not seem to be the problem.
@mtnscott: you are right about strncat; I should switch to using strlcat. But the current combined string does not reach that limit (max about 175 characters). Here is an example of that string returned via cloud access:

{
  "cmd": "VarReturn",
  "name": "status",
  "result": "0.0°C/32.0°F, 0' stby, 0 days, 689W, 0w, 21.9Vs, 10.9Vb, 31.5As, -31.8Ab,  10.92VbMin,  0.00maxC,  727.74Wps,  381.10Wp,  1.53Whu,  23.41Wh*,  0.00Wh",
  "coreInfo": {
    "last_app": "",
    "last_heard": "2015-01-09T19:17:03.835Z",
    "connected": true,
    "deviceID": "54ff6c066672524835431267"
  }
}

Just did. I'm getting the same kind of problems: panic with 8 flashes or fast-flashing cyan. I can't get it to work at all.

1 Like

@MarkusL

  1. You were trying to use the serial port before you initialized it. The first call in your setup was firstOneWireDevices(); and that was calling Serial.print(). I moved your call to Serial.begin(9600); up in your setup and changed the baud rate to 9600.
  2. I think you are using too much RAM. When I compile your code you are leaving only 2.4K bytes of SRAM for the heap. A few times it would work after I flashed the Core; most of the time it would enter a panic with 8 flashes. I reduced your SRAM consumption by changing SPARK_ARRAY_LENGTH from 622 to 222, which made 4.4K available for the heap. Now it appears to flash and run every time. I don’t know how long you expect the Spark string variables to get, but I’m starting to believe 2.4K of SRAM for the heap is not enough.
4 Likes

@mtnscott thanks a lot!
Reducing the RAM usage did the trick, but it also creates another problem for me, because I need to transmit a larger dataset into the cloud. Maybe I can allocate that 622-byte array dynamically instead of statically.

1 Like

@MarkusL I’m not sure moving a large allocation into the heap will help. It’s all about having more free memory overall.

I noticed your static variable g_power. Scanning the code, you assign it only once, and I don’t see where it is used. Maybe you can evaluate whether you really need it. That may free up enough memory for you.

There are parts which are currently only placeholders; it’s work in progress. I plan to store statistical sensor data in the EEPROM and then transmit it on demand to the cloud.

That wouldn't work, since a Spark.variable cannot be relocated once it has been set up.
But since @mtnscott found an extra 2K by reducing SPARK_ARRAY_LENGTH by 400 bytes, I'd guess (since I can't look at the zip, as it seems to be gone) you must have several arrays of that size hanging around.
Do you actually need them all, or could you "multiplex" just one Spark.variable?

If you are already running low on mem and have not even baked all the functionality into it, you may have to rethink the overall layout of your project.

And if you are talking EEPROM and sensor data, you might have to think about EEPROM wear. You have only a limited number of write cycles, so be careful with bursts of data.

You may want to consider an external I2C EEPROM chip, which will give you much higher erase-cycle endurance. Also, the eeprom library that @mdma wrote actually uses flash. Flash has far fewer erase cycles, and access to that specific flash shares the communications path with the CC3000, so you may run into problems when reading flash while transmitting the data over the network.

Another alternative would be to use a cloud database server for storing your statistical data. I wrote a library, PietteTech_Phant, that publishes data over the network to a server. There is an example with the library where I use SparkFun's free data storage (public or private). Then you can store all the data you want and read it back for analysis without using the Spark.

2 Likes

@ScruffR I brought the zip back, try the download again.
Yes, you are right, a Spark.variable cannot be re-allocated. I will be able to further cut down the usage of char arrays, but I will then need to include either the TCP or the UDP library, and I believe that will take up even more resources.
As far as EEPROM usage goes, I need to write 622 bytes into a circular buffer once a day. That should not be too hard on wear.

@mtnscott external EEPROM is not an option, since I need to keep complexity and hardware cost low (here is a link to the HW I am working on).
Storing data in another cloud is an interesting idea, but it would create another cost factor for the user, since these services require a subscription.

Hey All,

If you need a ton of local storage, you can optionally wire up an SD card to the Core/Photon. That’d give you a huge amount of local space. There’s a thread that talks about wiring one up here: https://community.spark.io/t/micro-sd-card-library/2666/42 – and a library here - https://github.com/technobly/SparkCore-SD

I’m definitely thinking about ways to make moving data to/from your devices easier but I probably won’t be able to beta those features until the summer. :slight_smile:

Thanks,
David

1 Like

It should be OK; the two devices are mutually excluded from accessing the internal SPI bus at the same time.

1 Like