Spark Core not booting after flashing my big firmware

I can successfully flash a small “blink” firmware to my :spark:.

But when I deploy my fancy firmware, it keeps restarting. The sequence is: green flashes, cyan flashes, then 4-6 red flashes, and then it starts over with green flashes.

I posted my app here so you can try it as well. https://gist.github.com/synox/b96e0a47965880330122

It is distributed across many files and uses many libraries. It might be that it uses too much (program) memory, but there is no warning in the Web IDE, so I can’t tell.

Some help or tests would be great. :smile:

Regards

Coffee, it sounds exactly like an out-of-RAM problem, not a program/flash space issue. Are you compiling on the Web IDE or locally?

I am compiling online. I was expecting that the Web IDE would tell me if it was too big.

There is another thread currently going on about ensuring sufficient RAM:

@peekay123 @BDub @zach

I’m having an issue trying to load code to my Spark Core via the Web IDE.

I have code with lots of LCD screen bitmap image data. The Web IDE compiles the code just fine, but when I try to flash it to my Spark Core, it simply will not start sending the data to the Spark, so the purple flashing LED never even starts. I can flash smaller code just fine, though.

So if your code is too large, is it normal for the Web IDE to show it compiles successfully but for nothing to happen when you press the flash button?

It would be really nice if we had a way to see how much space our code is using in the Web IDE.

Yes, if it’s too large it will hang, but I don’t know what part of the mechanism makes it hang. Programming locally is so much faster, though, so if you are developing something and need to debug it, local is the way to go.

@BDub I erased some of the code and it then compiled fine, so it must have been oversized. The Web IDE says it’s uploading, but it never triggers the Spark Core to flash purple normally; it just sits there and does nothing.

About programming locally: is the new Local IDE that the Spark Team came out with easier or better than that old online tutorial I was using in the past with Netbeans and all that junk? I hope there is a new, easier way. Let me know what the deal is, since I will soon need to compile locally to be able to turn off the CC3000 as needed.

I’m not sure about the CLI because I had issues when I first used it, so I kind of stopped using it. I haven’t seen any updates about it, so I’m not sure if it has received any yet. It has a lot of potential, though, so I would like to get back to it for things. I “think” it basically sends your source files up to the server to compile… like the files that you have in the Web IDE… you would store those locally and invoke the compiler. This is cool, but it will still yield the same results for you… I’d imagine.

I just compile from the command line with a batch file called M and a programming batch file called P. Makes it pretty easy.

@BDub So it sounds like I should just stick with mastering the process that worked for me in the past. I had some issues with it for some reason that cost me 6 hours of frustration, so I deleted everything and just waited for updates to hit the Web IDE. Looks like I’ll need to try compiling locally again, following the step-by-step instructions.

RWB, the Web IDE should warn the user when the RAM used goes over the known limit. Any thoughts, BDub?


@peekay123 @BDub I was basically saving about thirteen 128x64 bitmap images that the Digole converter spit out, and I put them in the main loop. There was no warning from the Web IDE about RAM or memory size issues. It just said “Successfully compiled! Good job!”

Then nothing would happen on the Spark Core when I hit flash. I deleted some of the bitmap data and then it flashed correctly.

Looks like I need to add a 500gig hard drive to my Spark :smiley:
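
A minimal sketch of how bitmap tables like these can be kept out of RAM on the Core’s STM32: data declared `const` normally ends up in .rodata (flash) rather than in the 20KB of SRAM, assuming the display library can read from a const pointer; the `drawBitmap()` call below is only a placeholder for whatever the Digole library actually provides.

```cpp
// Sketch only: keeping bitmap data in flash instead of RAM.
// `const` arrays on the STM32 normally go into .rodata (flash), so only
// the pointer handed to the display library lives in RAM.
#include "application.h"          // Spark Core application header

// One 128x64 monochrome bitmap = 128 * 64 / 8 = 1024 bytes.
static const uint8_t logoBitmap[1024] = {
    0x00, 0xFF, 0x81, 0x81,       // ... bytes from the Digole converter ...
};

void setup() {
    // Display initialization would go here.
}

void loop() {
    // drawBitmap() is a placeholder name -- use whatever call the display
    // library offers, as long as it accepts a const pointer so the data
    // can be read straight out of flash.
    // drawBitmap(0, 0, logoBitmap, 128, 64);
    delay(1000);
}
```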

RWB, go over to the Digole topic. Once I figure out how, I will show you the sprite stuff I have been working on!

No, I don't think it does that currently... have you seen it do that? Sure, that would be nice though...

BDub, that is a feature we should discuss with the Spark folks. Given how RAM is such a tight commodity, it would be very helpful.


@BDub With the CLI you can do it in two ways:

  1. Compile using the Spark server and get a bin file

  2. Compile it locally and upload to the core through the cloud

But since you guys are already compiling locally… these two options don’t seem that attractive :smiley:

From the [hardware doc][1] I see that the core has:

  • 128KB of Flash memory and
  • 20KB of SRAM

Is all 128KB available for the firmware? Does this relate to the line I have in my make output?

 text	   data	    bss	    dec	    hex	filename
97044	   2996	  13508	 113548	  1bb8c	core-firmware.elf

How about the RAM? How will I notice at runtime that the core is out of memory, and where do I see which error occurred?

[1]: http://docs.spark.io/#/hardware/subsystems-microcontroller

[quote="peekay123, post:10, topic:3329"]
RWB, the Web IDE should warn the user when the RAM
[/quote]

RAM or Flash? It confuses me. :blush: I think the compiler can only check the flash size?

Hi @Coffee

The output you are getting needs to be interpreted a bit: text is your program along with all the Spark firmware and goes in flash, while data holds the initial values of all your variables that have initializers. The data section goes in both flash and RAM. The bss section holds objects allocated in RAM that the compiler knows about, but it does not include things that are dynamically allocated, like automatic variables and calls to `new` and `malloc`, etc.
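
A quick illustration of those placement rules (a sketch of the usual GCC/newlib behaviour, not taken from any particular firmware):

```cpp
// Illustrative only: where typical declarations land in the `size` output.
#include <stdint.h>

int counter = 42;                    // initialized global  -> data (flash + RAM)
char buffer[512];                    // zero/uninitialized  -> bss  (RAM only)
const uint8_t table[] = {1, 2, 3};   // const data          -> text/.rodata (flash only)

int main() {
    int local = counter;             // automatic variable  -> stack, not in `size`
    int *heap = new int[16];         // dynamic allocation  -> heap, not in `size`
    heap[0] = local + buffer[0] + table[0];
    delete[] heap;
    return 0;
}
```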

So you are using 97044+2996=100040 bytes of flash and 2996+13508=16504 bytes of RAM.

With the most recent firmware changes, running out of RAM in `new` or `malloc` should generate a panic code with a red LED flash sequence: SOS (dot-dot-dot dash-dash-dash dot-dot-dot), followed by a number of red flashes (the number tells you the panic code), and then SOS again. Eventually it should reboot.
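
If you want to watch RAM usage at runtime before it gets to a panic, here is a rough sketch of a common trick, assuming a newlib-style heap grown with `sbrk()` (the `freeRam()` name is made up): it reports the gap between the end of the heap and the current stack pointer.

```cpp
// Rough free-RAM snapshot: the gap between the end of the heap and the
// current stack pointer. Assumes a newlib-style sbrk() heap; it ignores
// holes inside the heap itself, so treat it as an estimate only.
#include "application.h"          // Spark Core application header

extern "C" char *sbrk(int incr);

static uint32_t freeRam() {
    char stackMarker;                              // a local lives on the stack
    return (uint32_t)(&stackMarker - sbrk(0));     // bytes between heap end and stack
}

void setup() {
    Serial.begin(9600);
}

void loop() {
    Serial.println(freeRam());                     // watch this shrink as you allocate
    delay(1000);
}
```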
