Where did all the Flash go?

Well, it’s a question many of us will eventually run into - where did all the flash go?

It appears that just compiling an empty application takes up more than 67% of the available flash. This seems very excessive. I run out of flash with only 1200 lines of code! Now that last statement is broad - what libraries, where are my strings, … - but in reality none of that matters: how can just the base Spark firmware take up 75K of the available flash? We only have 110K to work with.

Is there any way to reduce the amount of flash that is taken up by the firmware?

With the Arduino environment you need to include support libraries for each capability you want to use; for example, if you want to leverage two-wire (I2C) support you need to include <Wire.h>, for SPI you need to include <SPI.h>, and so on… This limits the size of the resulting binary. With Spark you don’t need to specify ‘Wire’ or ‘SPI’, so does that mean they are all included by default?
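For illustration, here is the kind of minimal I2C sketch I mean on the Arduino side (the Wire calls are the stock API; nothing Spark-specific is assumed):

// Arduino: I2C (TWI) support is only compiled in because we ask for it
#include <Wire.h>      // drop this line and the TWI code is never linked

void setup() {
    Wire.begin();      // join the I2C bus as master
}

void loop() { }

On the Spark the same sketch builds with no #include at all, since Wire, SPI and friends appear to be declared by the core firmware headers - which is exactly what makes me wonder whether they all end up in the binary.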

Is this a limitation of the IDE? Would one be able to have more control of this by locally building the firmware?

Hi @mtnscott

With Spark, lots of Arduino-style libraries are available for you to use, but are not necessarily linked into the final binary file if you don’t use them. If the code can’t be reached, the compiler will eliminate it. The Spark protocol takes some room and I believe is always included right now but that could change in the future.
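To make that concrete, here is a rough sketch of how the elimination works, assuming the usual GCC arrangement of -ffunction-sections / -fdata-sections at compile time and -Wl,--gc-sections at link time:

// A helper the sketch never calls from setup() or loop()
int unusedHelper(int x) {
    return x * 42;
}

void setup() { }

void loop() { }

// With -ffunction-sections every function gets its own section, and
// --gc-sections lets the linker discard unusedHelper() entirely, so it
// costs no flash. Reachable code - like the Spark protocol handling the
// core firmware always calls into - cannot be garbage-collected this way.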

That said, I am sure there is some bloat and compiling locally will give you more control and visibility into just what is in your final binary for both flash and RAM usage.

You still have more flash available than on a typical Arduino UNO.


Hi @bko

That's an interesting comment - so I did some research ...

There are three factors that affect how much flash is available for the user application: total flash on the platform, bootloader size, and base firmware size.

Amount of flash on platform (from Arduino IDE boards.txt & Spark docs)

 32K    Arduino UNO, Pro, Mini   (ATmega328)
256K    Arduino Mega 2560        (ATmega2560)
 32K    Arduino Pro Micro        (ATmega32U4)
128K    Spark                    (STM32F103)
128K    RFduino                  (nRF51822)

Size of the bootloader (from Arduino IDE boards.txt & Spark docs)

  0      Arduino UNO
 2K      Arduino Pro Mini
 4K      Arduino Mega 2560
 4K      Arduino Pro Micro
20K      Spark
  0      RFduino

Size of an empty setup() and loop() application (rough estimate of base firmware)

466 bytes  Arduino UNO, Pro, Mini, Micro
466 bytes  Arduino Mega 2560
73K bytes  Spark
466 bytes  RFduino
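For reference, the “empty application” behind those numbers is nothing more than:

void setup() { }

void loop() { }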

Flash available for the user application (total flash - bootloader - base firmware)

31.5K      Arduino UNO
29.5K      Arduino Pro Mini
251.5K     Arduino Mega 2560
27.5K      Arduino Pro Micro
35K        Spark
127.5K     RFduino

So the Spark has slightly more flash available to the user application than several of the Arduino platforms, but significantly less than Arduino Mega and RFduino.

Spark has a lot of potential (SRAM and processing speed); however, the code overhead needed to support the cloud puts it in the smaller category of platforms by reducing the flash available for the user application.

SRAM for comparison

 2K        Arduino UNO, Pro Mini
 2.5K      Arduino Pro Micro
 8K        Arduino Mega 2560
 20K       Spark
 8K        RFduino

Spark takes the lead with available SRAM, which is probably why it can support a TCP/IP stack. :smile:

I should point out that, because of the base firmware on the Spark, the user application can leverage library functions that are already included - in other words, the user application may take up less space in flash on the Spark vs other platforms, thanks to the library routines already loaded in support of the Spark protocol.

I am hopeful that the hardware abstraction re-factor that is underway may provide an opportunity to reduce the base firmware size. :smile:


Thanks for doing the memory comparison, that’s useful info.

Now, what if you took that Mega, added a WiFi shield, and included the WiFi software libraries in your empty baseline sketch? What happens to the available flash? You’d probably want something that at least comes close to the WLAN refreshing that the Spark does behind the scenes on each loop iteration.
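Something like this would be a fairer baseline on the Mega side - a sketch against the stock Arduino WiFi shield library, with placeholder credentials, that joins the network and keeps checking the link:

#include <SPI.h>
#include <WiFi.h>              // official Arduino WiFi shield library

char ssid[] = "yourNetwork";   // placeholder SSID
char pass[] = "yourPassword";  // placeholder WPA2 passphrase

void setup() {
    // joining the network pulls the whole WiFi stack into the binary
    WiFi.begin(ssid, pass);
}

void loop() {
    // rejoin if the link drops - a rough stand-in for the housekeeping
    // the Spark does behind the scenes on every loop iteration
    if (WiFi.status() != WL_CONNECTED) {
        WiFi.begin(ssid, pass);
    }
}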

I know the Spark team has already said that they are looking into further modularizing the WiFi and Cloud code in the firmware. So hopefully in the future, if you don’t need the Cloud features, the build could leave those out.

Obviously, being able to filter out all WiFi code would save a lot. But if you did that, you might as well use something other than a Spark.


What would be awesome would be to add more Flash. Based on the data for the specific processor they selected -

STM32F103 – 72 MHz, up to 1 Mbyte of Flash with motor control, USB and CAN

I think having a separate Flash partition that was protected to support the CC3000 and core WiFi functionality would be a great architecture to add to the Spark-II. Then all you would need is a jump table to get access to that functionality, and that would take much less Flash space in the user application.

This is similar to what Nordic does: they keep the Bluetooth stack in protected flash memory outside of the user application memory, and you just need some hooks to get into the code. You typically don’t mess with the stack often, but you can re-flash it when it needs updating.
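A rough sketch of the jump-table idea - the table layout, the entry names and the 0x08060000 address are all made up for illustration:

// Hypothetical: the protected partition publishes its entry points as a
// table of function pointers at a known flash address.
typedef struct {
    int  (*wlan_connect)(const char *ssid, const char *key);
    int  (*wlan_send)(const void *buf, unsigned len);
    void (*stack_version)(unsigned *major, unsigned *minor);
} wifi_rom_table_t;

#define WIFI_ROM_TABLE ((const wifi_rom_table_t *)0x08060000)

// The user application calls through the table instead of linking the
// whole CC3000/WiFi stack into its own image.
int join_network(void) {
    return WIFI_ROM_TABLE->wlan_connect("myNetwork", "myKey");
}

The user binary then only carries the table definition and the calls through it, not the stack itself.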
