How much Spark Core memory is available in practice?

How much memory is available to me for my own code (.ino file), and how much is available for libraries/frameworks for me to make use of?

I’ve already pored over the docs, for instance the link below, which says there’s 128KB of internal flash memory plus 2MB of external flash memory, and that “the rest of the memory space is available to the user.” It doesn’t specify how much that actually is, though, and I’ve seen varying estimates in the forums.

But even with those, I don’t understand how much of a library has to be held in internal memory. Are the unused portions shunted to external flash?

I simply want to know:

  1. What’s the size limit that I have to deal with for any libraries I want to use?
  2. What’s the size limit for my application.ino file?
  3. Can I expand this by adding a memory card or the like? If so: Does that let me use bigger library files?

(Specifically why I’m asking this: I’m trying to have my Core parse a JSON feed so it can make NeoPixel LEDs glow specific colors to convey a few train/bus arrival times. There are various libraries available to make parsing JSON in C++ easier, but I need to determine whether I have enough memory to accommodate them.)

Thanks!

http://docs.spark.io/hardware/#spark-core-datasheet-memory-mapping

@savage
You have about 35K of Flash space for your user application. Any system libraries that you reference in your code may already be included if they were used in the firmware.

As for SRAM, a good practice is to leave at least 5K of SRAM available for the stack and memory heap.

I found a small JSON library, JSMN, at https://bitbucket.org/zserge/jsmn/wiki/Home. I have not checked what's already been ported to the Spark IDE libraries.

UPDATE - it looks like the JSMN is available in the web IDE.
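
For a rough idea of what using JSMN on the Core might look like, here is a minimal sketch. It is untested against the Web IDE port, so treat everything as an assumption: the header name, the jsmn_parse() signature (some older versions omit the length argument), and the return convention can differ between versions, and the JSON string and token count below are made up for illustration.

    #include <string.h>
    #include "jsmn.h"   // header name in the ported library may differ

    // Hypothetical feed snippet -- the real one would come from an HTTP response
    const char *feed = "{\"route\":\"66\",\"arrival\":420}";

    void setup() {
        Serial.begin(9600);

        jsmn_parser parser;
        jsmntok_t tokens[16];          // fixed token pool keeps RAM usage predictable
        jsmn_init(&parser);

        // Newer jsmn takes the string length; older ports drop that argument
        int r = jsmn_parse(&parser, feed, strlen(feed), tokens, 16);
        if (r < 0) {
            Serial.println("JSON parse failed (or not enough tokens)");
            return;
        }

        // Each token only records start/end offsets into the original string;
        // jsmn never copies, which is what keeps its RAM footprint small.
        for (int i = 1; i < r; i++) {
            for (int j = tokens[i].start; j < tokens[i].end; j++) {
                Serial.print(feed[j]);
            }
            Serial.println();
        }
    }

    void loop() {
    }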


Great to know, thanks.

Where may I find a list of which libraries are already available (and therefore don’t have to be uploaded separately with my application)?

PS to Spark: it would be great to have a command/feature that provides clearer visibility into this: how much of each sort of space is left on my Core, which libraries I’m including extraneously because they’re already available, etc.

@savage When you are logged into the Web IDE you can get a list of available libraries by selecting the ‘libraries’ button.

You can see how much space is used by your application after you compile it by clicking on the ‘circle-i’ button at the bottom of the screen.

Hmm… so back to the same confusion: how much memory do I have for those libraries vs. the libraries I add? Are the ones in that list already included in the firmware?

As for the memory command: that only works in the Web IDE. I develop locally and use my own libraries and multiple files, so that doesn’t work for this… A CLI command for that (or even showing that as part of the compile output) would be great.

Thanks

Some things that are libraries on Arduino are built-in on Spark, like wire, TCPClient, etc. See the doc for details.
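
To make that concrete, here is a rough sketch (not compiled, written from memory of the docs) showing the idea: Wire and TCPClient can be used straight away, with no separate library added in the Web IDE. The I2C address, register, and host name below are just placeholders.

    // No #include lines needed for these -- they ship with the firmware.
    TCPClient client;

    void setup() {
        Serial.begin(9600);

        Wire.begin();                    // I2C master, built in
        Wire.beginTransmission(0x48);    // placeholder I2C address
        Wire.write(0x00);                // placeholder register
        Wire.endTransmission();

        if (client.connect("www.example.com", 80)) {   // placeholder host
            client.println("GET / HTTP/1.0");
            client.println();
        }
    }

    void loop() {
        while (client.available()) {
            Serial.print((char)client.read());
        }
    }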

The size percentage numbers above are a filtered version of the info from the build line you should see when you run make, assuming you really do build locally (and are not using the cloud-based CLI compiler). There is a separate make target for size, so just run make size in the build directory to see the raw numbers. If you are using the cloud-based CLI, I am sure there is a way, but I don’t use it much myself, so someone else will have to answer that.

Only the author of a particular library knows how much RAM it uses, and even then the settings you choose when you use a library can dramatically change the memory usage.

So I don’t think there is a way to know a priori how much RAM a library will use; you just have to try it.


@savage As I said earlier, you have 35K flash available for your user application. That will be your application and any libraries you add to support your application.

The CLI command spark compile source_dir will give you the necessary information to determine what flash and memory is used.

text+data = Flash used
data+bss = SRAM used
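
To make the arithmetic concrete, a made-up example (the numbers are purely illustrative, not from a real build):

    text = 72000, data = 1500, bss = 10500

    Flash used = text + data = 72000 + 1500 = 73500 bytes
    SRAM used  = data + bss  =  1500 + 10500 = 12000 bytes

The data segment counts in both, because initialized globals are stored in flash and copied into SRAM at startup.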


OK, that’s helpful. It sounds like my best approach is to try adding/including any external library I want to use and compiling (spark compile source_dir) to check the size, before trying to build anything with it…

And yes BTW to clarify, I meant to say that I build locally and compile/upload via the CLI (not develop completely locally).

@savage There are three ways to compile your application.

  1. use the Web IDE to write your application, include your libraries, verify the compile and then either download a firmware file to flash locally, or use OTA to program the firmware onto your core.
  2. write your application locally, get your libraries from github, download them and use spark compile source_dir to compile. Then flash the firmware.bin file using spark flash --usb firmware.bin
  3. download the source to the spark firmware locally, write your application, add your libraries, compile locally and use dfu-util or spark flash to program the firmware onto your core.

So it sounds like you are using method #2.

Nope, I’m using #3-ish. That is: #3, but compiling via the cloud-based CLI instead of compiling locally.

I tried #2 for a while but found the button-pushing-sequence requirement each time I updated via USB more difficult and time-consuming than waiting the ~30 seconds per update to use the cloud method. (Especially now that my Core is embedded in an object.)

Hi @savage

What you are describing as 3-ish is really @mtnscott’s #2 above. Using the CLI you can choose a variety of ways to flash your code after compiling and linking in the cloud, including over-the-air, the spark flash CLI command, and using dfu-util directly.

If you are not running gcc on your computer (usually via make), you are not “building locally,” since building means compiling and linking. Just a terminology thing.


In the case of (runtime) RAM, this question is a lot more complicated.
At runtime, memory (RAM) is needed for two parts:

  1. Dynamically allocated RAM
    Normally allocated using malloc or a constructor, if you are using OOP patterns.
  2. Stack
    Used during every call to every function; it holds EVERY variable defined inside a function. This memory is automatically released when the function ends.

These two regions fight against each other: dynamic memory starts right behind your static variables and grows upward through memory, while the stack starts at the top and grows downward. If the two parts overlap, that is your “out of memory” HARD ERROR.

On top of that, the memory the Core needs outside your loop() function depends on what the system is doing at the time. If the Core needs to reconnect and handshake with the cloud, that can take 4000 bytes or more (for the stack alone). So the answer to “how much memory is needed” comes down to the harder question of how much heap and stack are needed at any moment, inside and outside of loop(), for any state of the cloud connection or any other library you use.
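
If you want to watch for that collision at run time, a crude but common trick is to compare the address of a local variable (near the top of the downward-growing stack) with a fresh heap allocation (near the end of the upward-growing heap). A minimal sketch, untested and relying on the memory layout described above; the function name is made up and the 5000-byte threshold is just the rule of thumb mentioned earlier in the thread:

    #include <stdint.h>
    #include <stdlib.h>

    // Rough estimate of the gap between the heap and the stack.
    // A local variable sits near the current stack pointer (stack grows downward);
    // a fresh malloc() lands near the current end of the upward-growing heap.
    uint32_t freeMemoryEstimate() {
        char stackMarker;                        // lives on the stack
        char *heapMarker = (char *)malloc(4);    // lives on the heap
        if (heapMarker == NULL) {
            return 0;                            // heap already exhausted
        }
        uint32_t gap = (uint32_t)(&stackMarker - heapMarker);
        free(heapMarker);
        return gap;
    }

    void setup() {
    }

    void loop() {
        if (freeMemoryEstimate() < 5000) {       // the ~5K rule of thumb from earlier
            // Getting close to a heap/stack collision: shed buffers, avoid deep calls, etc.
        }
    }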


@bko: OK, then make it a #2 but with CLI, hold the USB. And a shot of Jameson neat.

@softmeter: Ai, better make that two whiskey shots. Thanks for explaining. I guess I will compile each library I’m considering, included with the current state of the code, to check that it’s well beneath the available memory limit, and if it is, then write my functions and cross my fingers that my use of the libraries doesn’t push things beyond the limits.

Thanks guys for taking the time to explain all this.


@bko Are the libraries included with spark always included in the bin even if I don’t use any of the functions? For example, if I’m not using Wire, do I have a smaller footprint?

Hi @matt_ri

There is some reduction in memory: for instance, if you are not using TCPClient you don’t get the buffer, but the code is still there. So right now, if you need the absolute most room for your code and the most RAM, you have to compile locally (run gcc on your computer) and edit the sources to remove the stuff you are not using. The other benefit of compiling locally is that you can easily get a report showing what is in memory and how large it is.

In the future I know that @mdma and the rest of the Spark team are committed to improving all of this.


Thanks @bko
