I’m running out of memory on a project (region `APP_FLASH' overflowed) and would like to see what is taking up the most space (and what might even be unused). The project is based on a custom board with a P1 module. Searching the Guide/Docs returns no solutions, so I wonder: how can I profile a project’s memory usage?
The only way I know of to get an idea of how much memory I use requires uncommenting all the P1-specific constants and then compiling like this to get a very general memory analysis:
particle compile photon
I can then flash it using:
particle flash (My P1's DeviceID)
That only builds the file in the cloud and flashes it to the device. I have seen mentions of the “arm-none-eabi-size” tool, and I have it installed, but it seems to require the .elf file. Is there a way for me to keep the compiled files, such as the ELF, for analysis?
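For what it’s worth, recent versions of particle-cli seem to accept a --saveTo flag for keeping the cloud-built binary locally; the flag name here is from memory, so treat it as an assumption. The cloud returns only the .bin, though; the .elf that arm-none-eabi-size needs is only produced by a local gcc-arm build.

```shell
# Assumption: particle-cli supports --saveTo for keeping the cloud-built .bin
# (the cloud returns only the .bin; the .elf needs a local gcc-arm build):
#   particle compile p1 --saveTo firmware.bin
# The .bin is exactly what gets flashed, so its size is a quick flash check:
wc -c < firmware.bin   # prints the app image size in bytes
```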
The first thing I stumble upon is which version of GCC I should use. The guide says version 4.9, but the latest version available from ARM is 6.0. I could of course install the older one, but is that because it’s required, or is the Guide just outdated @peekay123?
So I got this to work. I overcomplicated things by trying to install the latest GCC; using Brew to install it, everything went very smoothly. I can now compile the firmware using the latest 0.6.2 release branch, and I copied my project to the user apps directory. It compiles fine after altering all the file paths in my project, and I get output like this:
   text    data     bss     dec     hex filename
   5340       8    1604    6952    1b28 /myOut/myProject.elf
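Those Berkeley-format size columns map to the hardware as follows: flash usage is roughly text + data (initialized data is stored in flash and copied to RAM at boot), while static RAM usage is data + bss. A quick sketch using the numbers above:

```shell
# Flash = text + data (initializers live in flash, copied to RAM at startup).
# RAM   = data + bss (zero-initialized statics only occupy RAM).
printf '%s\n' \
  '   text    data     bss     dec     hex filename' \
  '   5340       8    1604    6952    1b28 /myOut/myProject.elf' |
awk 'NR > 1 { printf "flash: %d bytes, ram: %d bytes\n", $1 + $2, $2 + $3 }'
# prints: flash: 5348 bytes, ram: 1612 bytes
```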
That output is quite a surprise! This is what I got back from the cloud compiler for the same project:
   text    data     bss     dec     hex filename
 115868     112    3176  119156   1d174 /workspace/target/workspace.elf
What am I missing here? Shouldn’t building locally produce somewhat similar output? Do I have to “flatten” the project structure in some way for my files to compile correctly? I’m compiling using this command in the “main” folder:
make APP=myProject PLATFORM_ID=8 TARGET_DIR=../myOut/
Based on the errors that made me correct the inter-linked files in my project, it seems like the compiler found all my files as it should, but the output looks very incorrect. This is the output from arm-none-eabi-size -A myOut/myProject.elf:
GCC 5.3.1 is the latest supported, and there are a few issues in the firmware repo regarding 5.4.1 and 6.x still to be worked out. As always, we welcome PRs!
I now have GCC 4.9 installed and I’m still not getting a local compile that actually includes all my files. The output is very far from the memory info that I get back from “particle compile p1”.
I’m compiling from the “spark/firmware/modules/” directory of the repository (from GitHub) using the command:
make all PLATFORM=p1 APPDIR=../../../myProject
The compile seems to go fine. I see no errors, just a lot of files being compiled and linked in:
Building cpp file: ../../../myProject//src/InputManager.cpp
Invoking: ARM GCC CPP Compiler
mkdir -p ../build/target/user/platform-8-m/myProject/src/
The final output does not include any of the classes from my project (such as InputManager), and I get the following output:
   text    data     bss     dec     hex filename
   6476       8    1780    8264    2048 ../../../myProject/target/myProject.elf
When compiling from cloud, the output looks like this:
   text    data     bss     dec     hex filename
 125276    5536    8876  139688   221a8 /workspace/target/workspace.elf
So… something is missing, I presume, but I have no idea where to dig further, and I’m really at the memory limit, so I need to resolve this now. Any pointers much appreciated!
@jenschr, the local compile output is correct since your code is being compiled for modular firmware. As such, only your application code and any extra code/libraries not found in the system firmware will take up space. This is the beauty of having modular firmware!
The cloud compile, however, seems to put out stats for a monolithic build and seems to use values for the old Spark Core. I would not depend on those numbers. BTW, this is an old issue raised with Particle.
@peekay123 Ok, so then I have this working, but I can’t seem to make any sense of the output from arm-none-eabi-size -A ./spark/firmware/myProject/target/myProject.elf:
If I understood this correctly, the -A should give me a map of how much memory each class is taking up. The output above just doesn’t make sense to me.
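For what it’s worth, size -A reports per-section totals (.text, .data, .bss and friends), not per-class figures. Per-symbol sizes come from arm-none-eabi-nm instead; something like arm-none-eabi-nm --print-size --size-sort --radix=d myProject.elf | tail should show the biggest symbols last (pipe through arm-none-eabi-c++filt to demangle C++ names). A sketch of that sorting step on made-up nm output — the symbol names are invented for illustration:

```shell
# nm output columns: address, size, type, (possibly mangled) name.
# Sorting numerically on the size column puts the biggest symbols last,
# which is what --size-sort does; names here are invented for illustration.
printf '%s\n' \
  '134218400 00000052 T loop' \
  '134218200 00000164 T _ZN12InputManager6updateEv' \
  '134218100 00000020 T setup' |
sort -k2,2 -n
```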
If I omit the -A, I get this output:
   text    data     bss     dec     hex filename
   6476       8    1780    8264    2048 /spark/firmware/myProject/target/myProject.elf
If I’m indeed only using 6476 of the 108000 bytes available, why am I then running into memory issues with APP_FLASH overflowing? My .bin file is just 6 KB, but I’m definitely well above that. Hmmm…
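One sanity check: a .bin produced by objcopy is essentially the flash image, i.e. text + data from the size output, so if those size numbers were right, the bin size would follow directly (a rough sketch using the figures above):

```shell
# A .bin produced by objcopy contains the flash image, i.e. text + data.
text=6476; data=8
echo "expected .bin size: $((text + data)) bytes"
# prints: expected .bin size: 6484 bytes (consistent with a ~6 KB bin)
```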
@jenschr, I believe you need to compile from the “main” directory, whereas “modules” is used for compiling the 3-part system firmware (though I’m not sure it makes any difference). I have no idea why you would have APP_FLASH issues. Try compiling from “main” and flashing your bin file directly via DFU:
make clean all PLATFORM=p1 APPDIR=../../../myProject program-dfu
I flashed it now and, sure enough, the 6 KB .bin file did not contain the same as the 101 KB one.
It’s basically flashed with nothing, according to the output of “arm-none-eabi-size -A”.
What I should see is a listing of all the classes used in my project and how much they each take up. Right?
Looking at line 3 in the output file (the one with the “rm -f”), I can see .o files for all the classes in my project, so I’m pretty sure it’s all there. All these files have both a header (.h) and a body (.cpp) file. The files that are not listed as being in the “src” folder are in the “lib” folder; “lib” and “src” are both folders at the root level of my project directory (the standard Particle project setup).
These are the files listed and I can’t find any of them in the output from arm-none-eabi-size:
@jenschr, does your code make calls to the libraries? The object files are always produced, but the linker will only bring the code that is actually used into the final .bin file.
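A host-side sketch of that linker behavior, assuming a plain gcc is available (the ARM build uses the same -ffunction-sections / --gc-sections mechanism): a function that is compiled but never called shows up in the object file yet is dropped from the linked binary.

```shell
# Unreferenced functions end up in the .o but are garbage-collected at link
# time when each function gets its own section.
cat > demo.c <<'EOF'
#include <stdio.h>
void used(void)   { puts("used"); }
void unused(void) { puts("unused"); }
int main(void) { used(); return 0; }
EOF
gcc -ffunction-sections -c demo.c -o demo.o
gcc -Wl,--gc-sections demo.o -o demo
nm demo.o | grep -c ' T unused'           # present in the object file
nm demo   | grep -c ' T unused' || true   # absent from the final binary
```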