Hi,
I’ve had a hard time doing complex things with the Spark because of its very limited amount of RAM.
I want to share my findings with you.
There are two types of dynamic memory (heap and stack). The heap starts at the bottom of the available memory (and grows up); the stack starts at the top (and grows down).
If these two borders come close to each other and cross, you get the “out of memory” SOS blink.
Even if you are not using dynamic allocation yourself, some libraries and the Core’s Cloud connection will.
The stack, on the other hand, holds the local variables of each function. So every function call (especially functions that call other functions) fills up the stack. When a function returns, its stack space is freed automatically.
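
To illustrate the difference, here is a minimal, hypothetical example:

void example()
{
    char stackBuffer[64];                   // lives on the stack, freed automatically when example() returns
    char *heapBuffer = (char *)malloc(64);  // lives on the heap until you free() it yourself
    // ... use the buffers ...
    free(heapBuffer);                       // forgetting this leaks heap memory
}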
The following function measures the distance in bytes between these two borders:
extern "C" char *sbrk(int i);

uint32_t freeMemoryAvailable(void)
{
    // the current stack pointer marks the top of the (downward-growing) stack
    register char *current_stack_pointer asm("sp");
    // sbrk(0) returns the current end of the heap
    char *heapend = sbrk(0);
    // the gap between the two is the memory still available to both
    return (current_stack_pointer - heapend);
}
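
For example, a minimal sketch that just prints the value over Serial once a second:

void setup()
{
    Serial.begin(9600);
}

void loop()
{
    Serial.println(freeMemoryAvailable());  // distance between heap end and stack pointer, in bytes
    delay(1000);
}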
In our Spark system (including the Cloud connection, HttpClient, etc.) we were not able to call Cloud functions once this value dropped below 3000 bytes.
If you have a long-running system, you can also publish this value as an event.
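
A minimal sketch of that, using Spark.publish() (the event name "freeMemory" and the one-minute interval are just my choices):

unsigned long lastReport = 0;

void loop()
{
    // publish the free-memory value once a minute ("freeMemory" is a hypothetical event name)
    if (millis() - lastReport > 60000UL)
    {
        lastReport = millis();
        Spark.publish("freeMemory", String(freeMemoryAvailable()));
    }
}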
Are there any other tricks to measure memory at runtime?