Limitations on the use of large character arrays?

I have a Photon application that monitors some data over a longer period of time.
I am using a number of large character arrays for this (8 arrays of 622 characters, to be precise).
I assume there are limits on the number and/or size of such large arrays?
Are these documented anywhere, and if not, does anyone have a good rule of thumb?

The Photon has 128KB of RAM, of which the Device OS needs roughly 40-50KB for itself, leaving you with about 80KB for everything your application requires (stack, static, dynamic/heap, automatic variables, …).

The build summary gives you an indication of how much static RAM your code requires, but it cannot tell you how much will be used dynamically at runtime.

To keep an eye on that memory demand at runtime you can use System.freeMemory().
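For example, a minimal sketch along these lines could periodically print the free heap over serial (the 10-second interval and the serial output are just illustrative choices):

```cpp
unsigned long lastCheck = 0;

void setup() {
    Serial.begin(9600);
}

void loop() {
    // Report free RAM roughly every 10 seconds
    if (millis() - lastCheck > 10000) {
        lastCheck = millis();
        uint32_t freeMem = System.freeMemory();
        Serial.printlnf("Free memory: %lu bytes", (unsigned long)freeMem);
    }
}
```

Watching how this value develops while your application runs will show you whether your dynamic allocations are eating into the headroom.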


@Jan_dM 8 x 622 is 4,976 bytes, so that should not be a problem at all. If you do find yourself short on free memory, you could move some of the buffers into retained RAM, e.g. retained char array[3][622] (3 x 622 = 1,866 bytes), which fits in the roughly 2K of retained space.
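A sketch of what that could look like (the buffer names and the 3/5 split are illustrative, not a recommendation; retained variables need backup RAM, which is typically enabled as below and retains its contents as long as the device stays powered or VBAT is supplied):

```cpp
// Enable use of backup RAM for 'retained' variables (Particle Device OS feature)
STARTUP(System.enableFeature(FEATURE_RETAINED_MEMORY));

// Three of the buffers live in retained (backup) RAM and survive resets
retained char retainedBuffers[3][622];

// The remaining five buffers stay in normal static RAM
char buffers[5][622];
```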
