I am creating what I think is a large array

I was working on building a 3D array and wondered whether I was making one that would be too big for the Photon or Argon to use. It will be 288 * 5 * 7 = 10080 data points.

Is there a good way to know if I will be getting close to the upper limits of what the Photon or Argon can do?

    //Declare variables for the daily array average.
    const int NumReadings = 287;  // How many 5-minute intervals are in a day: (60 min / 5 min) * 24 hr = 288 (index 287 for 0 position)
    const int NumAverages = 4;    // How many previous days of data to average (i.e. 5 Sundays) (index 4 for 0 position)
    const int NumDays = 6;        // How many days of the week there are.
    int ReadIndex = 0;            // Keeps track of the index of the current reading.
    int AveIndex = 0;
    int DayIndex = 0;
    int Intervals[NumReadings][NumAverages][NumDays];   // Day's schedule intervals
    int Total = 0;
    int LearnedBias = 0;

Actually no, it’s 287 * 4 * 6 fields (you are not setting the maximum index of the array but the element count, with indexes ranging 0 .. (count - 1)), each 4 bytes (int), hence your array is 27552 bytes or ~26.9KB.
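
For illustration, here is a minimal sketch with the counts corrected to what your comments intend (288/5/7); the compiler reports the true footprint via `sizeof`. The constants here are illustrative, not your actual code:

    #include "Particle.h"

    const int NumReadings = 288;  // 288 elements, valid indexes 0 .. 287
    const int NumAverages = 5;    // 5 elements, valid indexes 0 .. 4
    const int NumDays = 7;        // 7 elements, valid indexes 0 .. 6

    int Intervals[NumReadings][NumAverages][NumDays];

    void setup() {
        Serial.begin(9600);
        // 288 * 5 * 7 * sizeof(int) = 10080 * 4 = 40320 bytes (~39.4 KB)
        Serial.printlnf("Intervals occupies %u bytes", (unsigned)sizeof(Intervals));
    }

    void loop() {}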
But yes, that is large, and the build stats you get at the end of a build do give you the number of bytes required, including all your other global/static variables.
But to estimate whether this brings you close to the limit, you also need to consider the dynamic/automatic memory demands your code will produce at runtime, as in the sketch below.
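
For example (a hypothetical function, not from your code): a large local array is automatic (stack) storage, so it never shows up in the build's static RAM figure, yet it still consumes memory every time the function runs:

    // Hypothetical illustration: this buffer lives on the stack, invisible
    // to the build stats but still 512 * 4 = 2048 bytes at runtime.
    void processDay() {
        int scratch[512];
        for (int i = 0; i < 512; i++) {
            scratch[i] = 0;   // fill with a placeholder value
        }
        // scratch goes out of scope here, releasing the stack space
    }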

System.freeMemory() will give you a feel for how your code's memory usage behaves over time.
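
A minimal sketch, assuming you just want to log it periodically (the 10-second interval is an arbitrary choice):

    #include "Particle.h"

    unsigned long lastCheck = 0;

    void setup() {
        Serial.begin(9600);
    }

    void loop() {
        // Log free RAM every 10 seconds to watch for leaks or fragmentation.
        if (millis() - lastCheck >= 10000UL) {
            lastCheck = millis();
            Serial.printlnf("Free memory: %lu bytes", (unsigned long)System.freeMemory());
        }
    }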
