Well, I could look at the source, but that could be quite an exercise: there are likely to be many #define’s, and only some of them are meant for the application programmer’s use. They could fall out of use, become deprecated, or just disappear overnight. I had hoped there might be a document saying which #define’s are reliably there for our use.
E.g., ULONG_MAX is defined in the standard C programming environment and works for the Spark Core too. I don’t therefore have to program as follows, OR DO I?, if I want to write portable code, which is the subject of this thread, after all. Where is the doc that says I can use ULONG_MAX?
#ifndef ULONG_MAX
#define ULONG_MAX 4294967295UL // or whatever
#endif

or, to fail the build instead:

#ifndef ULONG_MAX
#error ULONG_MAX is not defined
#endif
But, as I say, I don’t need to program like that, because I can rely on
&lt;limits.h&gt; to supply that #define. Or can I? And if I can, why can’t I rely upon a #define to tell me that A2D_BITS is 12? Maybe I can, but I don’t know. Hence this thread.
The C standard lays out exactly which #defines an implementation must make available. Usually. Where is the corresponding definition for the Spark Core’s C-like programming language?
Another way of asking my question is this (because the answer would [should!] cover the #defines I want to know about): which #include files are available to me, and which of them is it not deprecated to use? If there is a list and I have missed it, please advise.