Code portability - Uno, Leonardo, SparkIO - and language definition

I want to write code which is portable between different types of Arduino, Arduino-workalikes, Arduino-compatibles, and Arduino-similar systems. E.g. I would like to code as in the following example, but I do not know which preprocessor macros are pre-defined. Anyone?

    #ifndef SPARKIO
    #include <mylib.h>
    #endif
    // ...
    #ifdef LEONARDO
    #define VOLTAGE 5.0
    int LEDPIN = 13;
    #endif
    #ifdef SPARKIO
    #define VOLTAGE 3.3
    int LEDPIN = D7;
    #endif

This thread might help:

The one you want for Spark is:

#define SPARK_CORE (1)

for version 1 of the core. Arduino has a similar version number but I don’t know about the flavors of Arduino.
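For example, a minimal sketch combining the two (assuming the Arduino side pre-defines ARDUINO, which the Arduino IDE sets to its version number; the pin names and voltages are only illustrative):

    #if defined(SPARK_CORE)
      #define VOLTAGE 3.3
      int LEDPIN = D7;     // on-board LED on the Spark Core
    #elif defined(ARDUINO)
      #define VOLTAGE 5.0
      int LEDPIN = 13;     // on-board LED on most Arduino boards
    #endif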


OK, thanks for the link to that thread. There are other differences between Arduino-similar systems. E.g. there's a 10-bit A2D on most Arduinos but a 12-bit one on the Spark Core. Now, I could code

#ifdef SPARK_CORE
#define A2D_BITS 12
#else
#define A2D_BITS 10
#endif

but I guess a constant like that is already defined. Where can I find a list of the #define’d macros and constants available to us? And it would be nice if the names of these were the same as in the Arduino environment, if possible.

I think you need to look at the source:

I don't know how these would line up against, say, Arduino, since very few #define's are meant to be public interfaces, such as SPARK_CORE and ARDUINO.

Well, I could look at the source, but that could be quite an exercise as there are likely to be many #define's and only some of those are meant for the application programmer's use - they could fall out of use, become deprecated, or just disappear overnight. I had hoped there might be a document which said which #define's are commonly there for our use.

E.g., ULONG_MAX is defined in the standard C programming environment and works for the Spark Core too. I don't therefore have to programme as follows, OR DO I?, if I want to write portable code, which is the subject of this thread, after all. Where is the doc that says I can use ULONG_MAX?

#ifndef ULONG_MAX
  #ifdef SPARK_CORE
    #define ULONG_MAX 4294967295UL // or whatever
  #else
    #error ULONG_MAX is not defined
  #endif
#endif

But, as I say, I don’t need to program like that because I can rely on <limits.h> to supply that #define. Or can I? And if I can, why can’t I rely upon a #define to tell me A2D_BITS is 12? Maybe I can but I don’t know. Hence this thread.

The C language definition for an architecture lays out exactly which #defines are available. Usually. Where is that definition for the Spark Core C-like programming language?

Another way of asking my question is this (because the include files would [should!] contain the #defines I want to know about): which #include files are available to me? And the use of which of them is not discouraged? If there is a list and I have missed it, please advise.

Hi @psb777

I don't think what you are looking for exists for either Arduino or Spark. All of the C/C++ #defines such as those in limits.h are there, of course, but you are looking for board/architecture-specific stuff like the ADC bits per sample. Frankly, I don't think any of the #defines in core-firmware except for SPARK_CORE should be relied upon, for the reasons you state above, but that is a decision only the programmer can make.

Sometimes within a family of boards, like say the TI DSPs, there will be board support packages that define common things that you can rely on within that family, but if you moved to say a Raspberry Pi, these would not be there.

Also, I have never found depending on limits.h #define's to be a good way to go. I generally use the types in stdint.h like uint8_t and int16_t if I want a particular size, and if I need to make decisions about sizes, I use sizeof(), but I suppose that comes down to personal preference.
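As a minimal sketch of that approach (the names here are only illustrative):

    #include <stdint.h>

    uint8_t flags   = 0;    // exactly 8 bits on any conforming compiler
    int16_t reading = 0;    // exactly 16 bits, signed

    // Size the buffer from the element type rather than from a
    // board-specific macro.
    #define BUFFER_BYTES 64
    int16_t buffer[BUFFER_BYTES / sizeof(int16_t)];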


OK, once again very helpful. But it seems there is information not disclosed or not made readily available to Spark Core application programmers. E.g. you and @peekay123 give good advice at https://community.spark.io/t/round-function-error-web-ide-solved/3781?u=psb777 to #include <math.h> so as to make the STM32 math library available. Perhaps the web page naming the libraries and functions which are already available exists somewhere (perhaps in plain view - I can be blind sometimes), but the comprehensive list is NOT at http://docs.spark.io/#/firmware/data-and-control-spark-publish - math.h is not mentioned on that page. Is there a list of all the functions and libraries available as standard for our use?

See also https://community.spark.io/t/conditionally-include-spark-includes-define-spark-core-isnt-viable-what-is/4228

We build with the arm-none-eabi-gcc toolchain, which includes Newlib. If you click on Docs on the left side of that page, you’ll have links to either the standard library or math documentation. Any of the headers listed there can be included.
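For example, a minimal sketch (assuming the build provides application.h as usual, with the standard headers coming from newlib):

    #include "application.h"   // Spark/Wiring environment
    #include <math.h>          // sqrt(), sin(), round(), ...
    #include <string.h>        // strcmp(), strlen(), ...
    #include <stdlib.h>        // atof(), malloc(), ...

    double hypotenuse(double a, double b) {
        return sqrt(a * a + b * b);
    }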

Good idea to document this @psb777. I’d love for you to submit a pull request to the docs repo adding it to the place you think would be most intuitive and helpful.

My stubborn aversion to github might now be becoming apparent, but that isn't it, really not. The situation is that I cannot spend the time reading the code to work out what the docs should say. [This is the wrong way around, in an ideal world, anyway. It does seem to me that some Spark code exists before any docs!] But I can see limitations in the current docs and I hope it is useful if they're pointed out.

In this case, why cannot the newlib docs at least be pointed to from the Spark docs? Even better, can we please get rid of the supposedly friendly list of functions and provide instead formal definitions of them, with proper function prototypes, proper definitions for each parameter, and the exit and failure modes and the corresponding return codes. And information as to whether errno is set, and to what. All that is available for newlib as a simple cut and paste. For example, this is what a Spark application programmer needs to know about atof().

https://sourceware.org/newlib/libc.html#atof
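To give a flavour of the kind of detail being asked for (a hedged example, not copied from the docs): atof() is declared in <stdlib.h>, takes a C string, and returns 0.0 when no conversion can be performed.

    #include <stdlib.h>

    double parseVoltage(const char *text) {
        return atof(text);   // e.g. "3.3" -> 3.3, "abc" -> 0.0
    }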

Have you looked at the Arduino Wiring docs?

http://arduino.cc/en/Reference/HomePage

I think something more like this is what a lot of people who come to Spark are looking for. I think the newlib doc is great for experienced people (who probably don't need it) but would be really, really intimidating for most of the people who come here.

You have a point, but only that a gentle intro to anything is needed. We need a "Spark Core for Dummies" book. But you carry your argument too far. Dumbing down? I'm not in favour of it when it creates a barrier between those who know and those who don't. That's exactly what we have here: a very capable platform, brilliant, many of whose features remain impossible to learn because the docs are inadequate. That newlib is what is used is hidden - it is not in the docs! That there is a whole host of available functions is hidden. But not to you, because you read the code. And, slowly and gradually, not to me. BUT what a struggle to get here! And no newbie programmer, however capable or ambitious, has much hope of approaching your level of knowledge, of ever learning what is not documented!

I cannot believe anyone thinks an inaccurate, flawed, incomplete description of a function is helpful to a beginner programmer. My first encounter with the words "prototype" and "synopsis" was over 35 years ago when, perhaps still in shorts, I learnt to program by reading the online manual pages of Unix version 6/7. At school they called variables "boxes". None of my classmates learnt anything.

@psb777 I think you're expecting the Spark guys to document the entire C++ language. You're forgetting that there's very little that's specific to the Spark aside from the Wiring port (already documented elsewhere) and the Spark cloud functions; the rest is plain C++. You can't seriously expect the docs to cover when you should use math.h and what atof() does?!

Personally I think the docs already are a bit "for dummies" - very cloud/wiring-centric. The kind of stuff you're talking about is for more advanced users, who should at least have a local copy of the repos; then they'd see newlib is mentioned on the 2nd line of the makefile, and it's in the src/ directory of the firmware.


No, I am not expecting anything other than that all the information necessary to take FULL advantage of the Spark Core be accessible from the Spark Core documentation. Currently not even a pointer to newlib exists there. And a reading of newlib quickly makes you realise that some functions are likely not supported, and the re-entrant stuff is likely not relevant. But a basic knowledge of computer science is needed to make those inferences. What is needed is a complete list of the supported functions, libraries and includes, a list of which pre-compiler constants are pre-defined and usable, and a list of what is possible to use but deprecated. Argue against that, if you will.

I think it would be a fundamental mistake of yours, but more importantly of the Spark team's, were you or they to expect normal FULL use of the Spark Core to require local builds. This is a cloud platform. The aim should be that well nigh everything is possible via the Cloud. One ought to be able to take full advantage of the Spark platform using only a Chromebook or an Internet cafe PC. And what is possible must be easily discernible from the documentation. In fairness I don't think I have ever seen a Spark team founder/employee say anything different, although there are a few who do venture to speak for the Spark team, or seem to, who seem to agree with your point of view. So far, well nigh everything is possible from the cloud. What is lacking is the documentation.

It would be a grave mistake to require reading of the source code so as to work out how to do things, or to work out what is possible. I’ve been using C and Unix and Perl and many other technologies for years and never has anything REQUIRED me to read the source of the Linux kernel, or the source of the gcc compiler, or the source of the Perl interpreter. It is just not reasonable to tell me that line two of such and such a source code file references newlib.

I am deliberately not building locally. My only compromise is that I do use the Spark CLI to compile, in the cloud, because I need to keep my source code locally: I need source code control, and I would like to (and do) re-use some of it for the Arduino. I even have my own little string library, written in 1994, which works everywhere, including both on Arduino and on Spark, unchanged save for a pesky include here or there. That's real portability, but there was nothing in the Spark docs, or referenced by the Spark docs, that let me know that strcmp() or index() or malloc() was available for use by my library, for example, and that is the subject of this forum topic.
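The "pesky include here or there" amounts to something like this (a hedged sketch, using the SPARK_CORE and ARDUINO macros discussed earlier in this thread):

    #if defined(SPARK_CORE)
      #include "application.h"   // Spark/Wiring environment
    #elif defined(ARDUINO)
      #include <Arduino.h>       // Arduino core
    #endif
    #include <string.h>          // strcmp()
    #include <stdlib.h>          // malloc(), free()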

I don't necessarily want to be told how to use atof(). I need to know it is there, available for use. It is not mentioned in the docs. Is this how you like it?

Yes, the doc for atof() should give a prototype, a functional synopsis, a description of the parameters, the fact that some header file or other must be included, and the error and failure modes. There is a web page in the newlib docs. There is no need for that doc to be copied over. The function needs to be mentioned and a link provided. That's all.

But only if you want people to know the function is available, and how and what to use it for, without having to ask the guardians of the knowledge in the forum. Do you?

Also, the entire set of functions one would expect is not present - only newlib. As far as I know, as I have been told. So it isn't the case that I should just know that this is C++ and everything is available. Not everything is available.

FWIW: In case you run into issues like I did once upon a code time… since SPARK_CORE is defined in application.h, you can't use SPARK_CORE in your preprocessor directives before application.h is included.
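A small illustration of that pitfall (a sketch, assuming SPARK_CORE only becomes visible once application.h has been pulled in):

    #ifdef SPARK_CORE          // not yet defined here, so this branch is skipped
      // Spark-specific setup would be silently dropped
    #endif

    #include "application.h"   // SPARK_CORE is defined inside this header

    #ifdef SPARK_CORE          // now defined, so this branch is compiled
      // Spark-specific setup
    #endif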

I haven't tested this again, but I thought the change had already been committed and merged whereby SPARK_CORE is defined on the compiler command line and so is accessible without including application.h - or, if it is not that macro which is pre-defined there, a new one is, thus allowing us better portability.

I can't speak for the Web IDE, but the local build is driven by the makefile and I don't see SPARK_CORE defined there:
https://github.com/spark/core-firmware/blob/master/build/makefile

It’s here in the application.h file:
https://github.com/spark/core-firmware/blob/master/inc/application.h#L28-L29

It is pull request #193 in core-firmware and has not been merged yet. You can subscribe to notifications for it on this page:


Thanks @bko, you can bet that I’m subscribed to all :wink: I remember seeing it now.
