How to use standard C stream IO like stdout

Hi,

A while ago I asked about using the standard C functions for IO, like puts(). You showed me some examples of the Arduino way, but I want to reuse all my old code, so I prefer the stdlib functions.

After some searching I found the newlib_stubs.c file, which seems to implement all the basic functions for that. But how can I enable it? I found this inside:

#ifdef __CS_SOURCERYGXX_REV__
#if USE_UART_FOR_STDIO
#ifndef STDOUT_USART
#define STDOUT_USART 2
#endif

So I think I have to define __CS_SOURCERYGXX_REV__ and STDOUT_USART somewhere, but how and where?
If I simply try to use puts(), I get tons of linker errors about _write, _lseek, and all the other low-level functions.

BTW: Are the newlib sources somewhere in the local build environment of the Spark? And does anyone know how to set up a plain standard C project without using any of the special Spark stuff? If I don’t want to use the Spark cloud at all, that would surely save a lot of memory for my own project, wouldn’t it?

Thank you
Thorsten

Where do you want the output from puts() etc… to go? Unlike a normal command-line app (which C stdio was modeled on), there are limited options on an embedded device for where to send the stdio streams.

We don’t have the newlib sources - they are part of arm-gcc - you can search for that and I’m sure you’ll find them.

Sure, it is possible to avoid using the spark cloud, or the entire spark firmware if that’s what you need. But then you’d be coding against bare metal. It’s not for the casual coder - something as simple as digitalWrite(A0) turns into a whole bunch of code. You also have to manage the linker, interrupt vectors, startup code (memory initialization) and other aspects. So possible, but not simple.

Tell us more about the project you have in mind and we can guide you to an optimal approach.

Hi,

Sorry the answer is a bit lengthy.

I don’t want to code against bare metal. I would like to use the TCP/IP stack and all the IO facilities supplied by the standard C library, as I would in a command-line tool on Linux or elsewhere. My problem with the Arduino-like stuff is that it isn’t standard, so you have to reinvent the wheel or search for an equivalent from the Arduino world.

A good example is a FAT32 SD card library. Yes, there is one for the Spark, but it can only open one file at a time and doesn’t support long filenames. There are several libraries out there which do the job, but they usually have a connection to the C stream IO system; you cannot simply use them on the Spark/Arduino. Not to mention source code that uses file IO. There is a standard: fopen, fread, fclose, etc. Why does Arduino reinvent it and even fail to standardize it?

So I would prefer standard C file or stream IO, a standard socket interface for IP and so on.
Where should stdout go? Simple: wherever you like. At program start you define where it should go: UART, USB serial, or even telnet. The support seems to be there in the Spark code, as I found in that newlib_stubs file, but I don’t know how to use it.
Look at NutOS; that would be my favorite, but it hasn’t been ported to the Spark yet.

My projects are simple devices like a light controller or a web radio, which I can control via an integrated web server. They should NOT be connected to the outside world. I don’t want a light switch that stops working when the internet connection is down; I simply don’t need remote access to my stuff. And I don’t want to run a 24/7 server for such simple tasks.
Currently I am building a talking LED matrix clock which can also play MP3s, from SD and maybe later from the internet. I want to set it up through an internal web server or a simple telnet command-line interface. I implemented telnet using the telnet example, but it is extremely slow; it feels like a 2400 Bd modem connection. My Ethernut does that 5 times faster … on an AVR ATmega128!

I understand my statements may sound a bit aggressive to the Arduino people. If you started programming on the Arduino, it may all be simple and nice. But if you are an old-fashioned C programmer who already has difficulty understanding all that class stuff, it is very hard. I like the small footprint and powerful hardware platform of the Spark, and I have sometimes really asked myself whether it would be easier to start from scratch using only the network stack and nothing more. But I don’t know what I really need so that I don’t brick my Spark and can still flash it.

Finally, I hope I haven’t offended anyone. I really don’t want that.

Thank you for understanding
Thorsten

Hi @Bluescreen

I think what you want to do is perfectly reasonable, but you need to recognize that it is not “impedance matched” to what Spark is all about, so you will have some frustrations here. For example, as @mdma said, where should puts() send its output? The entire infrastructure you are assuming when you write puts() has not been written for this platform–there is no kernel.

Have you tried a Raspberry Pi? It seems like a much better fit to your mental programming model.

Also there is/was a NuttX port for the core:

The entire infrastructure you are assuming when you write puts() has not been written for this platform–there is no kernel

I don’t agree with that. The stream IO in C is not in the kernel; it is in newlib or whatever C runtime library is used. All you have to do is attach a simple character input/output routine to it. Maybe it is somehow disabled by some compiler option.

If you use gcc out of the box, write a simple hello world, and attach the character output routine (say, to the UART) to newlib, it works on other systems, like a simple AVR-based microcontroller board.

// a simple hello world
#include <stdio.h>

int main(void)
{
    puts("hello world");
    while (1);  /* spin forever, as usual on a microcontroller */
}

I don’t know if NuttX is the same as NutOS, but it sounds nice. I will check it.

A full-featured Linux is oversized, I think. It takes too long to boot and has many features you do not need. And it is not realtime, so simply multiplexing an LED matrix without flicker is impossible, or at least very complicated, to implement as a kernel device.

I like my WinAVR. That is the way I like to program.

For now I have given up on printf and the like, but that does not help. I want to implement a FAT32 library with long filename support. I can compile it, but it also relies on basic routines like _write, _read, _stat, etc. Same problem as with printf. I searched the whole code for standard IO functions and even removed the includes of stdio.h, but it does not help.

All these needed functions are defined in the mentioned newlib_stubs.c file, which is provided in the Spark’s source files, but it seems not to be used. I also cannot compile C files unless I rename them to cpp, even if I add them to the makefile like this:

CSRC += $(TARGET_FAT32_SRC_PATH)/sd.c

It seems that the Spark’s local build system simply ignores C files, maybe including that newlib_stubs.c file.

I think I have to give up. I don’t understand this system. Maybe I have to switch to another platform, or find a bare-metal tutorial for ARM and ignore all the Spark stuff. :frowning:

Frustrated
Thorsten

The build system does support C files - there are plenty of C files in the firmware sources. Also, I suggest you use the feature/hal branch when compiling locally - the make system has some key improvements there.

If you want to hook the very low-level _write(), _read() etc… functions, then you should try adding those to your code, since the sources in newlib_stubs.cpp are commented out. At first leave them out; you should then at least see linker errors complaining that these functions don’t exist. Once you implement them in your own module, you are free to manage the stdio streams as you need.
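A minimal stub module along those lines might look like this. This is only a sketch: uart_putc() is a made-up stand-in for whatever low-level output routine your firmware actually provides; here it just fills a RAM buffer so the example is self-contained.

```cpp
// stubs.cpp: minimal newlib syscall hooks, sketched for illustration.
// uart_putc() is a stand-in for a real low-level output routine; here
// it only appends to a RAM buffer so the example is self-contained.
#include <cstddef>

char   out_buf[256];
size_t out_len = 0;

static void uart_putc(char c)
{
    if (out_len < sizeof(out_buf))
        out_buf[out_len++] = c;
}

// extern "C" matters in a .cpp file: newlib's _write_r looks up the
// plain C symbol `_write`, not a C++-mangled name.
extern "C" int _write(int fd, char *buf, int len)
{
    (void)fd;                    // send every stream to the same sink
    for (int i = 0; i < len; i++)
        uart_putc(buf[i]);
    return len;                  // report all bytes as written
}

extern "C" int _read(int fd, char *buf, int len)
{
    (void)fd; (void)buf; (void)len;
    return 0;                    // no input in this sketch
}
```

Once these are linked in, puts() and printf() funnel their characters through _write().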


I have now added the _write etc. functions to my code, but then make complains about duplicate symbols.

Without my own _write code I get errors like this:

c:/program files/gnu tools arm embedded/4.8 2014q2/bin/../lib/gcc/arm-none-eabi/4.8.4/../../../../arm-none-eabi/lib/armv7-m\libc_s.a(lib_a-writer.o): In function `_write_r':
writer.c:(.text._write_r+0x10): undefined reference to `_write'

With my code I get:

./obj/applications/SDCardTest/stubs.o: In function `_write(int, char*, int)':
D:\Entwickl\Spark\core-firmware\build/../applications/SDCardTest/stubs.cpp:101: multiple definition of `_write(int, char*, int)'
./obj/applications/SDCardTest/application.o:D:\Entwickl\Spark\core-firmware\build/../applications/SDCardTest/stubs.cpp:101: first defined here

Why? First it’s not there, then it is doubled. OK: since the makefile scans for all c and cpp files, it seems to compile my stubs.cpp once on its own, and a second time because I #included it in my application.cpp. So I put the _write function directly into my application.cpp and removed my stubs.cpp. But then the linker does not find my _write function again, just as it was without my own supplied functions.

If I have to give up on getting these functions implemented, how can I find out which part of my FAT32 library uses them? I don’t see any calls that should use them.

OK, now I have got it working. I added these lines to the newlib_stubs.cpp file:

#define __CS_SOURCERYGXX_REV__
#define USE_UART_FOR_STDIO 1

I don’t know what the first macro means, but without it all the _write… functions are excluded. I had tried adding it to the main makefile before, using

CFLAGS += -D__CS_SOURCERYGXX_REV__

but that didn’t do it. What is the right way to add it?

Another problem was the missing functions _fstat and _isatty in that file. I implemented empty versions of them and voilà, it works.
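For reference, “empty” versions of those two stubs typically look something like this. This is a sketch, not the exact code from the thread; struct stat and S_IFCHR come from sys/stat.h, and the extern "C" is needed when the file is compiled as C++.

```cpp
// Minimal "empty" stubs: just enough to satisfy newlib's stdio machinery.
#include <sys/stat.h>

// Claim that every file descriptor is a character device (no seeking).
extern "C" int _fstat(int fd, struct stat *st)
{
    (void)fd;
    st->st_mode = S_IFCHR;
    return 0;
}

// Claim that every file descriptor is a terminal, so stdio picks
// line buffering instead of full buffering.
extern "C" int _isatty(int fd)
{
    (void)fd;
    return 1;
}
```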

Now I am happy to debug my FAT32 library.

Mentor Graphics acquired Code Sourcery a few years back. You could try poking around here to learn more:

It looks like you flipped a switch in newlib that makes it think you are using those functions from the Code Sourcery library.

If you put the _write etc… functions in application.cpp, then you would have to declare them as extern "C" to be sure they got the correct linkage, otherwise they will be name-mangled as C++ functions, and the linker won’t find them.

As you found out, you don’t need to manually add additional sources to the makefiles, they are included automatically, and if you do, they will be linked twice.

Glad to hear you got it working. :slight_smile:

I’ve just started to look into this as well, to get better portability for the code I pull in. I am using FatFs and would like to add a small shim on top of it. I added a newlib_stdio.cpp to my app where I implement _read and _write (both inside extern “C”), but I get linker issues. The weird thing is that I can’t find the related sources; looking at GitHub, there are checked-in binaries instead:

/home/user/project/src/newlib_stdio.cpp:10: multiple definition of `_read'
../hal/src/photon/lib/common_GCC.a(stdio_newlib.o):/Users/mat1/dev/spark/photon-wiced/WICED/platform/GCC/stdio_newlib.c:115: first defined here
../build/target/user/platform-6/src//libuser.a(newlib_stdio.o): In function `_write':
/home/user/project/src/newlib_stdio.cpp:33: multiple definition of `_write'
../hal/src/photon/lib/common_GCC.a(stdio_newlib.o):/Users/mat1/dev/spark/photon-wiced/WICED/platform/GCC/stdio_newlib.c:135: first defined here
collect2: error: ld returned 1 exit status

I’d like to see the sources that were used to build common_GCC.a and, ideally, just have them in the repo.

Also just to be sure, can we follow this approach?


Or where can I see the exact prototypes?
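For reference, the stub prototypes are documented in the newlib manual under “System Calls”. A sketch of the commonly needed ones is below; the exact integer types can vary slightly between newlib versions, so check the toolchain you build with, and in a .cpp file wrap them all in extern "C".

```c
/* Newlib syscall stub prototypes, per the newlib manual ("System Calls"). */
#include <sys/types.h>   /* caddr_t */
#include <sys/stat.h>    /* struct stat */

int     _open(const char *name, int flags, int mode);
int     _close(int file);
int     _read(int file, char *ptr, int len);
int     _write(int file, char *ptr, int len);
int     _lseek(int file, int ptr, int dir);
int     _fstat(int file, struct stat *st);
int     _isatty(int file);
caddr_t _sbrk(int incr);
```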