Fix for #include "avr/pgmspace.h"

PGM functions are not available outside of the AVR world, so I’ve used the following as a replacement. Can anybody validate it?

#define PROGMEM
#define pgm_read_byte(x) (*(x))
#define pgm_read_word(x) (*(x))
#define pgm_read_float(x) (*(x))
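For reference, this is the kind of Arduino library code the shim is meant to keep compiling (a made-up table, not from any particular library):

// Hypothetical AVR-style lookup table: on AVR, PROGMEM places it in
// program flash and pgm_read_byte() fetches it with an LPM instruction.
// With the shim above, PROGMEM expands to nothing and pgm_read_byte(x)
// becomes a plain dereference, so the same source compiles here too.
const unsigned char sine_table[] PROGMEM = { 0, 49, 90, 117, 127, 117, 90, 49 };

unsigned char sine_lookup(int i) {
  return pgm_read_byte(&sine_table[i]);
}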

It’s a little more complicated than that, unfortunately:

I was hoping the Spark Team already had some helper functions for reading/writing FLASH. In Arduinoland it’s typically used to save RAM; however, here in Sparkland we’ve got a fair amount of RAM. That said, someone is going to need it someday, and I’d like to see it implemented.

For now, in libraries that use pgm_read_byte(), I’ve just been converting the FLASH tables to plain int arrays.
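To make that conversion concrete, a made-up example (the names are mine):

// An AVR-style flash table like this...
//   const unsigned char table[] PROGMEM = { 10, 20, 30 };
//   unsigned char v = pgm_read_byte(&table[i]);
// ...becomes a plain array that the compiler keeps in normal memory:
const int table[] = { 10, 20, 30 };

int lookup(int i) {
  return table[i];
}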


Looks like reading is not as complicated as I thought. In main.cpp there’s a function that pulls the Device ID out of flash memory:
https://github.com/spark/core-firmware/blob/master/src/main.cpp#L470-L493

IDs are defined as 32-bit address pointers.

And here’s another way to do it, with memcpy():
https://github.com/spark/core-firmware/blob/master/src/wifi_credentials_reader.cpp#L77-L94
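Stripped to the essentials, both come down to reading a memory-mapped flash address. A rough sketch (0x08004000 is just a placeholder, not the real Device ID location):

#include <stdint.h>
#include <string.h>

// Placeholder flash address, for illustration only.
static const uint8_t *flash_data = (const uint8_t *)0x08004000;

void read_from_flash(void) {
  // 1) Direct dereference: STM32 flash is mapped into the normal
  //    address space, so a plain pointer read works.
  uint8_t first = flash_data[0];

  // 2) memcpy() a block from flash into a RAM buffer.
  uint8_t buf[12];
  memcpy(buf, flash_data, sizeof(buf));

  (void)first;
  (void)buf;
}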

Getting the data into flash at compile time shouldn’t be too hard, as long as we know what memory spaces are available to us. Doing it dynamically at runtime will be more difficult. If anyone can offer more info, please add to the conversation any time!

Is this usable? Let’s get (non)volatile

@onkie that’s usable for sure, but it’s different memory. That example reads and writes the external 2MB flash memory chip, whereas the pgm_*() functions read from the on-board program flash.

@rlogiacco - if you don’t need the values to be stored in flash and have enough space to store them in RAM, then your solution is good. I’ve done the same until a suitable alternative is available.

PROGMEM is only needed on AVR because the AVR has separate program and data memory spaces (a Harvard architecture). On the STM32 there is no need for PROGMEM because the address space is unified.

Declare the memory you wish to be placed only in flash as static const and the linker will keep it in flash only. No special macro is needed to read from a flash address on the STM32.

Your macro will work, but it does not confine the allocation to flash - the C startup code will end up duplicating the data in RAM.
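A minimal sketch of the difference, assuming the usual GCC section conventions (the names are mine):

#include <stdint.h>

// static const lands in .rodata, which the linker keeps in flash only.
static const uint8_t table_in_flash[] = { 1, 2, 3, 4 };

// Without const, the same data lands in .data: it is stored in flash
// AND copied into RAM by the C startup code before main() runs.
static uint8_t table_in_ram[] = { 1, 2, 3, 4 };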

Yes, that’s right: static const. I realized this after writing my post. When I was porting the Arduino code I did some tests, and the compiler does correctly place the data in program flash without duplicating it in RAM.

So perhaps this should be part of the Spark wiring headers, so that it’s easy to maintain compatibility with Arduino libs.

Here is what Lady Ada over at Adafruit did in her libraries:

#ifdef __AVR__
 #include <avr/pgmspace.h>
#else
 #define pgm_read_byte(addr) (*(const unsigned char *)(addr))
#endif

I have a pull request open around these issues. Please take a look and support it if you think it’s the best solution! https://github.com/spark/core-firmware/pull/187

Just a quick clarification here: my goal wasn’t to provide access to program memory, but to provide a workaround for Arduino libraries that use the PROGMEM directive, which is invalid outside of the AVR world.
Nonetheless, I’m glad we are adding functionality to the Spark Core here, and I’m glad you guys keep working on this.
In my humble opinion, both scenarios should be considered: fix the library compilation errors AND provide access to program memory.

Regarding the former, I still think my little header should be enough in many cases, but I don’t have enough C++ skills to be 100% sure.

rlogiacco, the Spark Core does not have separate “program” memory like the Arduino does. All of the STM32 flash is considered program memory, so PROGMEM on the Spark is basically a no-op: it simply isn’t needed.

I think @peekay123 is exactly right here, and the approach I have taken is to:

#define PROGMEM
#define F(X) (X)

That way the AVR stuff just gets out of the way. There is a little bit of subtle stuff happening with printing in the pull request from jjrosent that I have not needed yet, but otherwise I think this is the approach everyone is taking currently, and the one we’re asking the Spark team to take as well.
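With those two defines, a sketch like the following compiles unchanged (a made-up example, assuming the usual Wiring Serial API):

void setup() {
  Serial.begin(9600);
  // On AVR, F("...") moves the literal into flash and returns a
  // __FlashStringHelper*; with the shim it is just the plain string,
  // which Serial.println() handles natively.
  Serial.println(F("hello"));
}

void loop() {
}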

I downloaded this file from here:
https://github.com/mikalhart/galileo-Pgmspace.h/
Then I edited the include line to:
#include <pgmspace.h>
and it’s working…


Actually, if you are building for a target version >= 0.6.1, you only need to add #include "Arduino.h", which is now a stock feature for Particle devices and supports a lot more than just pgmspace :wink:

If you were only looking for pgmspace support, you haven’t had to do anything for years now, since Particle.h already provides compatibility macros. Only the really early system versions didn’t (hence the last post in this thread being from 2014).
