The functions like sFLASH_ReadBuffer and sFLASH_WriteBuffer for working with the external flash chip are defined in the core-common-lib, here:
Additionally, you may in some cases have to erase a sector using sFLASH_EraseSector(uint32_t SectorAddr) before writing to it. I encountered this when working on the thermostat. Most of the rest of the following commit is incidental—the thing that made it work was the addition of sFLASH_EraseSector before writing:
The size of the erasable sectors is 4kB = 0x1000. So e.g., if you erase the sector at 0x80000 like I do in the thermostat code, that erases everything from 0x80000 to 0x80FFF. The next sector begins at 0x81000.
. soldered a standard 8-pin header to an SD card adapter sold with a microSD card; luckily the pin spacing of the SD adapter allows it
. inserted a 4GB microSD card in the adapter
. connected it via SPI to the Core with basic breadboard wires
. reused/adapted/optimized sample SPI SD card code from the web
After some effort in debugging/optimizing (no DMA yet), I can now successfully read (at 3 Mbit/s) and write (at 1.5 Mbit/s) the SD card in 512-byte sectors, so my Core has 4GB of storage for very little extra hardware
Because we have multiple sources of non-volatile storage (STM32 internal flash, SST25VF016B flash chip, CC3000 nvmem), we’ve debated exactly what the EEPROM library should do. We’re leaning toward using the CC3000 because it’s the smallest space with the highest endurance.
Don’t know when we’ll get to it, but, of course, we accept pull requests. Here’s what EEPROM should do:
So we have to erase a 4kB block even if we want to change just 1 byte in that sector? Is there a more efficient way of changing 1 byte in the sector than loading it into memory, changing it, then writing the whole sector back?
My work on the spark eeprom/flash library is taking a pause while I’m busy with other commitments but I have coded a system where the number of erases is significantly less than the number of writes (say 8x less). Also the erase is only needed if a write attempts to change a 0 back to a 1. So writing 0xFF, then 0xF0, then 0x40 then 0x00 to the same location would not require any erases since no 0 bits are turned back into a 1.
Other alternatives include opening a stream so that the flash library can appropriately co-ordinate erases so that client code doesn’t have to worry about that.
Finally, appropriate wear-levelling algorithms can help ensure that the erases are distributed throughout the flash rather than having one 4k block being continually erased for updates to a single address.
@zachary
I have used the functions sFLASH_ReadBuffer and sFLASH_WriteBuffer to work with external flash memory. The problem is the code works only when the number of bytes to write is less than 300, which is weird, and the thing is I have an array of 20KB+ that I want to store in the flash memory.
The procedure I am following to read/write the external flash is:
Setup()
Erase a block, which erases 4096 bytes
sFLASH_WriteBuffer with 300 bytes
Loop()
Read and Serial print 4 bytes from some address to check if the data is written correctly.
Writing 300 bytes to flash memory works and I read back the correct values, but when I increase it to 400 or 1000 bytes, the Spark Core blinks blue and nothing gets printed on the serial connection.
Do you have any idea on how I can solve this problem?
Sorry if this is a silly question, but I’m wondering if this could just be a problem of allocating too much RAM at once? Are you allocating a solid 400-byte chunk?
I have the array initialized with 1000 bytes and I am just loading 400 of it. It compiles and loads to the Spark Core correctly, but the core flashes blue and there is no output on the serial connection.
I saw in one of the documentation pages that flashing blue indicates a cloud connection failure. Maybe I am interfering with the CC3000 WiFi module.
If the problem is allocating too much RAM at once, is there a way around it so I can load a big array into the external flash memory?
Hmm, is it just a problem of writing 400 bytes at once; can you write more in smaller chunks? Any chance you could share your code so we could try to find the issue from there?
In addition to the RAM limits that @Dave mentions, I think you could be having other troubles too. If you are just trying to get read-only data into the external flash so your program can use it, I would try dfu-util to just load it over the USB in bootloader mode. The address map is in the hardware section of the doc.
If not, read on.
First off, try making your data const to get it out of RAM and into program flash:
const uint16_t array[] = { ... };
Since the SPI bus used by the external flash is shared with the TI CC3000 WiFi module, I think you could be having problems related to interrupts and IO for the WiFi interfering with your flash operations. Another user reported that he was able to make his external flash work by turning off the CC3000 interrupts. Note that this SPI bus is not the same as the user SPI on the Spark core pins.
Try turning off interrupts for the CC3000 around every call that uses the SPI bus to work with the external flash. Note that I would not recommend just turning these off for a long time since you will get data overruns from the cloud part of the firmware. Here’s an example for the writes:
Yes, the array elements are just constants that the program will use, so I think it is better to load them into external flash using dfu-util, as I won’t have to recompile the program when I want to change the values of the array later.
Can you give me the command I should use to load the arrays into external memory with dfu-util? How do I store the array elements in the file? Do I have to change the uint16_t array to a byte array?
p.s. I tried declaring the arrays as const but the program flash (internal flash) was not big enough to accommodate them.
Another consideration: since you block potentially indefinitely waiting on Serial.available(), both in setup() and on every single call to loop(), you are probably being disconnected from the Cloud due to timeout.
Try returning early from your loop so the Core can talk to the Cloud and call you again in a few milliseconds:
void loop() {
    uint8_t _read[4];

    /***** Instead of spin-wait, be nice, share. :) *****/
    if (Serial.available() == 0)
        return;

    Serial.print("Value In array = ...");
    // ...
}
@Dave wrote up some nice instructions for using dfu-util to load new keys into the external flash so perhaps he can comment too. The basic command below uses your 0x81000 address from the above code:
dfu-util -d 1d50:607f -a 1 -s 0x00081000 -v -D myfile.bin
Bad things can happen to your core if you mistype here, so be careful!
I am not sure what advice to give on getting your data into a binary file (myfile.bin); there are lots of ways to do that. It sounds like you have a C array initializer like
uint16_t array[] = {0x1234, 0xabcd, ...};
for the 16-bit data. The code you wrote above expects LSByte first, so you might have to reshape your array for the binary file so that it reads 0x34, 0x12, 0xcd, 0xab, … for the sample array above.