SQLite inside Spark?

Can we store a small amount of data inside the Spark?
Maybe SQLite?
1–2 MB of storage?
And some way to read and write the data via the REST API?

Does anyone else have this requirement?


There is no database on the Core, but you can use the EEPROM on the CC3000 to store small values like your config. If you want to store more data, you'll have to devise some logic that holds your data “in the cloud”.

For Arduino, there is an EEPROM.h library. It should be compatible with the Spark Core…

There are 3 potential locations for non-volatile storage on the Spark Core.

The STM32 has 128 KB of flash storage. This is where the firmware goes, which can vary in size a great deal, but if you manage it carefully there will still be room to store your data.

The SST25VF016B external flash module has 2 MB of storage. We’ll be using this to store some data, factory default code, a recent working version of your code to fall back to if something goes wrong, and a copy of the latest firmware update from the server, i.e., the last code you flashed to your Core. That might use about a quarter of the available space, so you will definitely be able to store data here.

Lastly, as @dominikkv says, the CC3000 has EEPROM for its own internal state management, with extra space for user applications. There are about 5 KB of user space available in the final two File IDs: NVMEM_USER_FILE_1_FILEID and NVMEM_USER_FILE_2_FILEID. Docs here: http://processors.wiki.ti.com/index.php/CC3000_EEPROM_user_interface
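For anyone curious what writing to those user files looks like: on real hardware it goes through TI's `nvmem_read`/`nvmem_write` host-driver calls with `NVMEM_USER_FILE_1_FILEID`. Here's a minimal sketch of saving a small config struct that way — the `nvmem_*` functions below are mocked with a RAM array so the layout logic is clear, and the file ID constant is my assumption (use the real define from TI's `nvmem.h` on the Core):

```cpp
#include <cstdint>
#include <cstring>

// Mocked stand-ins for TI's host-driver calls; on real hardware you'd call
// the actual nvmem_read()/nvmem_write() from the CC3000 host driver.
static uint8_t fakeEeprom[2048];            // one user file, mocked in RAM

long nvmem_write(unsigned long fileId, unsigned long len,
                 unsigned long offset, const uint8_t *buf) {
    (void)fileId;
    std::memcpy(fakeEeprom + offset, buf, len);
    return 0;                               // 0 = success
}
long nvmem_read(unsigned long fileId, unsigned long len,
                unsigned long offset, uint8_t *buf) {
    (void)fileId;
    std::memcpy(buf, fakeEeprom + offset, len);
    return 0;
}

// A small fixed-layout record that fits easily inside a user file.
struct Config {
    uint32_t magic;       // marks the file as initialized
    uint16_t version;
    int16_t  tempOffset;  // e.g. a sensor calibration value
};

// Assumed placeholder; on the Core, use NVMEM_USER_FILE_1_FILEID from nvmem.h.
const unsigned long USER_FILE_1 = 12;

bool saveConfig(const Config &c) {
    uint8_t buf[sizeof(Config)];
    std::memcpy(buf, &c, sizeof buf);
    return nvmem_write(USER_FILE_1, sizeof buf, 0, buf) == 0;
}

bool loadConfig(Config &c) {
    uint8_t buf[sizeof(Config)];
    if (nvmem_read(USER_FILE_1, sizeof buf, 0, buf) != 0) return false;
    std::memcpy(&c, buf, sizeof buf);
    return c.magic == 0xC0FFEE01;           // only trust an initialized file
}
```

The magic number check matters because fresh EEPROM contents are arbitrary; without it you'd happily load garbage as a config on first boot.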

So, to summarize your approximate available storage options:

  • CC3000 EEPROM: 5 KB
  • External Flash Chip: 1500 KB
  • STM32 Internal Flash: varies depending on firmware

NOTE: The EEPROM.h library won’t work by default because it doesn’t use the CC3000 API; however, we’ll create some kind of easy access to non-volatile storage on the Spark Core.

Let me know if you have any questions!


Thanks for clarifying this, @zachary; you saved me from digging through the code, and in the process I found something pretty sweet for viewing GitHub code in a Monokai color scheme.

Now, since the question was about SQLite, the next question is: which of these memory spaces might be good for a real-time database? That is, which one is the fastest for reading and writing bytes, and the most flexible/efficient for putting in a small amount of data at a time? I’m guessing the STM32’s RAM would be best for that… do you know how much RAM is typically left over when running a simple program?

Sure thing, @BDub. The STM32 has 20 KB of SRAM (volatile memory), the use of which, just like the firmware in flash, varies a lot depending on the specific program. We haven’t tried to ascertain how much RAM is available during execution.

As far as speed, I suspect the fastest to slowest list would be:

  1. STM32 RAM
  2. STM32 Flash
  3. External Flash Chip (SPI interface)
  4. CC3000 EEPROM (SPI interface)

As far as flexibility, they’ll probably all feel about the same to the user. We’ll make our own interface to hide some of the complications so these things are easy to use even if one doesn’t understand or want to manage the different kinds of storage.

And just in case it wasn’t perfectly clear, we won’t be supporting SQLite. We will however provide some easy-to-use abstraction for storing data in non-volatile memory.
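To give a flavor of what such an abstraction might look like — purely my speculation, not the actual Spark API — here's an EEPROM.h-style templated `get`/`put` over a byte-addressed backing store. A RAM array stands in for the storage here, but the same two-function interface could front any of the memories listed above without the caller caring which one:

```cpp
#include <cstdint>
#include <cstring>

// A RAM array stands in for the non-volatile page; real firmware would map
// this onto STM32 flash, the external flash chip, or the CC3000 EEPROM.
static uint8_t backingStore[256];

// Store any plain-old-data value at a byte offset, EEPROM.h-style.
template <typename T>
void put(int addr, const T &value) {
    std::memcpy(backingStore + addr, &value, sizeof value);
}

// Read a value of the same type back from the same offset.
template <typename T>
void get(int addr, T &value) {
    std::memcpy(&value, backingStore + addr, sizeof value);
}
```

The nice property of this shape is that the user only ever thinks in offsets and typed values; wear leveling, page erases, and which chip the bytes actually land on can all hide behind the two functions.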

Awesome, thanks for the info :wink: 20 KB of RAM is pretty nice. Even 10 KB would be sweet.

Very interesting information in here. Quick question: will you be able to store any kind of data in the cloud besides the code? If so, what will the limitations be?


Cloud storage is not in our early roadmap, but we’ll keep it in mind as a possibility!

@zachary thanks for the detailed explanation.

This point caught my interest, so I wrote a quick recursive function that allocates a 128-byte block on the stack and prints its memory address. The system reboots after the stack grows to 10,944 bytes, which I guess is a fair estimate of the SRAM available.

iteration 0, stack address 0x20004f64
iteration 1, stack address 0x20004ed4
iteration 2, stack address 0x20004e44

iteration 74, stack address 0x200025c4
iteration 75, stack address 0x20002534
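For anyone who wants to try this, here's my own reconstruction of such a probe (the original code wasn't posted, so the names are mine). It's bounded so it doesn't actually smash the stack on a desktop build; note that the gap between consecutive addresses in the output above is 0x90 = 144 bytes, i.e. the 128-byte buffer plus call-frame overhead:

```cpp
#include <cstdio>
#include <cstdint>

static uintptr_t addrs[8];   // addresses observed at each depth

// Allocates a 128-byte block per call and records its address. The depth is
// capped here; the original on the Core just recursed until the reboot.
static int probe(int iteration, int maxIterations) {
    volatile uint8_t block[128];
    block[0] = (uint8_t)iteration;            // touch it so it isn't elided
    addrs[iteration] = (uintptr_t)&block[0];
    std::printf("iteration %d, stack address %p\n",
                iteration, (const void *)&block[0]);
    int depth = 1;
    if (iteration + 1 < maxIterations)
        depth += probe(iteration + 1, maxIterations);
    return depth;  // using the result keeps each frame live (no tail call)
}
```

On most targets (including the STM32) the stack grows downward, which is why each iteration's address is lower than the last.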


Is there any sample code to store, retrieve, and delete data in memory? I would like to store sensor data (for example: date + time, temperature), read the data once every 4 hours, push it to the cloud, and then delete the read data from memory.

Also, any suggestions on a format for storing the data (tag database or operational historian)? http://en.wikipedia.org/wiki/Operational_historian
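While waiting for official sample code, here's one minimal shape this could take — an in-RAM ring buffer of timestamped readings that you fill as samples arrive and drain (push to the cloud, then discard) every 4 hours. All the names here are mine, not a Spark API, and the buffer lives in RAM; the same record layout could later be pointed at the external flash:

```cpp
#include <cstdint>

// One historian-style record: a Unix timestamp plus a temperature reading
// in hundredths of a degree (fixed point avoids storing floats).
struct Record {
    uint32_t timestamp;
    int16_t  centiDegrees;
};

// A small ring buffer of readings; oldest entries are overwritten when full.
class SensorLog {
    static const int CAPACITY = 64;
    Record buf[CAPACITY];
    int    head = 0, count = 0;
public:
    void store(uint32_t ts, int16_t temp) {
        buf[(head + count) % CAPACITY] = Record{ts, temp};
        if (count < CAPACITY) ++count;
        else head = (head + 1) % CAPACITY;   // drop the oldest record
    }
    // Copy up to max records out (e.g. every 4 hours, before a cloud push)
    // and remove them from the log; returns how many were copied.
    int drain(Record *out, int max) {
        int n = count < max ? count : max;
        for (int i = 0; i < n; ++i)
            out[i] = buf[(head + i) % CAPACITY];
        head = (head + n) % CAPACITY;
        count -= n;
        return n;
    }
    int size() const { return count; }
};
```

The drain-then-delete pattern maps directly onto the 4-hour push described above: `drain()` hands you the batch to upload, and the log is already cleared for the next window.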

Thanks for the temperature example. I am testing formulas for different sensors and will share them once validated with the Spark.

We are working on a library to write to the external flash (1.5 MB available); it should be ready in the next couple of weeks.


Thanks! With 1.5 MB we can store a lot more raw data, and it's great for logs too.

Storing some data in non-volatile space is needed to preserve state across software resets, e.g. after losing the connection to the cloud. :wink: