Porting Spark firmware to STM32F4

I just discovered Spark today and have been planning on using a STM32F4 processor in an internet connected device for my senior design project this fall. I chose the STM32F4 due to the raw power and low cost of the STM32F4 discovery board (plus I already had one).

I was wondering how difficult it would be for me to port the Spark firmware to the STM32F4? Would all the changes take place in core-common-lib and core-firmware? I would think this would be an achievable goal being that they are both STM32 processors.


I suspect the firmware port will be very easy, assuming you have the same Wi-Fi module and SPI flash chip. Redesigning the circuit board for another MCU footprint would likely be quite a bit harder.

Go for it! :+1:

1 Like

I was hoping to hear that it should be pretty do-able. You saying it will be easy tells me that it will be “do-able with some struggling along the way”. Fine by me.

I’m less worried about the hardware design. My original plan was to design the electronics around the STM32F4, using the discovery board schematic as a reference/template. Now it looks like I’ll just be using the STM32F4-Discovery board AND the Spark Core as templates. I already ordered the SPI 2MB Flash IC, and I’ll probably pick up Adafruit’s CC3000 breakout board. I think keeping the number of variables changed to a minimum at first is the best approach.

It looks like my modifications will mostly be on stuff in the core-common-lib repo. I think my current plan of attack is

  1. Get current spark core firmware compiling successfully on my system.
  2. Pull down a copy of the core-common-lib repository.
  3. Start modifying.

Hopefully this will turn out to be a success. I could see it being useful for others besides just me, I would think.

As a side note, the Spark Core firmware has limits on things, such as the maximum of 10 cloud variables. I can obviously see where this limit is set in the source code; is this due to RAM limitations? My concern is that my project will probably end up needing more than 10, and I’m hoping the STM32F4 could raise some of these current limitations.

Thanks for the help!

Just to check in and give an update on this. I have been working on this lately. I’ve definitely been putting more hours into it since I’m back at school this past week and the Fall semester (and my senior design course) is underway.

Here are links to my git repositories where I forked the Spark repositories:

As I suspected, most of the work has been in core-common-lib so far. I have been doing my modifications in the stm32f4_port branch.

Now, once I get this nearly complete and it actually compiles, I’m not quite sure how I’m going to load it onto my STM32F4 Discovery. I assume through dfu-util? I have the SST flash chip and CC3000 Wi-Fi module that will be connected to the STM32F4 Discovery board on the pins I’ve modified the code to use in core-common-lib. Is there anything special I need to do to these two devices to make them ready? I’ll admit I haven’t done tons of research into finding answers to the above questions yet; I’m still working on porting code. But I expect the next steps after a successful compile won’t be glaringly obvious when I get there.

Please feel free to provide constructive criticism if you would like. I do not claim to be an expert on any of this, just an engineering student trying to modify some code. I’ll continue to post updates here on my progress/struggles.

Most of the ST discovery boards have an onboard ST-Link programmer, so you can use OpenOCD to load and debug your firmware during your port.

Squawk here if your google-fu is weak on this.

OpenOCD is a funny beast, but very powerful once you have learned its weird ways.
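For the common case, something like this should get you going. Treat it as a sketch: the config file name and the exact `program` syntax vary between OpenOCD versions, and `core-firmware.elf` stands in for whatever your build produces.

```shell
# Flash and verify an ELF image through the Discovery board's on-board ST-Link
openocd -f board/stm32f4discovery.cfg \
        -c "program core-firmware.elf verify reset exit"

# Or just start OpenOCD as a GDB server (port 3333 by default)
# and attach with GDB for source-level debugging:
openocd -f board/stm32f4discovery.cfg
arm-none-eabi-gdb core-firmware.elf -ex "target extended-remote :3333"
```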

Once you get as far as talking to the cloud, you’ll need to decide whether you want to run a local cloud (in which case you issue your own encryption keys and device IDs), or whether you want to try to talk to the Spark cloud. In the latter case it would have to be taught about your device ID (each core has a unique identifier that the Spark folks record during their manufacturing phase), so yours will be unknown to them.

Good luck.

1 Like

Thanks for the encouragement. There are so many utilities involved with this that I tend to get them mixed up. How does OpenOCD differ from Texane’s ST-Link utilities? I’ve been using the st-flash utility up until this point to flash my simple binaries.

So is flashing the compiled binary to the STM32’s flash the only thing I need to do? I was worried that I needed to do something to prep the external SST 2MB flash chip or the TI CC3000 Wi-Fi module as well as flashing the STM32F4.

1 Like

The existing Spark firmware stores keys and the factory reset firmware in external flash. I don’t know if there is a simple doc with the memory map of both internal and external flash. If anyone knows of one, please speak up.

I don’t believe there is any format requirement for the external flash, just the expectation that certain things reside in different places.

@AndyW, the mapping is found here: http://docs.spark.io/hardware/#spark-core-datasheet-memory-mapping

1 Like

Excellent - thanks for the pointer, @kennethlimcp.

@lanteau, let us know if you have any follow-up questions.

I actually found that memory mapping just after my replies last night, that’s definitely helpful information. I see that the bootloader resides at 0x08000000. This is where I am used to flashing my bare-metal applications to the STM32F4. So I assume this is where the MCU begins executing when it boots?

Would it be reasonable to focus on getting the bootloader onto the STM32F4 first, then I could use dfu-util to flash the firmware like a normal Spark Core? If this is a decent plan, it looks like I need to start look at https://github.com/spark/bootloader and get that running first. The README there seems to be geared towards people who built their Core from scratch, which kinda fits me.

Yes - it would be wise to start with the bootloader, since it sets up the environment that the firmware expects to inherit.

Making more progress…the bootloader doesn’t look like it needs many changes, from my first read-through. However, the bootloader gets linked against core-common-lib (this makes sense), so I’m back at making changes in core-common-lib.

One thing that I’ve been trying to figure out is how the RTC is used in the firmware. The STM32F4 RTC is quite a bit different, so I think I need a good understanding of how the Spark firmware uses it in order to make changes.

The firmware is using both the RTC_IRQn and the RTCAlarm_IRQn. RTC_IRQn on the STM32F1 is described as the “RTC Global Interrupt”. This interrupt triggers RTC_IRQHandler in stm32_it.cpp in the core-firmware. What is the purpose of this interrupt handler? How is the “RTC Global Interrupt” triggered on the STM32F1?

The STM32F4 does not have the global interrupt; it has the RTCAlarm interrupt and a wakeup interrupt. The documentation for the F1 and F4 does not seem to come right out and answer my questions (in my opinion).
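My working theory is that the F4’s periodic wakeup timer can stand in for the F1’s 1 Hz RTC tick. A quick sanity check of the reload arithmetic, assuming the usual 32.768 kHz LSE crystal and the RTCCLK/16 prescaler option (`RTC_WakeUpClock_RTCCLK_Div16` in the F4 StdPeriph library); `rtc_wakeup_reload` is my own hypothetical helper, not a library function:

```c
#include <stdint.h>

/* Compute the RTC wakeup timer reload value for a desired period,
 * assuming the wakeup timer is clocked from RTCCLK/16 and RTCCLK is
 * the 32.768 kHz LSE, giving a 2048 Hz tick. The timer fires after
 * (WUTR + 1) ticks, hence the -1. */
static uint32_t rtc_wakeup_reload(uint32_t period_ms)
{
    const uint32_t wut_clk_hz = 32768u / 16u;   /* 2048 Hz */
    return (period_ms * wut_clk_hz) / 1000u - 1u;
}
```

On the F4 side I would feed this into `RTC_SetWakeUpCounter()` and enable `RTC_IT_WUT`, and let that interrupt handler take over the timekeeping job that `RTC_IRQHandler` does on the F1.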

1 Like

@satishgn may be able to provide some insight here. Satish, any experience with these different styles of RTCs? Do you know off the top of your head what @lanteau might have to modify in the core-common-lib?

1 Like

Hey @lanteau, after going through this thread, got a couple of suggestions for you.
ST provides a separate firmware library for STM32F4 devices, so you need to replace/update the following in core-common-lib after downloading the F4-specific drivers “STM32F4-Discovery_FW_Vx.x.x” from the ST website (stm32f4discovery_fw.zip):

  1. STM32F10x_StdPeriph_Driver => STM32F4xx_StdPeriph_Driver
  2. STM32_USB-FS-Device_Driver => STM32_USB_Device_Library
  3. SPARK_Firmware_Driver => there’s some work to do here, e.g. enable the STM32F4XX define, use the stm32f4xx.h header files instead of the F1 ones, and update device drivers (RCC, DMA, etc.) to make them F4 compatible.

There is a migration compatibility guide over here: http://www.st.com/st-web-ui/static/active/en/resource/technical/document/application_note/DM00024853.pdf
Check the following section: “STM32 peripheral compatibility analysis F1 versus F4 series”

As far as the RTC peripheral is concerned, the RTC on the STM32F4 is a new peripheral providing sub-second operation. The pinout is identical for the same features, but the software is incompatible, so this will involve some new work on your Discovery board. On the Core, the RTC interrupt is used for the general timekeeping used by TimeClass, whereas the RTCAlarm interrupt is used for Spark.sleep() (low-power mode wakeup).

I have a STM32F429IDISCOVERY board in my possession but don’t have time to start porting work. If you put in some daily effort, it shouldn’t take more than a week to port the entire core-firmware from F1 to F4.

All the Best!


@satishgn Thanks for the reply. I’ve already done 90% of the porting work in SPARK_Firmware_Driver. As you said, most of it was switching the header files over to the STM32F4 drivers and then working through the differences in GPIO/RCC/DMA, etc. All of my work so far is in the stm32f4_port branch of my forked repositories on GitHub; I provided the links above.

The differences in the RTCs have been a sticking point so far, just because I need to really understand how the Spark Core uses its RTC in order to replicate the functionality on the completely different STM32F4 RTC peripheral.

I also think USB is going to take a lot of work. At first glance it seems like I am going to have to go through all the DFU bootloader code to make it work with the STM32F4’s different USB device. This seems a lot more complicated than much of the porting work so far. Maybe I’m exaggerating, however. I should know more tonight when I start digging into it more. I at least got core-common-lib successfully building with the STM32F4 device libraries now…that’s a milestone. Stay tuned.

1 Like

More progress…I basically removed all the existing Spark Core USB driver code and pulled in the code from “STM32F105/7, STM32F2 and STM32F4 USB on-the-go Host and device library (UM1021)”. They have a USB DFU example that was extremely similar to the Spark Core bootloader so I was able to get it working based on that example.

Since the STM32F4’s FLASH management is different than the F1 (it goes by sectors, not pages), I reworked the memory mapping a bit to line up with the STM32F4’s sectors. My memory map for the F4 is as follows:
0x08000000 Bootloader (32K max - Sectors 0 and 1)
0x08008000 System Flags (16K max - Sector 2)
0x0800C000 Firmware (976K max - Sectors 3-11)
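Since erase granularity on the F4 is per sector rather than the F1’s uniform pages, I wrote a little helper to map an address to its sector when reworking the F1 page-erase code. This is a sketch for the 1MB F405/407 parts (4x16K, 1x64K, 7x128K sectors):

```c
#include <stdint.h>

/* Map an internal-flash address to its STM32F40x sector index.
 * Returns -1 for addresses outside the 1MB flash. Note how this
 * lines up with my memory map: the bootloader spans sectors 0-1,
 * system flags sit in sector 2, firmware starts at sector 3. */
static int flash_sector_of(uint32_t addr)
{
    uint32_t off = addr - 0x08000000u;
    if (addr < 0x08000000u || off >= 0x100000u)
        return -1;
    if (off < 0x10000u)                         /* 16 KB sectors 0-3 */
        return (int)(off / 0x4000u);
    if (off < 0x20000u)                         /* 64 KB sector 4 */
        return 4;
    return (int)(5 + (off - 0x20000u) / 0x20000u); /* 128 KB sectors 5-11 */
}
```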

I was able to create a separate “Blinky” binary and flashed it to 0x0800C000. After doing things like relocating the vector table and updating the linker script, I was able to force the bootloader to load my Blinky app. If I let the bootloader enter DFU mode, it DOES show up as a device in my “dmesg” output. I haven’t done much testing with dfu-util with it yet.
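When I do get around to dfu-util, I expect the workflow to look roughly like this. Treat it as a sketch: 1d50:607f is the stock Spark Core bootloader’s VID:PID, 0x0800C000 is the firmware address from my memory map above, and `blinky.bin` is my placeholder app.

```shell
# List DFU-capable devices to confirm the bootloader enumerates
dfu-util -l

# Download an application image to the firmware region, then jump to it
# (":leave" is the DfuSe modifier that exits DFU mode after the download)
dfu-util -d 1d50:607f -a 0 -s 0x0800C000:leave -D blinky.bin
```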

I did some commenting out of the Spark RGB LED code since the STM32F4 Discovery doesn’t have an RGB LED. I changed the code to use the onboard LEDs for indications. I’ll probably end up with an RGB LED in my final design; I see it’s kind of handy.

I need to verify that my STM32F4 Discovery is working properly with the external flash chip over SPI. That is one of my next tasks. Then I need to work on porting the main firmware over. Luckily, I think the further I get away from the hardware in the code, the less modifications there will need to be. Stay tuned for updates.

My bootloader repo: https://github.com/lanteau/bootloader


A little more progress. As far as I can tell, the SPI communication with the SST25VF flash chip is working now. The SPI Init function for it needed some updates for the changes in the STM32F4 Std Periph library.

I was banging my head against the wall for a long time over why my SPI wasn’t working when the bootloader tried to perform a JEDEC ID read from the flash chip. The MISO line had a bit of a waveform, but only around 1.7V (a far cry from Vdd - 0.2V). About 1 out of every 10 tries it would work, however. I finally decided to open the Spark Core schematic and noticed the Core connects the Write Protect (WP#) and Hold (HOLD#) pins to Vdd. After doing the same on my breadboard I was able to crank my SPI back up to the 10-20MHz range, and the JEDEC ID read worked every time. I used dfu-util to flash the cloud public key to the flash chip and watched the SPI on my oscilloscope. I saw tons of activity; the initial bytes showed a write command with the correct address (0x01000), so I assume the write was performed successfully.
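For anyone following along, here’s how I sanity-check the JEDEC response in code. This assumes the 16Mbit SST25VF016B, whose JEDEC ID bytes are BF 25 41; adjust the expected bytes if your chip differs.

```c
#include <stdint.h>
#include <stdbool.h>

/* Validate the 3-byte response to the JEDEC Read-ID command (0x9F),
 * assuming an SST25VF016B: manufacturer 0xBF (SST), device type 0x25,
 * capacity 0x41 (16 Mbit). A garbage or all-0xFF response points at a
 * bus problem, like the floating WP#/HOLD# pins I just chased down. */
static bool sst25_id_ok(const uint8_t id[3])
{
    return id[0] == 0xBF && id[1] == 0x25 && id[2] == 0x41;
}
```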

One problem I am noticing, however, is with jumping to my temporary “blinky” app I have placed at 0x0800C000 (my CORE_FW_ADDRESS). The bootloader will jump there and the blinky app executes, but after around 20 seconds, every time, the blinking “circle” stops, the red LED begins blinking fast, and 10 seconds later the STM32 appears to reset. After it resets, when it tries to dereference 0x0800C000 the result is 0xFFFFFFFF. If I’m correct, 0x0800C000 should be the _estack entry in the vector table from my blinky app. It’s like it’s getting corrupted/erased. Any idea what could be happening here? Thanks for the help.
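In the meantime, I figure the bootloader should sanity-check the app before jumping, which would at least catch the 0xFFFFFFFF case gracefully. A sketch, with the SRAM bounds assuming the F407’s 128K of main SRAM at 0x20000000:

```c
#include <stdint.h>
#include <stdbool.h>

/* The first word of a Cortex-M vector table is the initial stack
 * pointer, which should land inside SRAM. Erased flash reads back
 * 0xFFFFFFFF, so a wiped app fails this check instead of the MCU
 * jumping off into the weeds. */
static bool app_looks_valid(uint32_t initial_sp)
{
    return initial_sp >= 0x20000000u && initial_sp <= 0x20020000u;
}
```

In the bootloader I would call this on `*(uint32_t *)CORE_FW_ADDRESS` and fall back to DFU mode when it fails.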

1 Like

Solved the above problem. I was not resetting the watchdog timer in my “blinky” app. The watchdog would reset the processor, and it would try to copy the firmware from the “backup” location. Since I have not placed anything in the backup location on the SST25 chip, this would copy blank data over my blinky app. :blush:

I have now placed variations of my blinky binary in both the “Factory Reset Firmware Location” and “BackUp Firmware Location” and have verified that on the first watchdog reset, the backup firmware is successfully copied to the on-chip flash. On the second watchdog reset, the factory reset firmware is copied to the on-chip flash. Finally, the third watchdog reset erased the on-chip flash and went into DFU mode. I’m quite pleased with these results :smiley:

I think I’m finally ready to get the “core-firmware” running on it. Hopefully it shouldn’t require as many modifications as the lower level code did. I guess I could see the CC3000 module being a source of some headaches now. Stay tuned…

1 Like

Quite a bit of progress. I’ve hacked out a ton of the Spark wiring code from core-firmware to get the basics running first. I’ll worry about adding back the stuff I pulled out later.

After quite a bit of struggling, I have my “STM32F4 Spark” connected to my Wi-Fi AP and trying to connect to my personal “spark-server”. How do I set the ID of my created device? My spark-server is saying:

Connection from:, connId: 3
Expected to find public key for core ffffffffffffffffffffffff

So clearly the ID is getting messed up. I see the ID1 defines in the code, but the code tries to do a 12-byte memcpy of that into device_id, which doesn’t seem to make sense to me. I believe this is all @zachary 's code; hopefully he can shed some light for me. My fast-blinking cyan light is the light at the end of the tunnel for me! Thanks!
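Thinking out loud: ffffffffffffffffffffffff is exactly 12 bytes of 0xFF rendered as hex, so the 12-byte memcpy does line up with the STM32’s 96-bit (12-byte) unique device ID, and mine just isn’t being read. A sketch of the formatting side, with the F4’s unique-ID address noted as my assumption from the datasheet:

```c
#include <stdio.h>

/* Render a 12-byte device ID as the 24-character lowercase hex string
 * the spark-server logs. `out` must have room for 25 bytes (including
 * the NUL terminator that the final sprintf writes). */
static void device_id_to_hex(const unsigned char id[12], char out[25])
{
    for (int i = 0; i < 12; i++)
        sprintf(&out[i * 2], "%02x", id[i]);
}

/* On the STM32F4 the 96-bit unique ID registers start at 0x1FFF7A10
 * (versus 0x1FFFF7E8 on the F1), so filling the ID would look like:
 *   memcpy(id, (const void *)0x1FFF7A10, 12);
 */
```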

1 Like

You might be able to arbitrarily define an ID for now, or look at how to read the device ID, which is the unique ID of the STM32F4.

1 Like