Local development and gdb debugging with NetBeans, a step by step guide

I had to set up my environment for Spark development and debugging at home (already had it set up at work), so I figured I would keep a log here and end up with a step by step guide.

(optional): Install SourceTree
I use SourceTree for version control. It’s a git GUI. In this guide, we’ll use it to check out the spark-firmware for custom development.

You can download it here. It’s free, it’s awesome.

Fork the spark firmware repository
If you are going to develop your own app and possibly contribute to the spark firmware, you probably want to fork the spark repository. You can easily do this on Github. You can then commit your code to your own fork, and send a pull request if you have coded something that is worth pushing upstream.

Clone your fork to your computer
In SourceTree, click the ‘Clone/New’ button.
Click the globe on the right. It will probably ask you for your GitHub credentials.

When you click OK, it will show you a list of all your repositories on GitHub. Pick the Spark firmware of which you are the owner and click OK.

Under advanced settings, set the branch to check out to feature/hal
This branch is easiest to work with for now and will later replace the master branch.
Optionally you can set the clone depth to prevent cloning a lot of old commits. Set it to 100 to only get the last 100 for example.
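If you ever want the same shallow checkout from the command line, git’s --depth option does exactly this. Here is a quick local demonstration with a throwaway repository (all names below are made up for the demo):

```shell
# Build a throwaway repo with 3 commits, then shallow-clone it with --depth 1.
tmp=$(mktemp -d) && cd "$tmp"
git init -q full && cd full
git config user.email you@example.com && git config user.name you
for i in 1 2 3; do echo "$i" > file.txt; git add file.txt; git commit -qm "commit $i"; done
cd ..

# file:// forces a real transport clone, so --depth is honored:
git clone -q --depth 1 "file://$tmp/full" shallow
git -C shallow rev-list --count HEAD   # prints 1: only the newest commit came over
```

For the firmware itself that would be git clone --depth 100 --branch feature/hal followed by the URL of your fork.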

Set the destination path, check ‘Bookmark this repository’ and click ‘Clone’.

You now have a local copy of the Spark firmware, with your own GitHub account set up as remote.

Add the official spark repo as a remote
In the top right in SourceTree, click the settings button. It will open this window:

Add the official firmware repo as a remote with the name ‘spark’.

Get the latest updates from Spark
You can now click the ‘Fetch’ button to get the latest commits from the official repository. The fetch button does not change your code, it only updates your view of the remotes. If you click the ‘Pull’ button it will also try to merge all changes upstream. I often prefer to use the fetch button and right click any commit I would like pull into my local branch.
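For reference, those SourceTree buttons map onto plain git commands. Sketched below in a throwaway repository (the firmware URL is the one this guide targets and may since have moved):

```shell
# Throwaway repo so the command below has something to run against:
tmp=$(mktemp -d) && cd "$tmp" && git init -q

# Add the official firmware repository as a second remote named 'spark':
git remote add spark https://github.com/spark/firmware.git
git remote   # prints: spark

# 'git fetch spark' would then update your view of that remote without
# touching your code; 'git merge spark/feature/hal' is the separate second
# step that actually brings its changes into your local branch.
```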

After the fetch, I can see that a lot happened in the official repo. I can see different branches and new commits on the ‘Log/History’ tab (ctrl-2).

I can also click any 2 commits to see the code changes between them.

So I am behind the official repo and now I would like to merge the latest code changes.

I am on branch feature/hal, my own remote (origin) is at the same commit (see screenshot above) and the official repo is a bit further (spark/feature/hal).

I can now right-click that commit and choose ‘merge’.

If you start developing your own code, it is better to create a branch for that. Just click the branch button and choose a new name. You can still pull in all the changes from the feature/hal branch to stay up to date.

Alright, so now that we have version control ready to go, we can continue with the development environment. I will do that in my next post.


I will be repeating a lot of stuff from this video, but I find a text based guide easier to follow myself. Please refer to the video for details. It was my main source for the bunch of links below. The video is also made for the master branch. The feature/hal branch is easier to use (and the future), so my instructions are based on that.

Download and install the GNU Tools for ARM Embedded Processors
Download here. I chose the windows installer. During install, choose English. It will be easier to find help if you get compile errors.

On the last screen, check ‘add path to environment variable’

Install MinGW
Download here: http://sourceforge.net/projects/mingw/files/
GnuWin32 make didn’t work well for me.

Install DFU-Util
Go to http://dfu-util.gnumonks.org/releases/
Pick the latest release binaries.7z
Extract the contents of the win32-mingw32 directory to somewhere you can find it.

Install Zadig
Go to http://zadig.akeo.ie/
Install Zadig
Connect your Spark Core via USB and run Zadig to replace the driver.

Add DFU-util and the ARM toolchain to your PATH variable
I prefer to do this with Rapid Environment Editor
You will have to start it as administrator to change system variables. If that is not an option for you, add to your user PATH variable instead. (Restart as administrator is available from the File menu.)
In Rapid Environment Editor, click the PATH variable and choose ‘add value’, then ‘add directory’. You will want to add 3 directories:

  • The directory where you placed dfu-util.exe.
  • The directory where you find arm-none-eabi-g++.exe (GNU Tools ARM Embedded\4.8 2014q3\bin)
  • The directory where you find make.exe from MinGW msys (C:\MinGW\msys\1.0\bin)

May I recommend Everything to find files on your computer blazingly fast?

Test your PATH setting by starting a NEW command prompt and typing in:

arm-none-eabi-gcc
make
dfu-util

These should all be found and give some output.
If you want to check that the executable from the right path is used, use ‘where make’, which is the Windows 7 equivalent of Linux ‘which’.
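From an msys or Cygwin shell, a small loop like this (my own sketch, not part of any toolchain) checks all three tools at once; command -v is the POSIX cousin of where/which:

```shell
# Report whether each required build tool is reachable through PATH.
for tool in dfu-util arm-none-eabi-g++ make; do
  if command -v "$tool" > /dev/null 2>&1; then
    echo "$tool: found at $(command -v "$tool")"
  else
    echo "$tool: MISSING from PATH"
  fi
done
```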

Recompile the firmware for debugging
Go to where you cloned the spark firmware and start a command prompt there (from explorer, shift + right click --> open command window here).

Run make:

make

If make finishes without errors, great, the code compiles.

If you start make from the directory ./main, you can flash it to your core immediately after building with:

make program-dfu

Don’t forget to put your core into DFU mode by holding the mode button and pressing reset, until the LED flashes yellow.

Hardware Debugging with ST-Link V2
For the following steps, you will need a hardware debugger. If you do not have one, you can still build locally and upload via DFU.

Compile the firmware again with debugging enabled by adding USE_SWD_JTAG=y to the make command:

I have sent in a pull request for SWD debugging, without JTAG
If you do not want to use JTAG, because you use the JTAG pins for something else, you can use single wire debug, which only uses 1 data and 1 clock wire (D6 and D7).

If you pull in the changes from my feature/SWD branch, you can use the command line option USE_SWD=y to use SWD without enabling JTAG. st-util will work just the same.

More info on making a custom SWD shield here:

Once you have compiled your firmware with debugging enabled, use dfu-util to upload it to your core.

make clean all USE_SWD_JTAG=y

Then upload the binary to your core with dfu-util:

dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D build/target/main/prod-0/main.bin
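The -s 0x08005000 address is where user firmware starts on the Core: STM32 internal flash begins at 0x08000000, and (as I read the Core’s memory map) the 0x5000 bytes below that address are reserved for the bootloader. A one-line sanity check of the arithmetic:

```shell
FLASH_BASE=0x08000000   # start of STM32 internal flash
RESERVED=0x5000         # bytes before user firmware (inferred from the -s address above)
printf '0x%08X\n' $(( FLASH_BASE + RESERVED ))   # prints 0x08005000
```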

Connect to your core with the ST-LINK
First install the device drivers by installing STM32 ST-LINK utility
Then start ST-LINK utility and connect to target. If it doesn’t work, try to connect under reset.
If you can connect under reset, but not when it is running, you have not properly compiled the firmware with debugging enabled.

Start a GDB server with st-util
To connect your IDE (NetBeans) to the debugger, you need to run a GDB server. We will use texane’s st-util for that.
You can find a precompiled version here: http://www.emb4fun.de/archive/stlink/index.html

Alternatively, you can build it from source to get the latest version.

So let’s test st-util and run it on port 9025:

st-util -p 9025

Make sure you have closed STM32 ST-LINK utility first.
st-util should find your core and report a Chip ID and Core ID.

Final step is to set up NetBeans to compile and debug, which will follow in my next post.


Install NetBeans
Okay, now it is finally time for NetBeans.
Download it here: https://netbeans.org/downloads/

Choose the C/C++ version. You might have to install the Java SDK.

Install Cygwin
NetBeans needs Cygwin to compile your code. Install the 64 bit version if you are on a 64 bit machine.
Download it here: https://cygwin.com/install.html
Just install the default packages (you don’t need to select them, just click next twice).

Install NetBeans Plugins
Start NetBeans, go to Tools -> Plugins -> Available Plugins

Install the Gdbserver plugin, the C/C++ plugin should be already installed.

Set up your ARM toolchain in NetBeans
Tools -> Options -> Build Tools -> C/C++
Add a new tool collection and set up the following tools:

And set up the build tools, in particular be sure to set the debugger to arm-none-eabi-gdb from your arm gcc install:

Create a new project from existing sources
File --> new project --> from existing sources

Select your firmware directory, leave configuration at automatic, choose your ARM tool collection

MinGW make didn’t work with an absolute path.
This bug can be worked around by setting the make command to just make.exe, which works because it is on the PATH.

Now the build works :smiley:

Programming the core via the ST-LINK V2
I have not found a way to program the core and attach the debugger in one go, but I have found a good workaround:

We will set up the run button to program the core and we will use the ‘attach debugger’ button to start debugging afterwards.

But first of all, we have to make sure that the new firmware we are going to flash to the core also has debugging enabled. Add USE_SWD_JTAG=y to the make command.

If you forget this, you will have to reflash with dfu-util (see the command earlier in this guide).

Make a new file with a few gdb commands to upload code to the core:

# connect remote gdbserver on port 9025 (started with st-util -p 9025)
target remote localhost:9025

# reset core and hold in reset
monitor reset halt

# program core with supplied .elf file
load

# jump to spark core reset handler
jump Reset_Handler

# remove all breakpoints
delete

# continue execution (TODO: continue in background so we can disconnect while running continues)
continue

# disconnect from remote. This pauses execution. Detach is not supported.
disconnect


So add that to a text file and save the file as gdbprogram.txt somewhere outside of your repository.
This will program the core via the ST-Link V2 and start running the program. I have not found a way to detach from the debugger after programming without pausing execution, so you’ll have to exit with CTRL-C in the output window in NetBeans.

Now right click your project and open properties.
On the ‘run’ tab, set up your run command as (adapt for your paths) and check ‘build first’.

Run command:

arm-none-eabi-gdb main.elf -x c:\repos\gdbprogram.txt

Run directory (path to main.elf):



Now hit the run button. It should compile the code and send it to the spark via the GDB server via st-util:

As you see, st-util exited after programming. That’s where this .bat file comes in handy:


:loop
st-util.exe -p 9025
goto loop

It will keep restarting st-util until stopped with CTRL-C.
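The same retry idea in a Unix-flavored shell (msys/Cygwin), wrapped in a function; keep_running and its two arguments are my own names, not anything st-util provides:

```shell
# Re-run a command every time it exits; max_tries=0 means retry forever,
# mirroring the :loop / goto loop batch file above.
keep_running() {
  cmd=$1
  max_tries=$2
  n=0
  while [ "$max_tries" -eq 0 ] || [ "$n" -lt "$max_tries" ]; do
    $cmd
    n=$((n + 1))
  done
}

# e.g. keep_running "st-util -p 9025" 0   (stop with CTRL-C)
```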

Finally: attaching the debugger

For the gdbserver plugin to work, we will have to set the build result under the make settings to point to our .elf file.
We will test debugging with the Tinker app. Change your make command and build result to compile with the tinker app:

And your run settings:

Please someone find a way to do this automatically!

Now compile and flash again with the run button, and then attach the debugger:

Voila! Programming and debugging in Netbeans with the click of a single button!

BTW, blink is a bad test program for this. The little blue LED next to the USB connector is connected to one of the JTAG pins and will not blink.

Make sure you have set a break point in your program before running, because the pause button does not work. Once you have hit a break point, you can step through the program or run from break point to break point.

# set limits of the hardware
set remote hardware-watchpoint-limit 6
set remote hardware-breakpoint-limit 4

# jump to spark core reset handler
jump Reset_Handler


Save it as gdbattach.txt and load it in the debug settings:

This will reset your program when you attach the debugger.

And as you can see below, the debugger stopped at the break point :smile:

Issues I have encountered
Once your core has breakpoints set, it will not run freely, even when no debugger is attached.

Make sure your spark core is set up on your WiFi first. I brought a spark home from work and it hung on flashing green. It was a bitch to get it back on the network with the custom debugging builds running and the debugger attached.


@Elco - this is AWESOME!!! Do you mind if we set up a link to this in our Help Center? It would be great to have it there, and I know the community needs this guide. Let me know!



Of course, no problem. I’ll try to keep it up to date based on the feedback I get.


I have found better command files for programming and attaching.
The trick was to add a jump to the spark core reset handler.

The new gdbprogram.txt will now program the core and immediately boot the spark core (green flashing, connect to WiFi, start running program).

# connect remote gdbserver on port 9025 (started with st-util -p 9025)
target remote localhost:9025

# reset core and hold in reset
monitor reset halt

# program core with supplied .elf file
load

# jump to spark core reset handler
jump Reset_Handler

# remove all breakpoints
delete

# continue execution (TODO: continue in background so we can disconnect while running continues)
continue

# disconnect from remote. This pauses execution. Detach is not supported.
disconnect


The gdbattach.txt has changed to:

# set limits of the hardware
set remote hardware-watchpoint-limit 6
set remote hardware-breakpoint-limit 4

# jump to spark core reset handler
jump Reset_Handler


This is all very quick, single push of play builds the code, programs it and starts execution. No need to press buttons for the bootloader. You can optionally attach the debugger afterwards for those harder to debug issues.

Also make sure you use commit d6df0c94f4c629e28102e8b2470c0d9ff2c230c3 of feature/hal
Older commits had a linker bug that caused the core to get stuck in button init and never boot to user code. The RGB LED stayed off and the core seemed unresponsive. Mdma identified the bug for me and fixed it last night.
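If you are not sure whether your checkout already contains a given fix, git merge-base --is-ancestor answers that directly. Demonstrated on a throwaway repository, since the real hash only exists in the firmware’s history:

```shell
# Throwaway repo standing in for the firmware checkout:
tmp=$(mktemp -d) && cd "$tmp" && git init -q
git config user.email you@example.com && git config user.name you

git commit -q --allow-empty -m "the fix"      # stand-in for commit d6df0c9...
fix=$(git rev-parse HEAD)
git commit -q --allow-empty -m "later work"

# Exits 0 (success) when the given commit is part of the current history:
git merge-base --is-ancestor "$fix" HEAD && echo "fix is in your checkout"
```

On the real repo you would run git merge-base --is-ancestor d6df0c94f4c629e28102e8b2470c0d9ff2c230c3 HEAD inside your firmware clone.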


I finally made some progress with my code, and decided to shoot a quick demo video while debugging:


I keep clicking the :heart: and it’s only registering +1!! Arghh…

This is SUPER helpful @Elco thank you for putting this rockin’ tutorial and video together!


Thanks for the love :smile:

I hope together we can find the ultimate gdbinit files and debug methods.

One quick tip: when debugging and accessing on-chip variables, the watch window only reads the first element of (multidimensional) arrays. You can use this syntax in watches to show the first 5 elements of char * buf:

*buf@5


Another quick tip:

Use SYSTEM_MODE(SEMI_AUTOMATIC); to skip connecting to cloud at start up and boot to user code immediately.
This will skip the green blinking part, so testing is much faster.

Great tutorial!

I have run into issues where I get ST-LINK to program the Spark:

2014-11-16T19:56:49 INFO src/stlink-common.c: Starting verification of write complete
2014-11-16T19:56:49 INFO src/stlink-common.c: Flash written and verified! jolly good!

But ST-LINK won’t quit, it just hangs.

The gdb command window shows:

c:\Spark\firmware>arm-none-eabi-gdb .\build\target\main\prod-0\main.elf -x c:\SparkCorePrograms\gdbprogram.txt

GNU gdb (GNU Tools for ARM Embedded Processors)
Copyright (C) 2013 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.  Type "show copying"
and "show warranty" for details.
This GDB was configured as "--host=i686-w64-mingw32 --target=arm-none-eabi".
For bug reporting instructions, please see:
Reading symbols from c:\Spark\firmware\build\target\main\prod-0\main.elf...done.
0x0800010c in ?? ()
Loading section .isr_vector, size 0x10c lma 0x8005000
Loading section .text, size 0x11d3c lma 0x8005110
Loading section .init_array, size 0x30 lma 0x8016e4c
Loading section .data, size 0x40c lma 0x8016e7c
Start address 0x8005110, load size 74372
Transfer rate: 4 KB/sec, 9296 bytes/write.

And it also hangs. The ST programmer just flashes alternating green and red. The Spark Core’s main LED is off. I have
synced to the latest changes via “git pull” on the feature/hal branch - I cannot see the commit d6df0c94f4c629e28102e8b2470c0d9ff2c230c3 in feature/hal that you mention, however, so maybe I’m hitting that bug?

When I press CTRL-C in gdb, this appears:

Program received signal SIGTRAP, Trace/breakpoint trap.
WWDG_IRQHandler () at ../build/arm/startup/startup_stm32f10x_md.S:115
115             b       Infinite_Loop


The latest hal commit should be good now, mdma fixed the bug.

I have also noticed that the application won’t start sometimes with the debugger attached. Can you try attaching the debugger?

My gdbprogram looks like this:

# connect remote gdbserver on port 9025 (started with st-util -p 9025)
target remote localhost:9025

# reset core and hold in reset
monitor reset halt

# program core with supplied .elf file
load

# jump to spark core reset handler
jump Reset_Handler

# remove all breakpoints
delete

# continue execution (TODO: continue in background so we can disconnect while running continues)
continue

# disconnect from remote. This pauses execution. Detach is not supported.
disconnect


But I have to force quit it with CTRL-C; it hangs on ‘continue’.

So make sure you do not have old run sessions open when starting a new one (run or debug).

gdbattach wouldn’t work; only after I added:

target remote localhost:9025

would it connect, but then it hangs. It hangs in continue for me as well. After I press CTRL-C twice I get the gdb shell and can issue commands (here: where):

Start address 0x8005110, load size 88692
Transfer rate: 4 KB/sec, 9854 bytes/write.

Program received signal SIGTRAP, Trace/breakpoint trap.
WWDG_IRQHandler () at ../build/arm/startup/startup_stm32f10x_md.S:115
115             b       Infinite_Loop

Program received signal SIGTRAP, Trace/breakpoint trap.
WWDG_IRQHandler () at ../build/arm/startup/startup_stm32f10x_md.S:115
115             b       Infinite_Loop
(gdb) where
#0  WWDG_IRQHandler () at ../build/arm/startup/startup_stm32f10x_md.S:115
#1  <signal handler called>
#2  0x08010e2e in GPIO_Init (GPIOx=0x0, GPIO_InitStruct=GPIO_InitStruct@entry=0x20004fdc)
    at MCU/STM32F1xx/STM32_StdPeriph_Driver/src/stm32f10x_gpio.c:258
#3  0x0800fad0 in LED_Init (Led=Led@entry=LED1) at MCU/STM32F1xx/SPARK_Firmware_Driver/src/hw_config.c:393
#4  0x080101ae in Set_System () at MCU/STM32F1xx/SPARK_Firmware_Driver/src/hw_config.c:159
#5  0x0800ea3a in HAL_Core_Config () at src/core-v1/core_hal.c:81
#6  0x0800843e in LoopFillZerobss () at ../build/arm/startup/spark_init.S:3
#7  0x0800843e in LoopFillZerobss () at ../build/arm/startup/spark_init.S:3
Backtrace stopped: previous frame identical to this frame (corrupt stack?)

I set a breakpoint on setup function as follows:

break setup

But then it hangs again.

What do you mean by hanging?
When it is free running in continue, it will not respond to pause. It will seem unresponsive until you hit a breakpoint again.
The fact that pause doesn’t work sucks, but I have not found a way to fix it yet.

When debugging you just have to make sure you will hit a breakpoint again, or step through the program.

I’m able to issue the following commands:

target remote localhost:9025
monitor reset halt

then I try to do:
jump Reset_Handler

and then it never returns to the gdb shell again. That’s what I incorrectly described as “hang”.

(gdb) jump Reset_Handler
Continuing at 0x80081ca.

I can interrupt by pressing CTRL+C:

Program received signal SIGTRAP, Trace/breakpoint trap.
WWDG_IRQHandler () at ../build/arm/startup/startup_stm32f10x_md.S:115
115             b       Infinite_Loop  

Stopping in this infinite loop. Then I issue jump Reset_Handler again and only then do I hit the app_setup_and_loop breakpoint.

  • So that’s one odd thing: why doesn’t it end up in that breakpoint on the first jump Reset_Handler/continue combination?

It is not reliable either. Sometimes I get stuck in app_setup_and_loop () and won’t get past this check:

if(!SPARK_FLASH_UPDATE && !HAL_watchdog_reset_flagged())

Where HAL_watchdog_reset_flagged() returns 1. But if I modify the value this function returns:

set variable IWDG_SYSTEM_RESET = 0

I can step past this into setup routine.

  • What is HAL_watchdog_reset_flagged() about and why does it sometimes return 1 and sometimes 0?

When in the setup code, I can run all the lines until I hit:


This seems to hang gdb; the timer seems not to be updated in HAL_Timer_Get_Milli_Seconds(void). But again, I can get past this by issuing:

set variable system_1ms_tick = 10

  • Why is the variable system_1ms_tick not being updated? Are interrupts disabled or something when the Spark Core runs under gdb by default?

Guys @mdma, @AndyW, @wtfuzz or @BDub - could you shed some light? I’m trying OpenOCD as well, same issues here (apart from the HAL_Timer_Get_Milli_Seconds(void) one):

I think I have a fix! The feature/hal branch seems problematic; switching to the master branch and modifying the firmware to disable JTAG and enable SWD makes it work! Hooray. The COM port is also working, which I had issues with before.

I’m using OpenOCD 0.8.0 instead of st-link util (way faster upload) and updated my ST-LINK debugger to version V2.J23.S4 STM32+STM8 Debugger. This is the configuration file I’m using to start OpenOCD:

source [find interface/stlink-v2.cfg]
source [find target/stm32f1x_stlink.cfg]
reset_config srst_only srst_nogate

I copy this into the /scripts/board directory as spark_core.cfg. Compiled OpenOCD binaries can be found here:


I’m running x64 Windows 8.1, so I run the 64-bit version of openocd:

openocd-x64-0.8.0 -f board/spark_core.cfg

(Just start this instead of st-util)

Maybe I could commit the change? How do I do that, provided that I have the change committed in my local branch?

Sorry to hear you’re having trouble with feature/hal branch - it is a development branch so there may be unresolved issues, but I and a few other developers are using it successfully. If there’s any way we can dig in and find out more about the cause, I’d be happy to help.

To make your change to the master branch available to others, you submit a pull request. Please see https://help.github.com/articles/creating-a-pull-request/

One thing that’s not immediately clear is that the PR should be in its own branch, which you then push to your GitHub repo. From there, you can create the pull request.

The first time you make a PR it can seem a bit daunting, but is much easier the second time. Also tools like SourceTree make the process much easier.

You don’t write to the spark repository directly so there’s no chance of doing anything harmful.

Best of luck, I hope that helps! :slight_smile:

Thanks for the AWESOME tutorial!

I followed it and had no real problems. But now I have run into something that I have been unable to figure out with the linking of the project. Where are the libraries specified that need to be linked into the final binary output? I’m trying to utilize functions in <time.h> but it’s failing at the link stage because it can’t find the functions I’ve used.

Being completely unfamiliar with how the Spark project is laid out and how the ARM GCC toolset works I was unable to find where this is done.

If you use the HAL branch, it is all taken care of by the makefile. I do not have any experience with compiling the master branch, so I hope someone else can help you out.


Can you share your schematic for your debugging board? I finally have my ST-Link/V2 but I don’t know what to connect where. Thanks!

I tried downloading the schematic from the Spark Git repo but the version of Eagle that I have complains that the schematic is not a valid Eagle file.