A hard question for the hard-core compiler experts

I’m trying to get the core firmware to build under two IDEs: CooCox CoIDE and Netbeans. If the solution hits you right off, please let me know; otherwise don’t waste any time on this.

I’m running into a similar problem in both IDEs. Here is one error:

When compiling cc3000_spi.c I get an error on line 256 indicating that DMA_IT_TC has not been defined. Netbeans gives a warning, keeps going, and completes the build. CoIDE gives an error and stops.

DMA_IT_TC is defined in file stm32f10x_dma.h.

I do NOT get any errors in either IDE indicating that they can’t find an include file, and I have checked and triple-checked to make sure all the include paths are there and correct. I’ve even checked them against the -I paths you get when running from the command prompt (where no errors occur).

It acts like stm32f10x_dma.h is not in the include tree at all for file cc3000_spi.c.

Lastly, the .bin file created by the faulting Netbeans build is identical to the one created by the command-prompt build (in other words, presumably correct). The symptoms contradict each other, but that’s the way it is.

Netbeans simply says: “Unable to resolve identifier DMA_IT_TC”.

CoIDE spews out the following:

[cc] arm-none-eabi-gcc -mcpu=cortex-m3 -mthumb -Wall -ffunction-sections -g -O0 -c -DSTM32F103CB -DSTM32F10X_MD -DSUPPORT_CPLUSPLUS -IC:\sparkbensrest\core-common-lib\stm32f10x_stdperiph_driver\inc -IC:\SparkBensREST -IC:\sparkbensrest\core-common-lib\cmsis\device\st\stm32f10x\include -IC:\CooCox\CoIDE\workspace\SparkBenRest1 -IC:\sparkbensrest\core-common-lib\stm32_usb-fs-device_driver\inc -IC:\sparkbensrest\core-common-lib\cc3000_host_driver -IC:\sparkbensrest\core-common-lib\spark_firmware_driver\inc -IC:\sparkbensrest\core-common-lib\cmsis\include -IC:\SparkBensREST\core-common-lib C:\SparkBensREST\core-common-lib\SPARK_Firmware_Driver\src\cc3000_spi.c C:\SparkBensREST\core-common-lib\SPARK_Firmware_Driver\src\hw_config.c C:\SparkBensREST\core-common-lib\SPARK_Firmware_Driver\src\sst25vf_spi.c
       [cc] In file included from C:\sparkbensrest\core-common-lib\spark_firmware_driver\inc/hw_config.h:33:0,
       [cc]                  from C:\sparkbensrest\core-common-lib\spark_firmware_driver\inc/cc3000_spi.h:33,
       [cc]                  from C:\SparkBensREST\core-common-lib\SPARK_Firmware_Driver\src\cc3000_spi.c:34:
       [cc] C:\sparkbensrest\core-common-lib\spark_firmware_driver\inc/config.h:12:2: warning: #warning "Defaulting to Release Build" [-Wcpp]
       [cc]  #warning  "Defaulting to Release Build"
       [cc]   ^
       [cc] C:\SparkBensREST\core-common-lib\SPARK_Firmware_Driver\src\cc3000_spi.c: In function ‘SpiIO’:
       [cc] C:\SparkBensREST\core-common-lib\SPARK_Firmware_Driver\src\cc3000_spi.c:256:2: warning: implicit declaration of function ‘DMA_ITConfig’ [-Wimplicit-function-declaration]
       [cc]   DMA_ITConfig(CC3000_SPI_TX_DMA_CHANNEL, DMA_IT_TC, ENABLE);
       [cc]   ^
       [cc] C:\SparkBensREST\core-common-lib\SPARK_Firmware_Driver\src\cc3000_spi.c:256:42: error: ‘DMA_IT_TC’ undeclared (first use in this function)
       [cc]   DMA_ITConfig(CC3000_SPI_TX_DMA_CHANNEL, DMA_IT_TC, ENABLE);
       [cc]                                           ^

In the platform_config.h I have (circa Jan 2014), all of the std_periph_lib devices that are used have includes. In the version I downloaded a week ago, they have all been removed. Seeing as stm32f10x_conf.h is now in the inc directory and has all those std_periph_lib includes in it, I’m guessing the intent was to replace the old arrangement with it. However, for the life of me, I can’t find an #include for it anywhere.

Why don’t you try inserting #include "stm32f10x_conf.h" after line 41 (#include "stm32f10x.h") and see what happens?
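
In other words, something like this (assuming line 41 is the existing include of stm32f10x.h; adjust to wherever it sits in your copy):

    #include "stm32f10x.h"        /* existing include at line 41 */
    #include "stm32f10x_conf.h"   /* suggested addition: pulls in the std_periph_lib headers */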

I think CoIDE will turn out to be the tool you are looking for; the ST-Link works really well, as you stated in your other post. You just need to persevere with it.

Yes, that worked for that particular instance. However, as indicated in the original post, there are lots of other instances where things that should be defined aren’t. And it’s not that the compiler can’t find the .h files; they simply aren’t being asked for.

I don’t understand why it apparently works using make without errors.

OK, there are some defines (CFLAGS) specified in the makefile(s). I eventually found the #include for stm32f10x_conf.h: it’s in stm32f10x.h, bounded by an #ifdef on USE_STDPERIPH_DRIVER. You need to add USE_STDPERIPH_DRIVER to Configuration -> Compile -> Defined Symbols. It would also be a good idea to check all the CFLAGS in the makefiles and see whether there are more you need to add to Defined Symbols. This is probably where your problems lie.
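
For reference, the guard near the bottom of stm32f10x.h looks roughly like this (paraphrased from the ST CMSIS device header), which is why none of the peripheral-library declarations are visible unless USE_STDPERIPH_DRIVER is defined:

    /* At the end of stm32f10x.h (paraphrased) */
    #ifdef USE_STDPERIPH_DRIVER
      #include "stm32f10x_conf.h"   /* pulls in stm32f10x_dma.h and the rest of std_periph_lib */
    #endif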

Regarding CoIDE:
In addition to the USE_STDPERIPH_DRIVER define, you also need to set the optimization level to -Os. Doing these two things allows all of the .c files to compile with only the normal warnings.

BTW, one of the ugly things is that it took about 2 hours just to manually input the whole file tree, files, etc.  If anyone wants to try this, PM me, and I’ll send you an xml file that might help.

I’m now trying to link. I can get CoIDE to use the Spark linker script by unchecking “Use Memory Layout from Memory Window” and selecting the linker script with the file selector to the right of the “Scatter File” box. The stock linker script causes a CoIDE internal string error; I had to change every occurrence of “FLASH” to “rom” and every occurrence of “RAM” to “ram”. I also stripped out all of the extended-memory stuff, but that may not be necessary. A sketch of the renamed regions follows.
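
For anyone trying to reproduce this, a minimal sketch of what the renamed MEMORY block might look like (the sizes assume the full 128K flash / 20K RAM of an STM32F103CB with no bootloader offset; check the origins and lengths in your actual script):

    /* Renamed regions that CoIDE accepts (illustrative only) */
    MEMORY
    {
      rom (rx)  : ORIGIN = 0x08000000, LENGTH = 128K   /* was "FLASH" */
      ram (rwx) : ORIGIN = 0x20000000, LENGTH = 20K    /* was "RAM" */
    }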

I checked the “Discard unused sections” and “Don’t use the standard system startup files” boxes.

“Library” needs to be set to “Retarget”.

Misc Controls keeps its regular setting of “-lstdc++;”.

It now compiles and links.

Seems like congratulations are in order :smile:

I wish it were so. But thanks anyway!

Maybe bad news on the CoIDE front. I managed to compile and link with no errors, but the resulting .bin wasn’t even close to what the “make” method generates, and it did not run on my Spark. The object files weren’t nearly the same. This results from CoIDE not actually using the regular makefiles but instead using its own internal make arrangement.

I looked at the command string the compiler was getting in each instance, and they were radically different. Unfortunately, I see no way to change that with CoIDE, as things are pretty inflexible. If someone could figure out how to change the compiler command string in CoIDE, maybe it would still have a chance.

Netbeans handles the whole thing differently: it actually uses the Spark makefiles and therefore generates (from what I can tell) identical .bin output files. And it does program the Spark via USB easily. Just no debugging. Getting JTAG to work on that would be nice if we could do it.

I’ve started a thread asking Spark what they use internally. I’d like to see what happens there before doing much more on this.

Edit: I was just poking around the CooCox site. Strange that CoIDE is based on Eclipse, and yet they say their IDE is closed source. The EPL is a long read, but it appears to be more of a GPL kind of thing. I wrote CooCox asking for the source, but I don’t think there’s much of a chance.

I had success importing the Spark makefile project into an older version of Eclipse, but the latest versions just say it’s not a valid C/C++ project, and then, because it’s not a valid project, Eclipse won’t let you edit the properties to try to make it valid. Catch-22!

Regardless, no version of Eclipse supports ST-Link other than via OpenOCD, which seems like a new career path all on its own. I believe Netbeans uses the USB port to load binaries, so you are not going to debug there. That led me to CoIDE.

You have a .bin from a clean compile and link. You have a debugger. So what if it’s different, or CoIDE won’t run Spark’s makefiles? How far does the .bin get before it “doesn’t run”? Can you get through hardware initialization? WLAN initialization? Where is it failing? I would expect differences in binary size just from different libraries being linked in, etc. For instance, when I switched CoIDE to use the C nano library, the data/bss size dropped by more than 1K.

I think CoIDE was designed to remove the complexity of Eclipse, and in that regard I like it very much; it works very well if you create a new project from scratch using their components, etc. Even if you can get Eclipse to work with the Spark makefiles, you are probably going to have to get a J-Link JTAG adapter as well. I recommend persevering with CoIDE.