The Spark Core Firmware

@BDub This is a known bug with make-3.81 in Cygwin when generating dependencies. It's been around for at least 6 years, and there seems to be no interest in getting it fixed upstream.

If you build in the MinGW shell (for example, using git-bash) or build in Linux, then you won't encounter problems.

3 Likes

I tried git-bash and it starts to compile, but it fails after a bit because it can't find a file.

Linux is cool and all, but not that practical for me.

I guess I’ll just work with my batch files and deal with longer build times.

@bdub - that is ugly. I have a fresh win8 virtual machine online, I will be trying the dev tools out over the next day or so. If I can help you with this, I will!

Ken

1 Like

Which version of make is getting called? Are you sure that GNU make is first on your path? That looks like make from Visual Studio, which is not going to be compatible with a GNU makefile.

Anyone able to make it work on the Windows platform?
I tried setting it up and it always gives “make: *** [check_external_deps] Error 2” when recompiling.

Yep mine is working on Win 7 32-bit, just not the most ideal way.

Did you install http://gnuwin32.sourceforge.net/packages/make.htm ?

Did you clone all three repos?

Try building from windows command prompt and also git-bash. It only works in cmd prompt for me, but YMMV.

Hi, I have two questions.

The upload of new firmware succeeds, but is there perhaps some problem, since the following message appears?


File downloaded successfully
Transitioning to dfuMANIFEST state
Error during download get_status

And does flashing through the cloud simultaneously update the core_firmware of the Spark Core, or should I make it manually when necessary?

It always says this error, but if it says ‘File downloaded successfully’ then it actually worked :smile:

If you do a flash from the cloud, it will overwrite the entire firmware file that you flashed over USB.

Hello Zach!
Thanks for the rapid response :smile:
Does this mean that there is no need to manually make an update?
Will I always get the latest firmware?

I'm just quibbling with your "latest firmware" statement. :blush: Yes, you will get the latest released firmware which I believe is the "compile-server2" branch (see below). Since much of this thread is talking about compiling your own fw (possibly from other branches), I figured I'd mention that.

GitHub has a nice interactive "graph" which shows you where the different branches are (see here). Here's a screenshot showing compile-server2, which was "released" in late December.

@dorth is correct - a new build will be uploaded to the web IDE by the end of the week. In general we’ll be trying to push new firmware to the cloud every two weeks.

3 Likes

That is awesome. It is really good to see how connected you guys are to this as a team.

I get this error when building on Ubuntu 12.04.



Building file: ../src/spark_wiring_i2c.cpp
Invoking: ARM GCC CPP Compiler
mkdir -p obj/src/
arm-none-eabi-gcc -g3 -gdwarf-2 -Os -mcpu=cortex-m3 -mthumb -I../inc -I../../core-common-lib/CMSIS/Include -I../../core-common-lib/CMSIS/Device/ST/STM32F10x/Include -I../../core-common-lib/STM32F10x_StdPeriph_Driver/inc -I../../core-common-lib/STM32_USB-FS-Device_Driver/inc -I../../core-common-lib/CC3000_Host_Driver -I../../core-common-lib/SPARK_Firmware_Driver/inc -I../../core-communication-lib/lib/tropicssl/include -I../../core-communication-lib/src -I. -ffunction-sections -Wall -fmessage-length=0 -MD -MP -MF obj/src/spark_wiring_i2c.o.d -DUSE_STDPERIPH_DRIVER -DSTM32F10X_MD -DDFU_BUILD_ENABLE -fno-exceptions -fno-rtti -c -o obj/src/spark_wiring_i2c.o ../src/spark_wiring_i2c.cpp

Building file: ../src/spark_wiring_interrupts.cpp
Invoking: ARM GCC CPP Compiler
mkdir -p obj/src/
arm-none-eabi-gcc -g3 -gdwarf-2 -Os -mcpu=cortex-m3 -mthumb -I../inc -I../../core-common-lib/CMSIS/Include -I../../core-common-lib/CMSIS/Device/ST/STM32F10x/Include -I../../core-common-lib/STM32F10x_StdPeriph_Driver/inc -I../../core-common-lib/STM32_USB-FS-Device_Driver/inc -I../../core-common-lib/CC3000_Host_Driver -I../../core-common-lib/SPARK_Firmware_Driver/inc -I../../core-communication-lib/lib/tropicssl/include -I../../core-communication-lib/src -I. -ffunction-sections -Wall -fmessage-length=0 -MD -MP -MF obj/src/spark_wiring_interrupts.o.d -DUSE_STDPERIPH_DRIVER -DSTM32F10X_MD -DDFU_BUILD_ENABLE -fno-exceptions -fno-rtti -c -o obj/src/spark_wiring_interrupts.o ../src/spark_wiring_interrupts.cpp
../src/spark_wiring_interrupts.cpp:59:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:60:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:61:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:62:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:63:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:64:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:65:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:66:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:67:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:68:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:69:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:70:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:71:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:72:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:73:7: error: expected primary-expression before ‘.’ token
../src/spark_wiring_interrupts.cpp:74:7: error: expected primary-expression before ‘.’ token
make: *** [obj/src/spark_wiring_interrupts.o] Error 1
darcy@PXE:~/spark/core-firmware/build$

From ../src/spark_wiring_interrupts.cpp

//Array to hold user ISR function pointers
static exti_channel exti_channels[] = {
{ .handler = NULL }, // EXTI0
{ .handler = NULL }, // EXTI1
{ .handler = NULL }, // EXTI2
{ .handler = NULL }, // EXTI3
{ .handler = NULL }, // EXTI4
{ .handler = NULL }, // EXTI5
{ .handler = NULL }, // EXTI6
{ .handler = NULL }, // EXTI7
{ .handler = NULL }, // EXTI8
{ .handler = NULL }, // EXTI9
{ .handler = NULL }, // EXTI10
{ .handler = NULL }, // EXTI11
{ .handler = NULL }, // EXTI12
{ .handler = NULL }, // EXTI13
{ .handler = NULL }, // EXTI14
{ .handler = NULL } // EXTI15
};

My system info

darcy@PXE:~/spark/core-firmware/build$ uname -a
Linux PXE 3.2.0-58-generic #88-Ubuntu SMP Tue Dec 3 17:40:43 UTC 2013 i686 i686 i386 GNU/Linux
darcy@PXE:~/spark/core-firmware/build$ cat /etc/issue
Ubuntu 12.04.4 LTS \n \l

Anyone know what I missed to get this error?

I had problems in Win 8.1 also but found a solution. Check out THIS TOPIC.

Hope it helps!

:smiley:

2 Likes

When I get something right on my main PC, I can’t reproduce the success with my laptop. :confused:

In this case, the original guide (in the first post) is actually the simplest method and works well. There were just a few points to pay attention to:

  • Use Git Bash instead of the Windows command prompt. It installs with Git and runs from the context menu (right-click).
  • As @Dave said, dfu-util must be 0.7, from http://dfu-util.gnumonks.org/releases/dfu-util-0.7-binaries.7z
  • Make sure everything is in the Windows PATH variable so the tools can be found and run anywhere.
  • Do not install Git or Make to Program Files. A space in certain places breaks things in a Unix-style environment.

That last part took me a whole Saturday to figure out.

I redownloaded clean Spark sources from GitHub, and it seems to build well without modifications to the makefiles. The first build on my i3 laptop is slowish, but after that I only have to rebuild the changes related to application.cpp, which is surprisingly fast compared to the time it takes the online IDE to send me new firmware.

Cool, glad you got it building again! The original makefile was OS-independent, but the much faster makefile has a few more requirements. As a Windows user myself, I’m looking forward to trying this out.

The build time on the build server is also under a second, so if you’re watching your core, by the time your core flashes magenta once, the cloud has already transferred your code, built it, and started sending chunks to your core. Most of the time is spent making sure all the packets get there safely, and making sure you have firmware to fall back on if the transfer fails. :smile:

Edit: Which is to say, we’ve been brainstorming on ways to make the OTA updates faster, and I think we have some good solutions in the pipeline to speed that up significantly.

It just occurred to me that the speed of the reflashing must be related to the speed at which the CC3000 and STM32 are working together to read bytes over TCP. And if the firmware is about 70KB or so, then the transfer speed is a lot slower than I think it could be… sounds like 1 - 2KB per second (my flashing process takes about 45 seconds), and should be way faster, at least 30KB/s or more.

This is due to the gcc version. See this post: https://community.spark.io/t/solved-firmware-compile-error/794

1 Like

A big component of the OTA delay has to do with the small receive buffer size (256 bytes, so more RAM is left available) and the positive confirmation message after every chunk is received. This means on a medium-latency network, say ~100ms ping, you have to wait an extra ((70000/256)*100)/1000 ~= 27 seconds. The trade-off is that this is very robust in a high-error environment, but slow in a clean one. My plan was to try to carefully send more than one chunk at a time in a way that doesn’t overwhelm the core. I think this is possible, but it requires some careful coding, and it needs to be backwards compatible as well. :smile:

1 Like