How-to video for compiling locally in Windows

OK, serial communications sorted. I messed around with Zadig but couldn’t get it to load a virtual COM port for me. However, when I followed @BDub’s instructions for installing the sparkcore.inf driver, it worked perfectly, happily alongside the Zadig-installed USB driver.
The process was:
  1. Press MODE and click RESET to get the flashing yellow LED.
  2. Run Zadig and install the USB driver for the Spark Core.
  3. Run BDub’s serial software so that the Core is waiting for serial input (I dare say listening mode would work too, but I’m quitting while I’m ahead) and follow his instructions - briefly:
  4. Go to Control Panel / System / Device Manager and find the Spark Core.
  5. Right-click and install the driver (in my case, deleting the existing driver first after messing with Zadig).

The Zadig driver is the one in use when the yellow LED is flashing.
The sparkcore.inf driver is the one in use when the cyan LED is breathing.

It is a mystery to me, but it works, so many thanks to @BDub and @seulater.

Now if some wise soul can get serial communications going through the NetBeans Terminal window, that would be awesome.

1 Like

Hmm, all this time I thought I had deleted the spark_core.inf driver and Zadig was handling both Core DFU and serial. I just checked - nope! Thanks for the correction :wink:

@seulater @BDub

Just wanted to thank you very much for the video. It allowed me, a guy pretty new to the microprocessor world, to actually learn how to flash the Spark Core and how to pull firmware from GitHub.

The only thing that tripped me up the first time was the path = info. I had copied it from somewhere other than your YouTube description, and that copy had a stray “-” added to part of the path, which broke it.

So when I tried to run make from the command prompt it would not work, because the path was not set up right due to the dash. I copied the path data directly from the YouTube video description and that solved the problem.
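If you hit the same thing, these two stock Windows commands (just a generic sanity check, not something from the video) will show whether a stray character broke the path:

    rem print the PATH exactly as this command prompt sees it
    echo %PATH%
    rem report the full path to make.exe, or an error if the PATH entry is broken
    where make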

So if you’re looking to load new firmware onto the Spark Core and have no idea how to do it, then this video will allow you to get the job done. You’ll learn a few things in the process, too.

Thanks for making and sharing the video, without it I don’t think I would have ever figured it out.

1 Like

I have seen on several threads that people have issues with the Windows folder “Program Files (x86)”, and the common cure is to install or copy things to different folders.
While this does work, there is an even easier workaround that keeps things in place and still saves us from problems with the weird Windows naming.

You can just create a link to the “Program Files (x86)” folder that does not contain any disliked characters, set up your PATH referring to this link instead of “Program Files (x86)”, and there you go.
Windows is still happy, anybody who might be looking for a program will find it where it is expected to live, and you don’t have to have multiple instances of the same files or programs scattered over several folders - apart from those two :rage: Windows ones.
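As a sketch (the link name C:\ProgFiles86 and the toolchain folder are only examples - adjust to your own install), from a command prompt:

    rem create a junction with a "clean" name pointing at the awkwardly named folder
    mklink /J C:\ProgFiles86 "C:\Program Files (x86)"
    rem then reference the junction in PATH instead of "Program Files (x86)"
    rem (current session only; make it permanent via the Environment Variables dialog)
    set PATH=%PATH%;C:\ProgFiles86\GNU Tools ARM Embedded\4.8 2013q4\bin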

Great vid! As I also said in the video response, YouTube creates one problem:

path = C:\SparkCorePrograms\dfu;C:\SparkCorePro­-grams\Git\bin;C:\SparkCorePrograms\GNU Tools ARM Embedded\4.8 2013q4\bin;C:\SparkCorePrograms\GnuWin32­-\bin

The stripes added by YouTube in the Git path (SparkCorePro - grams) and in the GnuWin32 - path are not visible when pasting into the Windows settings, but they do appear as a weird E in the command prompt, and they make your path useless. After deleting those Es everything worked as it should :smile: Thanks a lot :smile:
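For reference, the same line with those stray characters stripped out (assuming the folder layout from the video) should read:

    path = C:\SparkCorePrograms\dfu;C:\SparkCorePrograms\Git\bin;C:\SparkCorePrograms\GNU Tools ARM Embedded\4.8 2013q4\bin;C:\SparkCorePrograms\GnuWin32\bin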

1 Like

I created a Chocolatey package for all the dependencies: sparkcore-build.
One command-line call and everything is installed. It should speed up the setup process and remove the mistakes around environment variables.
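Assuming you already have Chocolatey itself installed, that one call is just:

    choco install sparkcore-build -y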

Wow, is local building and flashing way faster!

3 Likes

Local compilation tip …
Several code snippets I have seen on this site use types like uint8_t or uint32_t. These are defined in the header file <stdint.h> and make it quite clear how long your int is supposed to be.
If you are using local compilation you will get the compilation error message “ ‘uint32_t’ does not name a type” unless you make sure that in your code you #include “application.h” before your own includes, or explicitly #include <stdint.h> yourself.
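A minimal sketch of the two options (my_defs.h is just a made-up user header):

    // Option 1: include application.h before your own headers;
    // it pulls in the fixed-width integer types for you
    #include "application.h"
    #include "my_defs.h"      // hypothetical header that uses uint32_t

    // Option 2: include stdint.h explicitly yourself
    #include <stdint.h>

    uint32_t tickCount = 0;   // 'uint32_t' now names a type in a local build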

1 Like

seulater… thank you… for days now I have been occupied with learning NetBeans. I have lots of things operating well, have been able to create separate projects, and have some projects with V3 Spark firmware and some with old versions to test backwards.
I have a new-user question about application.cpp… I want to be able to create EACH project with a UNIQUE application name. I cannot yet find how to assign names to this project file. Is this a “locked” parameter??? Can you point me to the method for defining a new name in the NetBeans build/definition files???

Very nice tut!

I have a question.
How do I create new projects in NetBeans so they use the needed firmware without having to edit it in all my existing projects?
If I copy an old project to create a new one with the same configuration, the files in both the old and the new project get updated when I edit either of them… And how many of the files actually get uploaded to the Core? Are all the files from the src folder needed in every project?

I feel dumb, but…

When I try to ‘Make’, I get an error: ‘make’ is not recognized as an internal or external command, operable program or batch file.

I’ve quadruple-checked the path and updated the path for ‘GNU Tools ARM Embedded\4.8 2014q1’ (from the previous … 4.8 2013q4).

I’m sure I’m missing something simple but cannot figure it out. Meanwhile I’m trying to install an embedded controller I built for a client, but I need to compile the files locally so that I can fix them to work without the cloud.

Please help a poor old EE without much programming experience!

Thanks,

Shep

{EDIT}:
Oh SNAP. It was the dumb hyphen in the path setting from the YouTube page.

Never Mind.

1 Like

Hi, thanks for the excellent tutorial. I now use NetBeans exclusively for Spark development because of this tutorial; it really has helped me out, so thank you.

I have a few questions though. I’d like to create my projects in my own projects folder; I don’t really feel comfortable creating files inside the core-firmware/src folder. Isn’t that where the Spark firmware lives, and as such shouldn’t I put my files somewhere else and just link to the firmware libraries I need using a makefile? For instance, being new to NetBeans and the Spark Core, it looks to me like whenever I create a project I’m pulling in everything from the Spark Core firmware - do I need to do this? Like, if I’m not using servos, could I omit that file from my project’s makefile?

I’m sorry to be ignorant, but I really like the IDE and would like to be able to use it without feeling like I’m stumbling around in the dark; I just need a little guidance on how to organize my projects from someone who knows. I would research it myself, only I think what I need to know may be a little specific to the core firmware when used in NetBeans.

Thanks again, Rick.

Wow @seulater, thanks! I’ve tried other IDEs with no luck, but your great instructions in the video worked the first time.

Can you tell me how to set up GDB in Netbeans so that I can debug via USB?

Particularly, what would the “Debug command” look like?

Thanks!

1 Like

I’m sorry, maybe it is a silly issue, but I am stuck here. Everything was going great until 10:40, building the repositories. When I type “make” nothing happens; instead it says “make” is not a command. The path is OK, and everything up to this point was perfect. Could you tell me what to do in this case, please? I would appreciate it. Regards.

Hi @afromero,

Hmm, if you’re compiling on Windows and make isn’t working, make sure you’re in a command prompt that has the build tools in your path. Try looking for a “Bash command prompt” or a “Git command prompt”, and make sure you’re in the “core-firmware/build” directory.

It’s possible you might have deleted your makefiles or something, so also try a git status to see if you’ve modified / removed those files; you can use git to bring them back safely.
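Something like this from the prompt should narrow it down quickly (the C:\Spark path is just an example - use wherever you cloned the repo):

    cd C:\Spark\core-firmware\build
    rem check that make is actually visible from this prompt
    where make
    rem check whether any makefiles have been modified or deleted
    git status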

I hope that helps!

Thanks,
David

1 Like

Hello,
I did all the steps a few months ago on my laptop and did not have any problems.
Now I am trying to set up another computer in order to develop Spark firmware from it.
I have found two problems:

  1. The dfu link does not work, so I used my old copy from my HDD.
  2. When I try to “build” (after doing the git pull) it cannot compile. I receive this message:
C:\Spark\core-firmware\build>make
Building core-common-lib
Building file: ../CC3000_Host_Driver/cc3000_common.c
Invoking: ARM GCC C Compiler
mkdir -p obj/CC3000_Host_Driver/
arm-none-eabi-gcc -g3 -gdwarf-2 -Os -mcpu=cortex-m3 -mthumb  -I../CC3000_Host_Dr
iver -I../SPARK_Firmware_Driver/inc -I../SPARK_Services/inc -I../SPARK_Services/
src/inc -I../STM32_USB-FS-Device_Driver/inc -I../STM32F10x_StdPeriph_Driver/inc
-I../CMSIS/Include -I../CMSIS/Device/ST/STM32F10x/Include -I. -ffunction-section
s -Wall -Wno-switch -fmessage-length=0 -MD -MP -MF obj/CC3000_Host_Driver/cc3000
_common.o.d -DUSE_STDPERIPH_DRIVER -DSTM32F10X_MD -DRELEASE_BUILD -c -o obj/CC30
00_Host_Driver/cc3000_common.o ../CC3000_Host_Driver/cc3000_common.c
process_begin: CreateProcess(NULL, arm-none-eabi-gcc -g3 -gdwarf-2 -Os -mcpu=cor
tex-m3 -mthumb -I../CC3000_Host_Driver -I../SPARK_Firmware_Driver/inc -I../SPARK
_Services/inc -I../SPARK_Services/src/inc -I../STM32_USB-FS-Device_Driver/inc -I
../STM32F10x_StdPeriph_Driver/inc -I../CMSIS/Include -I../CMSIS/Device/ST/STM32F
10x/Include -I. -ffunction-sections -Wall -Wno-switch -fmessage-length=0 -MD -MP
 -MF obj/CC3000_Host_Driver/cc3000_common.o.d -DUSE_STDPERIPH_DRIVER -DSTM32F10X
_MD -DRELEASE_BUILD -c -o obj/CC3000_Host_Driver/cc3000_common.o ../CC3000_Host_
Driver/cc3000_common.c, ...) failed.
make (e=2): El sistema no puede encontrar el archivo especificado. [The system cannot find the specified file.]
make[1]: *** [obj/CC3000_Host_Driver/cc3000_common.o] Error 2
make: *** [check_external_deps] Error 2

Can anybody help me? Thanks

I am unable to flash using DFU.

0x08005000 is not writeable
Could anyone give me feedback on this issue?

You are using a Core command on a Photon. Here are the full DFU commands: https://github.com/spark/docs/wiki/DFU-reference
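Roughly, the difference looks like this (USB IDs and flash addresses from memory - double-check them against that reference before flashing):

    rem Core: user firmware lives at 0x08005000
    dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D firmware.bin

    rem Photon: user firmware lives at 0x080A0000
    dfu-util -d 2b04:d006 -a 0 -s 0x080A0000:leave -D firmware.bin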

1 Like

@kennethlimcp:
Thanks a lot for the useful link, now I am able to flash using DFU :smiley:

I did this to compile locally for a Particle Photon while following the video.

My make command is make PLATFORM=photon APP=MyApp
and run command is: dfu-util -d 0x2B04:0xD006 -a 0 -s 0x80A0000:leave -D C:\Spark\firmware\build/target/user-part/platform-6-m/myapp.bin

I put my device in dfu mode with the press of a button and the function System.dfu();

I would like to know if it’s possible to speed up the make a little, and how. Right now I’m using make PLATFORM=photon APP=MyApp, but the make is losing time checking the services-dynalib and hal-dynalib and many other things that I will never change. Is it possible to build only the “user part”? I’m pretty sure the answer is easy but I don’t know how.

Thanks!

If you execute the make from the main directory instead of modules, it will only build the app part, which is significantly faster.
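For example, with the same C:\Spark\firmware checkout as above (assuming the stock firmware repo layout with modules\ and main\ as sibling folders):

    rem full system + application build (slow)
    cd C:\Spark\firmware\modules
    make PLATFORM=photon APP=MyApp

    rem user application only (much faster)
    cd C:\Spark\firmware\main
    make PLATFORM=photon APP=MyApp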