I've just started trying to flash my project locally over USB, as flashing the core with the IDE over the cloud is too time-consuming and I'm missing versioning and some advanced editing capabilities.
I have a project with a number of files, including support for nRF24L01s over SPI among other things, and it all works nicely when flashing via the cloud IDE. So I decided to see whether I could save some time by programming the core over USB instead, to let me iterate faster.
To start, I followed the guide on the core-firmware project on GitHub: cloned the repos, installed the dependencies, and ultimately built the Tinker app. This deployed nicely to my core and worked a charm.
Next I looked at a number of approaches to building my project locally, but ended up taking the simple route (or so I thought) of updating application.cpp and adding my other files to /src and /inc. The first thing I noticed is that I had to declare prototypes for the functions from my main application ".ino" file.
Is this processed by the cloud IDE compiler in a similar fashion to the Arduino IDE, or am I missing something?
After adding the prototypes I managed to get my project to compile and was able to flash the core successfully. However, when the core attempts to start up, the LED flashes green and then flashes a series of red flashes (3 fast, 3 slow, 3 fast, then about 8 slow), after which the core resets itself and the process repeats…
I then commented out the body of my loop code, replaced it with "delay(2000); Serial.print("hello");" and retried: same problem. Then, by selectively commenting out blocks of code in my setup() method, I identified that simply having "pinMode(D7, OUTPUT);" would cause this problem; remove the line and it worked fine.
Note: if I uncomment the rest of my project it flashes red again, but I hope the above will indicate to someone what I might have wrong!
Does anyone have any ideas or tips about what I'm doing wrong, or not doing?
Which local branch of the firmware are you using? If it works well with the online build, try checking out the compile-server2 branch for all 3 repos and rebuilding:
git fetch
git checkout compile-server2
Try these examples for nRF24 stuff; they should compile for you if you follow the instructions:
You are seeing the SOS panic code. The number of flashes after the SOS tells you which panic you have, and if it is 8 flashes, that means OutOfHeap. You must have dynamically allocated objects or data, and you were close to running out of memory on the web IDE but just far enough under the limit to work. In the latest master branch there must be just a bit less memory to work with, and the code runs out.
When you compile locally, there are lines in the output that allow you to calculate the amount of flash used and the statically allocated RAM. It looks like this:
arm-none-eabi-size --format=berkeley core-firmware.elf
text data bss dec hex filename
97932 3024 13044 114000 1bd50 core-firmware.elf
In this output, text + data = flash used and data + bss = RAM used. Of the 20K of RAM, 4K is preallocated, so you get 16K for data + bss.
As BDub said, you could switch back to the compile-server2 branch for your repos, but you may be up against the wall again when the current master branch moves to the compile-server2 branch.
It is also possible that the current master branch, which is like a beta testing branch for new code, has some issues right now that could be fixed in the future.
If you can share the libraries you are using, we might be able to help more.
Thanks @bko, RAM looks to be the problem when using master, as you suggest. It's currently just over 16K, and it does work with compile-server2 as @BDub suggested.
master:
arm-none-eabi-size --format=berkeley core-firmware.elf
text data bss dec hex filename
83676 2984 13840 100500 18894 core-firmware.elf
compile-server2:
arm-none-eabi-size --format=berkeley core-firmware.elf
text data bss dec hex filename
92620 2976 13756 109352 1ab28 core-firmware.elf
Looks like I'm going to have to trim the fat. I'll do some tests and make sure it also works on master...
Really great support guys, I was pulling my hair out!
Just done a round of tests with 500 bytes less RAM usage on my side, and I cannot get the master branch to work properly; it seems like there is a bug in there somewhere. I now get an SOS panic with a single flash, which seems to indicate a "HardFault" from the source…? Is there anything I can do to help debug that?
Anyhow, I'll revert to using the compile-server2 branch for now, as it seems to work fine.
You can totally set the "DEBUG_BUILD" environment variable to 'y' to get better debug output when flashing. I can't find official documentation on those parameters, so I'm guessing that's still in the queue, but there are threads about them. Right now I think this is the best documentation for that:
That looks like a very good diagnostic job, guys. It also illustrates something that is bothering me a bit: the Spark Core software is getting better and better, but it is leaving less and less space for our applications. Uncommenting the #define USE_ONLY_PANIC in the Spark_Firmware_Driver config.h buys another 16K of program space, and I can live with that, but at least for me it is RAM that is in short supply, whether for heap or global variables.
For the energy monitor I had to save my data as 8-bit arrays, losing the advantage of the 12-bit ADC, and I'm really struggling to fit in a decent vibration sample and enough space to FFT it. I can squeeze in 256 samples (= 512 numbers) as floats, but not 512, and that is by using a primitive but small-footprint FFT. I have yet to add my analysis code…
@phec
For memory/processor-intensive applications like signal processing, may I suggest using a co-processor to do the sampling/filtering, and using the Core as a communication channel to relay the information back via the cloud? I know this sounds a little counter-intuitive, but you might get the best of both worlds.
Yes, I follow. I'd just as soon not use a co-processor, as I'm also keen on the low-power/sleep capability of the Spark: it will be battery-powered in an apiary and will just wake up hourly to listen to the bees and check the conditions in the hive.
As things are, I think I can just fit things in. (I'll know for sure when I take delivery of two more Sparks, as my existing one is now permanently tied up controlling my home energy.)
Surprisingly, I can also fit a 256-point FFT onto an Arduino Uno, and because there is a direct ADC access library it can sample faster than the current Spark (Nyquist frequency of 20kHz vs about 10kHz). Of course the Arduino is several ADCs short and doesn't have WiFi, and the Spark has the potential to sample very much faster, up to around 100kHz I think.
Anyway roll on Chinese customs, let my Spark components pass.
@satishgn has been doing some work on our encryption libraries that might free up a bit more memory; it's not yet integrated into the web IDE, but I just wanted to throw it out there. There's probably still quite a bit of room for memory/flash optimization, since we have spent very little time worrying about those constraints until recently. We had to get things working first!
Of course @zach, I think it is amazing how fast you've got the enhancements to the web interface working and sorted out those early communications issues. I'm looking forward to getting a bee app running on my phone, maybe with one alarm for the hive being knocked over and another for an impending swarm. I can get all the information I need for that at quite a slow sampling rate, but the prospect of getting some ultrasonic data is interesting, as I don't think anyone has listened up there. I just had a dark moment when I had visions of the perfect web interface and no space left to write an application. But having managed to fit Fermi-surface electron-transport calculations into 3K of memory on a KDF9 (ask your grandfather), a few bees buzzing in the Spark should be easy.