Local Cloud Firmware Flash

Hello Everybody!

I am working on an IoT project here at my university. Because the university internet is WPA-Enterprise, we are having difficulty connecting our Cores to the Internet. Instead, we have a router hooked up and are attempting to use a local cloud that is not connected to the internet. I followed the instructions found here:

and got everything working properly, as far as I can tell. However, now I want to prove that I can flash code from NetBeans onto the Core. I was wondering if anyone could walk me through blinking an LED by flashing the code locally. When I try to edit application.cpp as shown in the video, I'm unable to build the code successfully.

Thanks,

Keith

Compiling the code and flashing via dfu-util should work.
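For the blink test, something like this in src/application.cpp should do it (a minimal sketch; D7 drives the Core's onboard blue LED):

#include "application.h"

void setup() {
    pinMode(D7, OUTPUT);    // D7 is the onboard blue LED
}

void loop() {
    digitalWrite(D7, HIGH);
    delay(500);
    digitalWrite(D7, LOW);
    delay(500);
}

Then, assuming the stock core-firmware repo layout, build from the build directory and flash over USB with the Core in DFU mode (flashing yellow):

cd core-firmware/build
make
dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware.bin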

You can even perform OTA flash via the local :cloud:
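If your Spark-cli is pointed at the local :cloud: and the Core is breathing cyan, something like this should push the same binary over the air (the core name here is just an example):

spark flash my_core core-firmware.bin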

I replaced the application.cpp code with the one found here, which I believe is the working tinker firmware.

https://github.com/spark/firmware/blob/master/src/application.cpp

It builds fine with make, but when I compile in NetBeans I get this error:

arm-none-eabi-g++.exe: error: /cygdrive/C/Spark/core-firmware/src/application.cpp: No such file or directory
arm-none-eabi-g++.exe: fatal error: no input files
compilation terminated.

COMPILE FILE FAILED (exit value 1, total time: 55ms)

Where did you place it and which repo branch are you using?

Also, where did you run make?
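A quick sanity check from the Cygwin shell (assuming the repo lives at C:\Spark\core-firmware) would be:

ls /cygdrive/c/Spark/core-firmware/src/application.cpp

If ls can't find the file there, the compiler won't either.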

This is more of a compilation issue in NetBeans, and I don't use it, so someone else will have to jump in :wink:

I have pretty much abandoned NetBeans at this point, as it was way over my head.

New question: I want to use Spark.publish() to send data (temperature or something) from the Core to a .txt file on my local cloud. Are there any resources for using Spark.publish() without being connected to the internet? All the examples I've found require an internet connection.

If you are running a local :cloud:, no internet connection is required.

You will be able to see your Spark.publish() stream using spark subscribe in Spark-cli

You can then write a script to listen to the events you are interested in and process accordingly :wink:
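On the firmware side, publishing is the same whether the :cloud: is local or not. A minimal sketch (the event name and the sensor on A0 are just examples):

#include "application.h"

unsigned long lastPublish = 0;

void setup() {
}

void loop() {
    // publish a reading every 5 seconds
    if (millis() - lastPublish > 5000) {
        lastPublish = millis();
        char data[16];
        sprintf(data, "%d", analogRead(A0));    // e.g. a temperature sensor on A0
        Spark.publish("sensor-reading", data);
    }
}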

Also, @wgbartley has been hacking on a cool stats library, so he might be able to suggest some things!

Ok, awesome, thanks! So I've successfully been able to run the Uptime code given here:

and I've been able to use the spark subscribe command in the CLI. I can't figure out how to log this data, though. We tried something like:

C:\User\vsolar\Desktop>spark subscribe >testdata.txt

and we got our data to print into this .txt file on the desktop! Now we are looking to implement this sort of thing directly in the application.cpp code. We can't figure out the proper syntax to tell Spark.subscribe() to log the data to a specific place in C:\
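One thing worth knowing: a Spark.subscribe() handler in application.cpp runs on the Core itself, so it can't write to C:\ — the Core has no access to your PC's filesystem. File logging stays on the host side, e.g. the redirect you already used (or >> to append). On the Core, a handler looks roughly like this (the event name is just an example):

#include "application.h"

// Runs on the Core whenever a matching event arrives from the cloud
void uptimeHandler(const char *event, const char *data) {
    // react to the event here, e.g. toggle a pin or store the value
}

void setup() {
    Spark.subscribe("uptime", uptimeHandler);
}

void loop() {
}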