What’s the easiest way/command to allow me to do a
git pull to get the latest firmware from the
core-firmware repo, without overwriting my own
application.cpp with the tinker app each time?
@dermotos I just have a command line script that copies my app.cpp into core-firmware/src as application.cpp and then runs `make`.
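For reference, a minimal sketch of that kind of wrapper script; the paths are placeholders, not the actual layout:

```sh
#!/bin/sh
# Copy the app into the firmware tree under the name the build expects, then build.
cp "$HOME/my-app/app.cpp" "$HOME/Spark/core-firmware/src/application.cpp"
make -C "$HOME/Spark/core-firmware/build"   # assumes make is run from the firmware's build directory
```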
I think you can have your own git repo, and set our repo as an upstream?
```
git remote add upstream email@example.com:spark/core-firmware.git
git fetch origin -v; git fetch upstream -v; git merge upstream/compile-server2
```

(Note: I found that command here: http://gitready.com/intermediate/2009/02/12/easily-fetching-upstream-changes.html)
I posted about this here: https://community.spark.io/t/when-to-clone-or-update-local-copies-of-spark-firmware/2314/2
This is probably the best option, once I have multiple projects setup. I’ll look into it, thanks!
The alias mentioned here is very helpful too. Just throw this in your `.gitconfig` and you're all set:

```
[alias]
    pu = !"git fetch origin -v; git fetch upstream -v; git merge upstream/master"
```

`git pu` will grab all of the latest changes from both remotes, and then merge in the commits from upstream.
- Line indentation: tabs or spaces? If spaces, how many?
- Submit compiled binaries or not? If not, how do we easily exclude them from the pull request so that we can at least compile and test ourselves prior to submitting?
- I’ve seen mention of running `make test` for changes to core-common-lib. Can you elaborate on what you would like us to do when submitting changes for this library?
- Is it preferred to create changes starting from the latest master branch?
- Any advice for merging conflicts? Recently the merge conflicts were so twisted I had to just bail on my changes, delete my branch locally and remotely, fork core-firmware again, create a new branch, and splice my changes in again. A real pain to say the least! And `git mergetool` is not even available to me. Which one should I get?
Great questions. Thank you for asking. Here are my recommendations:
Use spaces, not tabs, for indentation. How many spaces? Stay consistent with the surrounding code. Usually it’s 4, sometimes 2. Consistency and readability are more important than any dogmatic code style. Do not mix “just whitespace” changes in a pull request with significant code changes. Too much noise makes code review a huge pain.
Go ahead and submit compiled binaries. I understand removing them just for the PR is a pain. We’ll always recompile in testing.
core-communication-lib (not the common lib or any other C/C++ repo) has tests using UnitTest++. If you add a new function or fix a bug, create at least one test case, either in one of the existing files like TestSparkProtocol.cpp, or in a new file named something like TestMyNewFeature.cpp or TestCrazyBugX.cpp (see the sketch after the list below). Ideally, you would:
- write the test case first, before any code
- run the tests (`make test` in the root of the communications lib), see the compiler fail
- then write the declarations in a header file that satisfy the compiler
- run the tests to see the linker fail
- write a bare function that does nothing
- run the tests to see the linker satisfied but the actual test case fail
- THEN write the minimal code that satisfies the test
- rinse, repeat
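To make the first step concrete, a new test file could look roughly like the sketch below. The file name, suite name, and `my_new_feature` function are made up for illustration; only the UnitTest++ `SUITE`/`TEST`/`CHECK_EQUAL` macros come from the framework the repo actually uses:

```cpp
// tests/TestMyNewFeature.cpp -- hypothetical file name
#include "UnitTest++.h"

// Invented function under test; in a real change it would be declared
// in one of the library's headers.
extern int my_new_feature(int input);

SUITE(MyNewFeature)
{
  TEST(ReturnsZeroForZeroInput)
  {
    // Written first, so it fails (or fails to compile) until the
    // feature is actually implemented.
    CHECK_EQUAL(0, my_new_feature(0));
  }
}
```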
Base changes off the latest master branch!
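Assuming the `upstream` remote from earlier in this thread, that could look something like this (the branch name is just an example):

```
git fetch upstream
git checkout -b my-feature upstream/master
```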
Merging… is always fun. I don’t have any tips. I just do it by hand, very carefully.
I come back here from time to time to get this upstream command I keep forgetting: `git remote add upstream xxx`. It shouldn’t be that hard to remember, but I’m not using it very often… anyhoo, I thought I would look into these unit tests again and I’m getting an error. Running `make test` on the latest master, I get the following:
```
D:\Spark\core-communication-lib>make test
src/spark_protocol.cpp: In member function 'bool SparkProtocol::add_event_handler(const char*, EventHandler)':
src/spark_protocol.cpp:565:67: error: 'strnlen' was not declared in this scope
     const size_t FILTER_LEN = strnlen(event_name, MAX_FILTER_LEN);
                                                                   ^
src/spark_protocol.cpp: In member function 'bool SparkProtocol::handle_received_message()':
src/spark_protocol.cpp:1088:89: error: 'strnlen' was not declared in this scope
       const size_t filter_length = strnlen(event_handlers[i].filter, MAX_FILTER_LENGTH);
                                                                                         ^
make: *** [src/spark_protocol.o] Error 1
```
`#include <string.h>` is declared at the top of the file… the argument types seem legit. Any chance this is just some weird Windows makefile problem I’m having?
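For what it’s worth, `strnlen` is a POSIX addition rather than part of ISO C/C++, so some toolchains don’t declare it in `<string.h>` under strict-ANSI settings. If that turns out to be the cause, one possible local stopgap (purely a sketch, not something in the repo) is a bounded-length helper:

```cpp
#include <stddef.h>

// Minimal bounded-length string measure, equivalent in spirit to strnlen();
// the name is made up to avoid colliding with any real declaration.
static size_t bounded_strlen(const char *s, size_t maxlen)
{
    size_t len = 0;
    while (len < maxlen && s[len] != '\0')
        ++len;
    return len;
}
```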
Hmm, we did add those newlib nano stubs, so maybe it’s a GCC version thing? Have you tried upgrading to 4.8? - https://launchpad.net/gcc-arm-embedded/+download
4.8 2014q1 already; could upgrade to q2 I suppose. What happens on your Macs when you run `make test`? Anything like what I’m seeing?
I still get the expected output, like this:
```
running unit tests...
tests/TestSparkProtocol.cpp:205: error: Failure in EventLoopRespondsToUpdateBeginWithACK: false
tests/TestSparkProtocol.cpp:211: error: Failure in EventLoopPreparesForUpdateUponUpdateBegin: false
tests/TestSparkProtocol.cpp:216: error: Failure in EventLoopRespondsToUpdateBeginWithUpdateReady: false
tests/TestSparkProtocol.cpp:221: error: Failure in EventLoopRespondsToChunkWithACK: false
tests/TestSparkProtocol.cpp:226: error: Failure in EventLoopRespondsToChunkWithChunkReceivedOKIfCRCMatches: false
tests/TestSparkProtocol.cpp:231: error: Failure in EventLoopRespondsToChunkWithChunkReceivedBADOnCRCMismatch: false
tests/TestSparkProtocol.cpp:237: error: Failure in EventLoopSavesReceivedChunk: false
tests/TestSparkProtocol.cpp:242: error: Failure in EventLoopRespondsToUpdateDoneWithACK: false
tests/TestSparkProtocol.cpp:248: error: Failure in EventLoopFinishesFirmwareUpdateOnUpdateDone: false
tests/TestSparkProtocol.cpp:253: error: Failure in EventLoopSendsChunkMissedOnTimeout: false
FAILURE: 10 out of 134 tests failed (10 failures).
Test time: 0.11 seconds.
make: *** [test] Error 10
```
Those 10 failures are just stubs for tests that haven’t been written yet.
Also, try running:

```
make clean testclean sslclean test
```
Thanks for the feedback @zachary. I still get the same error with this new command set… shakes fist at Windows
This is the kind of thing a nice Spark VirtualBox Toolchain installation would hopefully fix… I’m not too concerned this doesn’t work at the moment, so please don’t worry much more about it. In the future I would like to do more Unit Testing and thought this would be a good example.
```
D:\Spark\core-communication-lib>make clean testclean sslclean test
make: [clean] Error 1 (ignored)
make -C tests/UnitTest++ clean
make: Entering directory `D:/Spark/core-communication-lib/tests/UnitTest++'
make: [clean] Error 1 (ignored)
make: Leaving directory `D:/Spark/core-communication-lib/tests/UnitTest++'
make -C lib/tropicssl/library clean
make: Entering directory `D:/Spark/core-communication-lib/lib/tropicssl/library'
rm -f *.o libtropicssl.*
make: Leaving directory `D:/Spark/core-communication-lib/lib/tropicssl/library'
src/spark_protocol.cpp: In member function 'bool SparkProtocol::add_event_handler(const char*, EventHandler)':
src/spark_protocol.cpp:565:67: error: 'strnlen' was not declared in this scope
     const size_t FILTER_LEN = strnlen(event_name, MAX_FILTER_LEN);
                                                                   ^
src/spark_protocol.cpp: In member function 'bool SparkProtocol::handle_received_message()':
src/spark_protocol.cpp:1088:89: error: 'strnlen' was not declared in this scope
       const size_t filter_length = strnlen(event_handlers[i].filter, MAX_FILTER_LENGTH);
                                                                                         ^
make: *** [src/spark_protocol.o] Error 1
```