In preparation for the arrival of my first Photon and Internet Button, I’ve been reading example code, as I’ve never played around with hardware before. Two quick questions:
I see lots of examples where loop() has no artificial delays introduced. If I were writing software for my laptop, I would avoid spin loops like that to minimize CPU utilization and prolong battery life. Is this a consideration in the microcontroller world as well? I realize that loop() is called by the system-level firmware, so perhaps there is a built-in delay in there?
I was looking at sample code for button presses on the Internet Button. Again, this is going to show my hardware ignorance, but it seems like the pattern here is to read from an input that has been pulled up to see when it goes low. So the main loop() is doing an if statement with a digitalRead() on the pin. My question is whether there can be a race condition where a button is pushed while the code is not executing that statement. What happens if the system loop takes a bit longer as it’s checking for new instructions from the cloud? What happens if I introduce an artificial delay (1 s) per my question above?
Thanks for entertaining these basic questions from a newbie. I can’t wait for my order to arrive to start testing and experimenting!