I’m working on my first Spark project, which involves using the Spark to “control” an analog slave clock (the type you probably had in your grade school). This clock requires the correction coil to be energized for several seconds near the end of each hour, and for another, longer period toward the end of the 5 AM and 5 PM hours.
I’ve got all that pretty much figured out; the problem I’m having is how to set the time on the Spark so that it corrects my clock properly. I’m familiar with the time functions in the documentation, but I’m not sure of the best way to make the Spark DST-aware. I’m thinking I might have to write an if-then statement into my code that changes the offset passed to Time.zone() based on the date, but that gets complicated: I’d need extra variables to determine, for example, the second Sunday of March (i.e. count Sundays in March until the count reaches 2), then adjust the Time.zone() offset accordingly. This is definitely doable, but it seems like a poor allocation of resources, and resources matter on a small piece of hardware like this.
Does anyone have savvier recommendations for switching the time zone to track DST (I should mention this is US DST)?