I have been using Time.now() to provide a creation timestamp that goes into events published by devices, and I have a cloud function that lets the time zone and Daylight Saving Time be set for a device. This has all worked well; however, recently at a demonstration of some of the devices in Amsterdam (time zone +1.00) the timestamp was incorrect. I looked back at Time.local(), thinking from the name that it might help. However, it doesn't - it just doubles the time zone and DST settings, for example: 13.58 local (TZ 0.00, DST off), 15.58 local (TZ 1.00, DST off), 17.58 local (TZ 1.00, DST on). The documentation, to its credit, does warn against using local(). So what was the thinking behind local(), and why does Time.format() mess up the value provided by Time.local() and not Time.now()?
It does have its use for anybody who is using the raw UNIX epoch count without employing any of the higher-level methods of the Time class, e.g.

int currentSecondOfDay = Time.local() % 86400;

Since the time zone and DST offsets are already included in Time.local(), this will render the correct number for the current second with regard to your local time.
This, for example, is how I usually calculate the sleep duration relative to some absolute wake time in the future.
Time.format() does not mess up the time, but providing the wrong timestamp to it does.
Time.format() is meant to work with UTC - irrespective of your location. So to get the correct "translation" you need to provide UTC timestamps, equivalent to Time.now(). But to get the current time you should not provide a timestamp at all, as there is a dedicated overload meant for exactly that use: Time.format(<formatSpecifier>), with no timestamp provided.
When you provide a timestamp derived from Time.local(), you only provide an anonymous number (which already includes the time zone offset) that Time.format() assumes to be UTC, and hence it hands you back the string for that UTC time.
@ScruffR, as ever, thank you for your reply. I'd like to just check I have understood your examples. For the currentSecondOfDay example: I have requested a scheduled 'on' time of 5pm (local) every day = 17*60*60 - 1 = 61199 seconds, and then I keep checking Time.local() % 86400 until it is == 61199? Why is this any better than Time.now() % 86400? Apologies, I haven't tried sample code for this.
For Time.format(), are you suggesting that Time.local() can be used with it but needs a specific format specifier, or that Time.now() need not be provided as a parameter to Time.format()?
You do calculate 5:00pm as (12+5)*60*60 = 61200, but let's assume you live in a time zone 6 hours behind UTC: when it is 5pm at your location, Time.now() % 86400 will not give you 61200 but 82800, since UTC - the time your device is synced to via the cloud - is already 6 hours ahead of you.
But - as if by magic - Time.local() will provide you with 61200 (as long as you have set up your time zone and DST correctly).
I haven't either, but I think if you had, you might have found the answer without the hand-holding.
BTW, when checking for the trigger time I'd always check for >= and not for == to account for any delay in execution (e.g. when your code blocks due to connection loss and reconnection time).
Bit of a mistake with the -1 on the seconds for 5pm! I would also use >= as the check and not ==. 100% got it now - it is a pity that these topics do not get included in the documentation. I usually start with that and then search for relevant topics. I haven't seen your example of using Time.local() anywhere else!