millis() and micros() lose time when interrupts are disabled

Based on this thread and this implementation, I don't think micros() is any less prone to the missed-ticks issue than millis() is, except at the sub-millisecond level. Since I'm trying to time 1000/60 = 16.666… ms intervals, I see the same lost time with micros() as I do with millis(). Granted, this isn't a bug; I just wanted to confirm that it's the behavior I should expect. Since the timing of the one-wire protocol is deterministic, I can easily compensate for the lost time, along the lines of the sketch below.
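
For reference, a minimal sketch of that compensation idea, assuming an AVR-based Arduino where the Timer0 overflow interrupt behind millis()/micros() is missed while interrupts are off. The `ONEWIRE_BLOCKING_US` constant and the commented-out one-wire transaction are placeholders I've made up; the constant would be whatever the interrupt-disabled part of the transaction is measured or calculated to take.

```cpp
// Assumption: millis() stops advancing and micros() can drop whole timer
// overflows (~1 ms each) while interrupts are disabled, so any time spent
// with interrupts off is added back manually.

const unsigned long FRAME_US = 1000000UL / 60;  // ~16666 us per 60 Hz frame
const unsigned long ONEWIRE_BLOCKING_US = 750;  // assumed: measured duration of the interrupt-disabled transaction

unsigned long nextFrame;
unsigned long lostTime = 0;  // time the timers never saw, accumulated by us

void setup() {
  nextFrame = micros() + FRAME_US;
}

void loop() {
  // Hypothetical bit-banged one-wire transaction; micros() misses ticks here.
  noInterrupts();
  // ... interrupt-sensitive one-wire read would go here ...
  interrupts();
  lostTime += ONEWIRE_BLOCKING_US;  // add back the known, deterministic blocking time

  // Rollover-safe comparison against micros() plus the time we know was lost.
  if ((long)(micros() + lostTime - nextFrame) >= 0) {
    nextFrame += FRAME_US;
    // ... do the 60 Hz work here ...
  }
}
```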

Thanks!