The writer makes an implicit assumption that you just glance at the clock with no prior information.
Where seconds count, you can watch the clock until the moment it ticks; at that instant you know the time with greater precision. And thanks to your own innate sense of time, that precision decays gradually over the next little while (maybe even 30s?) rather than disappearing instantly.
I like to synchronize my clocks this way (when a seconds display is unavailable). Yes, it does mean I invest up to an extra minute to set the time, and it's probably not worth it if you have to do it often (e.g. in an area prone to power outages).
The same technique is how computer RTCs are set and read, but to sub-second precision instead of minute precision: the most common register layout only stores whole seconds, yet the clock itself keeps relatively accurate time, so you wait for the seconds register to tick over and take your reference at that edge.
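A minimal sketch of the read side of that idea, assuming a hypothetical read_rtc_seconds() register accessor (the real interface depends on the chip and OS):

    /* Spin until the RTC's whole-seconds register ticks over, then grab a
     * monotonic timestamp. At that instant the RTC's fractional part is ~0,
     * so the RTC time is known to roughly the polling latency. */
    #include <stdint.h>
    #include <time.h>

    extern uint8_t read_rtc_seconds(void);  /* hypothetical register read */

    struct timespec rtc_tick_edge(void)
    {
        uint8_t start = read_rtc_seconds();
        while (read_rtc_seconds() == start)
            ;  /* wait for the "tick" */
        struct timespec now;
        clock_gettime(CLOCK_MONOTONIC, &now);
        return now;  /* RTC reads a whole second at this moment */
    }

Setting the RTC works the same way in reverse: wait until your reference time reaches a whole second, then write that second to the register.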