[time-nuts] Visual clock comparison

Bill Hawkins bill at iaxs.net
Sun Apr 19 18:10:45 EDT 2015

The question implies simultaneous observation, but it could instead be
resolved by observing the phase change over time.

All it requires is a means to start both second hands from the same
reference point at the same time. Once released, the error between them
will grow to something easily measured as the minutes go by.

Or you could take a picture of both clocks, wait some minutes and take
another picture.
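The arithmetic behind the two-photo method can be sketched as follows
(the function name and numbers are illustrative, not from the thread):

```python
# Sketch: estimate the fractional frequency offset between two clocks
# from two photographs taken some minutes apart.

def frequency_offset(delta_t1, delta_t2, elapsed):
    """Fractional frequency offset from the change in phase difference.

    delta_t1: difference (seconds) between the clocks in the first photo
    delta_t2: difference (seconds) in the second photo
    elapsed:  time (seconds) between the two photos
    """
    return (delta_t2 - delta_t1) / elapsed

# Example: the difference grows from 0.2 s to 0.5 s over 10 minutes,
# giving a fractional offset of 5e-4.
print(frequency_offset(0.2, 0.5, 600))
```

The longer you wait between photos, the smaller the offset you can
resolve with the same reading accuracy.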

Simultaneous observation is unlikely, because it takes time to move the
eye's focus from one clock to the other.

Bill Hawkins

-----Original Message-----
From: Chris Albertson
Sent: Sunday, April 19, 2015 1:24 PM

I think the question really was "How close must two visual clock
displays be to be perceived as being exactly in sync?"  Some people
(though not me) can see a 1/10 second difference, and to me a one
second difference is obvious.  The answer is likely between 0.1 and
1.0 seconds.  But if you add a "tick" every second, a 1/10 second
difference is very easy to hear, though most people can't hear a
1/40 second difference.
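Combining these perception thresholds with the drift method above, a
quick sketch (illustrative, not from the thread) of how long one must
wait before a given frequency offset becomes visible:

```python
# Sketch: seconds until two clocks that start in sync differ by a
# perceptible amount. The ~0.1 s visual threshold is the figure
# discussed in the thread; the offset value is illustrative.

def wait_for_visible_error(frac_offset, threshold=0.1):
    """Seconds until the accumulated error reaches `threshold`."""
    return threshold / frac_offset

# A clock off by 1e-4 (roughly 8.6 s/day) accumulates a 0.1 s error
# after 1000 seconds, about 17 minutes.
print(wait_for_visible_error(1e-4))
```

With the audible ~1/40 s threshold instead, the same offset would be
detectable in about 250 seconds.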
