It is well known that a stopped clock gives the exact time twice a day, while a clock that gains or loses time may not be right more than once over a period of months.
My clever father adjusted his clock to give the correct time at least twice a day, while running at the normal rate.
Assuming he was not able to set it perfectly (a reasonable assumption), how did he do it?
The father could have added a periodic motion on top of the clock's normal motion, say in the form of a sine function with an amplitude of, say, 3 minutes either side of the true time. So, during the day the clock would be as much as 3 minutes ahead at some times and as much as 3 minutes behind at others, but it would also pass through the correct time a couple of times, and its average motion would indeed be the normal rate.
It would keep being right twice a day (or more often, if the cyclic interval were shorter than one day), until the inaccuracy of the "normal rate" drifted the 6-minute span completely away from the true time. Then the clock would need to be reset, but this is just like ordinary clocks, which need to be reset once in a while when they deviate too far from the actual time.
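The solution can be sketched numerically. The following is a minimal illustration (not part of the original post); it assumes the oscillation has a one-day period and a 3-minute amplitude, as suggested above, and counts how often the displayed time coincides with the true time over one day:

```python
import math

AMPLITUDE_MIN = 3.0   # offset amplitude: 3 minutes either side of true time
PERIOD_MIN = 24 * 60  # assumed cycle length: one full oscillation per day

def offset(t_min):
    """Clock's deviation from true time, in minutes, at true time t_min."""
    return AMPLITUDE_MIN * math.sin(2 * math.pi * t_min / PERIOD_MIN)

# Count how many times the displayed time crosses the true time in one day
# by looking for sign changes of the offset on a fine grid.
crossings = 0
step = 0.5  # minutes
t = 0.0
prev = offset(t)
while t < PERIOD_MIN:
    t += step
    cur = offset(t)
    if prev == 0 or prev * cur < 0:
        crossings += 1
    prev = cur

print(crossings)  # prints 2: the clock is exactly right twice a day
```

Halving `PERIOD_MIN` would make the clock right four times a day, matching the remark that a shorter cyclic interval gives more correct moments.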

Posted by Charlie
on 2005-06-24 13:20:27