Driving along the highway, Mr. Smith notices that signs for Flatz beer appear to be spaced at regular intervals along the roadway.
He counts the number of signs he passes in one minute and finds that this number multiplied by 10 gives the car’s speed in miles per hour.
Assuming that the signs are equally spaced, that the car's speed is constant, and that the timed minute began and ended with the car midway between two signs, what is the distance from one sign to the next?
Martin Gardner, 1956
For convenience's sake, suppose the speed is 60 miles per hour. Then he travels 1 mile in one minute. If he passes 6 signs we have a solution, since 6*10=60. So the signs are 1/6 mile apart.
If the speed were 30 miles per hour, he'd cover only half the distance and so pass only 3 signs. This still works, since 3*10=30.
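More generally, the answer doesn't depend on the speed at all. Here is a quick sketch of the general argument (writing v for the speed in mph and d for the spacing in miles, labels I'm introducing just for this derivation): in one minute the car covers v/60 miles and, since the minute begins and ends midway between signs, passes exactly (v/60)/d signs, so

$$v = 10\cdot\frac{v/60}{d} \quad\Longrightarrow\quad d = \frac{10}{60} = \frac{1}{6}\ \text{mile}.$$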
If the timed minute began, but did not end, with the car midway between signs, there would be an error of up to +/- 5 mph.
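To see where the 5 mph comes from (writing n for the number of signs counted and keeping d = 1/6 mile, again my own labels): starting midway between signs, the distance covered in the minute lies between (n - 1/2)d and (n + 1/2)d, so the true speed satisfies

$$60\left(n-\tfrac{1}{2}\right)d \le \text{speed} < 60\left(n+\tfrac{1}{2}\right)d \;\Longrightarrow\; 10n - 5 \le \text{speed} < 10n + 5,$$

while the estimate is 10n.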
Edit: I catch a lot of typos before I hit send. Not all of them.
Edited on April 27, 2016, 8:27 pm
Posted by Jer on 2016-04-27 13:14:08