You need to go from point A to point B, and then back to point A. Points A and B are 20 miles apart. You go to point B at a constant speed of 15 miles per hour. If you want your overall speed to be 30 miles per hour, how fast would you have to go from point B back to point A?
(In reply to solution)
Unfortunately, speeds do not average linearly over distance. Traveling distances D1 and D2 at speeds R1 and R2 takes times T1 = D1/R1 and T2 = D2/R2, respectively. The average speed is thus (D1+D2)/(T1+T2) = 1/(W1/R1 + W2/R2), where W1 = D1/(D1+D2) and W2 = D2/(D1+D2). For this problem W1 = W2 = 1/2, so the average speed is 1/(1/(2R1) + 1/(2R2)), which is known as the "harmonic mean" of R1 and R2 -- it is the reciprocal of the ordinary mean of the reciprocals. Thus with R1 = 15 and R2 = 45 (the return speed you would get by averaging linearly), the average speed would be 1/(1/30 + 1/90) = 1/(4/90) = 22.5, not 30. Moreover, since 1/(2R2) > 0, it is clear that 1/(1/(2R1) + 1/(2R2)) < 1/(1/(2R1)) = 2R1, so no finite return speed can bring the average up to 2R1 = 30 -- to get the average speed to equal 2R1 would require R2 = "infinity."
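For concreteness, here is a quick numerical check of the argument (my own sketch, not part of the original post); exact rational arithmetic makes the harmonic-mean result and the time budget easy to see:

```python
from fractions import Fraction

def average_speed(d1, r1, d2, r2):
    """Overall speed = total distance / total time, with T = D/R per leg."""
    return (d1 + d2) / (d1 / r1 + d2 / r2)

# Returning at 45 mph (the naive linear average) yields only 22.5 mph overall:
avg = average_speed(Fraction(20), Fraction(15), Fraction(20), Fraction(45))
print(avg)  # 45/2, i.e. 22.5 mph

# The 15 mph outbound leg already consumes the entire time budget for a
# 30 mph round trip: 40 mi / 30 mph = 20 mi / 15 mph = 4/3 hours.
print(Fraction(40, 30), Fraction(20, 15))  # 4/3 4/3
```

Since the full time budget is spent before the return leg even starts, the required return speed is indeed unbounded.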
Edited on March 12, 2004, 4:30 pm
Posted by Richard
on 2004-03-10 20:17:40