You traveled three hours at a certain speed. If you had taken 1 minute less to do each mile, you would have gone 30 miles farther than you actually did.
How far did you go?
The answer is 60 miles. Here is the solution:
There might be a more intuitive way to think about this, but it's too early for that in my book =) Instead of thinking of the rates as x mph and y mph, I thought of them as 1 mile per t minutes and 1 mile per (t-1) minutes, since the faster pace takes one minute less per mile. So I decided to convert all of the time in the problem to minutes for simplicity.
R1 = 1 mile / t minutes
R2 = 1 mile / (t-1) minutes
180min*R1 = 180/t miles = D
180min*R2 = 180/(t-1) miles = D+30
180/D = t
180/(D+30) = t-1
180/(D+30) = 180/D - 1 = (180-D)/D
180D = (180-D)(D+30) = 180D + 5400 - D^2 - 30D
0 = D^2 + 30D - 5400
D = [-b +/- sqrt(b^2 - 4ac)]/(2a), with a = 1, b = 30, c = -5400
D = [-30 +/- sqrt(900 + 21600)]/2
D = (-30 +/- 150)/2 = 60 or -90, and only the positive root makes sense, so D = 60
So you traveled for 3 hours at a rate of 1 mile/3 min = 20 mph, covering a distance of 60 miles.
If you had taken one minute less to do each mile, you would have traveled for 3 hours at a rate of 1 mile/2 min = 30 mph, covering a distance of 90 miles, which is indeed 30 miles more than 60.
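For anyone who wants to sanity-check the arithmetic, here is a small Python sketch (the variable names are my own, not part of the puzzle) that solves the quadratic and reruns the consistency check:

import math

T_TOTAL = 180   # total travel time in minutes (3 hours)
EXTRA = 30      # extra miles covered at the faster pace

# From 180/(D+30) = 180/D - 1 we get D^2 + 30D - 5400 = 0
a, b, c = 1, EXTRA, -T_TOTAL * EXTRA
D = (-b + math.sqrt(b**2 - 4*a*c)) / (2*a)   # keep the positive root
print(D)                                     # 60.0 miles

t = T_TOTAL / D              # 3.0 minutes per mile, i.e. 20 mph
print(T_TOTAL / (t - 1))     # 90.0 miles at 2 min/mile, which is 30 more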
Posted by nikki on 2004-11-29 14:03:58