A particle is travelling from point A to point B. These two points are separated by distance D. Assume that the initial velocity of the particle is zero.
Given that the particle never increases its acceleration along its journey, and that the particle arrives at point B with speed V, what is the longest time that the particle can take to arrive at B?
I don't have the solution, but it seems like it might be possible to find a motion that takes longer than the constant-acceleration one. Here's what I found so far.
The longest time occurs when the particle first reaches speed V exactly at point B. If it went faster than V at any time before the end point, it would have travelled a larger distance in the same amount of time. The main reason is that the speed can never dip back below V once it has reached it: the particle has to end at V, and it cannot accelerate again after it has started decelerating.
For the constant-acceleration case (a0 = V^2/(2D)), the time taken is 2D/V, as can be seen in the previous comments: a = a0, v = a0*t, x = 0.5*a0*t^2.
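As a quick sanity check on those constant-acceleration formulas, here's a small script of my own (the values of D and V are arbitrary; only the relations between a0, T, v, and x matter):

```python
# Check the constant-acceleration case: with a0 = V^2/(2D),
# the kinematic formulas v = a0*t and x = 0.5*a0*t^2 should
# give v = V and x = D at time T = 2D/V.

D = 10.0   # arbitrary distance
V = 4.0    # arbitrary arrival speed

a0 = V**2 / (2 * D)   # constant acceleration
T = 2 * D / V         # claimed arrival time

v_at_T = a0 * T             # speed at time T
x_at_T = 0.5 * a0 * T**2    # distance covered by time T

print(v_at_T, x_at_T)
```

Both printed values match V and D, so the constant-acceleration case is internally consistent.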
Now suppose the acceleration is some function a(t) with a(0) = a0 (otherwise it's meaningless to compare with the constant case). From the restriction that the acceleration never increases, a'(t) <= 0, so a(t) <= a0 for all t, and the velocity function is concave down. Also, by integrating, v <= a0*t and x <= 0.5*a0*t^2.
Basically, we're left with finding a function v(t) with the following criteria:
1. Concave down.
2. v(t) <= a0*t, where a0 = V^2/(2D).
3. v(T) = V for some T > 2D/V.
4. The integral of v from 0 to T equals D.
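To get a feel for why these criteria clash, here's a numerical experiment of my own (a sketch, not a proof). The idea: a concave-down v with v(0) = 0 and v(T) = V lies on or above the straight chord V*t/T, so its integral should be at least V*T/2, which already exceeds D whenever T > 2D/V. The test profiles v(t) = V*(t/T)^p are my own arbitrary choices of concave-down curves through those endpoints:

```python
# Numerically check criterion 4 against concave-down velocity profiles
# when T > 2D/V. Values of D, V, and the factor 1.2 are arbitrary.

D, V = 10.0, 4.0           # arbitrary test values
T = 1.2 * (2 * D / V)      # a candidate time longer than 2D/V

def distance(v, T, n=100000):
    """Trapezoidal estimate of the integral of v(t) over [0, T]."""
    h = T / n
    return h * (sum(v(i * h) for i in range(1, n)) + 0.5 * (v(0) + v(T)))

# Concave-down profiles with v(0) = 0 and v(T) = V: v(t) = V*(t/T)**p, 0 < p <= 1.
for p in (1.0, 0.7, 0.5, 0.3):
    d = distance(lambda t: V * (t / T) ** p, T)
    print(p, round(d, 3), d > D)
```

Every profile I tried overshoots D, which matches the graphical intuition that criterion 4 can't be satisfied together with the others.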
Graphically, this seems impossible, but I can't prove that. Anyone able to show that?
Posted by np_rt on 2004-05-09 12:36:57