A car is travelling in one direction at 80 km/h and a fly is coming from the opposite direction at 5 km/h. (So its velocity is -5 km/h, since it is moving in the opposite direction.)
The fly hits the windshield of the car and is now travelling at 80 km/h. In order for the fly to reach 80 km/h, its velocity must have changed from -5 km/h to +80 km/h, meaning it must pass through zero. Therefore, if the fly's velocity passes through 0 km/h while the car is in contact with the fly, the car's velocity must also pass through 0 km/h.
This seems to mean that every time a car is hit by a fly, the car comes to a complete stop. Why is this not so?
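A minimal sketch of the reasoning above, assuming a ground frame with the car's direction taken as positive; the collision duration and the linear velocity ramp are made-up illustrative choices, not claims about how a real impact proceeds:

```python
# Ground frame; the car's direction of travel is taken as positive.
v_car = 80.0    # km/h
v_fly = -5.0    # km/h (opposite direction, hence the negative sign)

# Hypothetical collision: the fly's velocity ramps continuously from
# -5 km/h up to +80 km/h over N small steps (a made-up linear profile).
N = 1000
profile = [v_fly + (v_car - v_fly) * i / N for i in range(N + 1)]

# Since the profile is continuous and goes from negative to positive,
# it must cross zero somewhere (intermediate value theorem) -- this is
# exactly the step the question leans on.
crossed_zero = any(a <= 0.0 <= b for a, b in zip(profile, profile[1:]))
print(crossed_zero)  # True
```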
It seems to me that there is a fallacy in the problem itself. The fly has a velocity of 5 km/h and the car has a velocity of 80 km/h, so where does -5 km/h come from? If they are approaching each other, then relative to the car the fly has a velocity of 85 km/h (and the car has the same velocity relative to the fly, for that matter). "Relative" is the operative word here: neither the fly nor the car had a negative velocity to begin with. I haven't come up with a solution to this one, but I feel it is poorly stated.
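For reference, here is one consistent way to write down both sets of numbers, a sketch assuming the ground frame and the sign convention used in the question; the 85 km/h closing speed mentioned above comes out of the same arithmetic:

```python
# Ground frame; the car's direction of travel is taken as positive
# (this is the question's sign convention, not a measured fact).
v_car = 80.0   # km/h in the ground frame
v_fly = -5.0   # km/h in the ground frame (moving the opposite way)

# Relative velocities: subtract the observer's own velocity.
v_fly_seen_from_car = v_fly - v_car   # -85.0 km/h -> closing at 85 km/h
v_car_seen_from_fly = v_car - v_fly   # +85.0 km/h -> closing at 85 km/h

print(v_fly_seen_from_car, v_car_seen_from_fly)
```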