A professor starts a long computer run every week. The run is divided into 100 parts, each of which may take anywhere from a few seconds to several minutes. At the end of each part, the computer briefly displays how long that part took. A few hours into the run, the professor checks its progress: when the part currently executing finishes, the professor multiplies its displayed time by 100 to estimate the total time for the run.
The professor finds that this estimate is usually longer than the run's actual total time. Why?
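The phenomenon is easy to reproduce in simulation. Below is a minimal sketch of the professor's procedure, under assumptions the puzzle does not specify: the 100 part durations are drawn from an exponential distribution with a 60-second mean, and the progress check happens at a uniformly random moment during the run.

```python
import random

random.seed(1)

N_PARTS = 100          # parts per run, as in the puzzle
MEAN_SECONDS = 60.0    # assumed mean part duration (not given in the puzzle)
TRIALS = 10_000        # number of simulated weekly runs

overestimates = 0
for _ in range(TRIALS):
    # Assumed: part durations are exponentially distributed.
    parts = [random.expovariate(1.0 / MEAN_SECONDS) for _ in range(N_PARTS)]
    total = sum(parts)

    # Assumed: the professor checks at a uniformly random moment in the run.
    check = random.uniform(0.0, total)

    # Find the part executing at the moment of the check; the professor
    # waits for it to finish and reads its displayed duration.
    elapsed = 0.0
    for duration in parts:
        elapsed += duration
        if elapsed >= check:
            observed = duration
            break

    # The professor's estimate: the observed part's time times 100.
    estimate = observed * N_PARTS
    if estimate > total:
        overestimates += 1

print(f"estimate exceeded the actual total in "
      f"{overestimates / TRIALS:.1%} of {TRIALS} simulated runs")
```

Under these assumptions the simulated estimate lands above the true total far more often than below it; experimenting with different duration distributions is a good way to probe why.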