Computer Run (Posted on 2024-06-08) Difficulty: 3 of 5
A professor makes a long computer run every week. The run is divided into 100 parts, each of which may take anywhere from a few seconds to several minutes. At the end of each part, the computer briefly displays how long that part took. After a few hours the professor checks the run's progress. When the current part finishes, the professor multiplies its time by 100 to estimate the total time for the run.
The professor finds that this estimated time is usually longer than the actual time. Why?

No Solution Yet | Submitted by K Sengupta
No Rating

Comments:
Some Thoughts: The reason | Comment 1 of 2
Suppose the time the k-th part takes to run is Tk, for k = 1 to 100, and suppose the total time for all 100 parts is 10 hours. Then the fraction of the run that the computer spends on the k-th part is Tk/(10 hours), not 1/100.
So although the professor checks at a random time, the check is more likely to catch the computer on a long part than on a short one. The sampled part is therefore longer on average than a typical part, and 100 times its duration usually overestimates the total.
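A quick Monte Carlo sketch of this length-biased sampling effect (my own illustration, not part of the comment above; the part times are drawn from an assumed exponential distribution, but any mix of short and long parts shows the same bias):

    import random

    random.seed(1)

    TRIALS = 10_000
    overestimates = 0

    for _ in range(TRIALS):
        # 100 part times; exponential is just an assumption here
        parts = [random.expovariate(1.0) for _ in range(100)]
        total = sum(parts)

        # The professor checks at a uniformly random moment, so the
        # chance of landing in part k is parts[k] / total -- long
        # parts are proportionally more likely to be sampled.
        check = random.uniform(0.0, total)
        elapsed = 0.0
        for t in parts:
            elapsed += t
            if check <= elapsed:
                current = t
                break

        # The professor's estimate: 100 x (time of the sampled part)
        if 100 * current > total:
            overestimates += 1

    print(f"estimate > actual in {overestimates / TRIALS:.0%} of runs")

Under this exponential assumption the estimate exceeds the true total in roughly three-quarters of the simulated runs, matching the puzzle's "usually longer".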
  Posted by Larry on 2024-06-08 09:33:07