Computer Time Problem
- If a computer program has a loop in it, the length of time it takes the computer to run the program varies linearly with the number of times it must go through the loop. Suppose a computer takes 8 seconds to run a given program when it goes through the loop 100 times, and 62 seconds when it loops 1000 times. (A worked sketch of the model and a plotting example follow the question list.)
- a. Write the particular equation expressing seconds in terms of loops.
- b. Predict the length of time needed for 30 loops; for 10,000 loops.
- c. Suppose the computer takes 23 seconds to run the program. How many times does it go through the loop?
- d. How long does it take the computer to run the rest of the program, excluding the loop? What part of the mathematical model tells you this?
- f. Plot the graph of this function.
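
The particular equation in part (a) follows from the two given data points; here is a sketch of one derivation using the usual slope-intercept setup (a worked sketch, not an official answer key). Let t be the run time in seconds and n the number of times through the loop:

    m = \frac{62 - 8}{1000 - 100} = \frac{54}{900} = 0.06 \text{ seconds per loop}

    8 = 0.06(100) + b \quad\Rightarrow\quad b = 2, \qquad \text{so} \qquad t = 0.06n + 2

Checking the remaining parts against this model: t(30) = 0.06(30) + 2 = 3.8 seconds and t(10{,}000) = 602 seconds for part (b); solving 23 = 0.06n + 2 gives n = 350 loops for part (c); and the t-intercept, 2 seconds at n = 0, is the time for the rest of the program in part (d), since it is the run time with zero trips through the loop.

For part (f), a minimal plotting sketch in Python using matplotlib, assuming the model t = 0.06n + 2 derived above (the function and label names are illustrative):

    import numpy as np
    import matplotlib.pyplot as plt

    # Model sketched above for part (a): run time in seconds for n loops.
    def seconds(n):
        return 0.06 * n + 2

    n = np.linspace(0, 1000, 200)                   # loop counts to plot
    plt.plot(n, seconds(n), label="t = 0.06n + 2")  # the linear model
    plt.scatter([100, 1000], [8, 62], color="red", label="given data points")
    plt.xlabel("Times through the loop (n)")
    plt.ylabel("Run time t (seconds)")
    plt.title("Computer Time Problem")
    plt.legend()
    plt.show()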