A computer can process 100 records of data in 200 milliseconds. If a millisecond is 1/1000 of a second, how many seconds does it take to process a file of 1000 records?

Solution 1 (using step-by-step reasoning):

• If it takes 200 milliseconds to process 100 records, it will take 10 times as long (2000 milliseconds) to process a file of 1000 records.
• 2000 milliseconds is the same as 2 seconds, because 1000 milliseconds (each 1/1000 of a second) make up one second.

Thus, it takes 2 seconds to process the file.
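
A minimal Python sketch of the same arithmetic (the numbers come from the problem statement; the variable names are illustrative only):

    records_per_batch = 100      # the computer processes 100 records...
    batch_ms = 200               # ...in 200 milliseconds
    file_records = 1000          # size of the file to process

    scale = file_records / records_per_batch   # 10 times as many records
    total_ms = batch_ms * scale                # 2000 milliseconds
    total_seconds = total_ms / 1000            # 1000 milliseconds per second
    print(total_seconds)                       # prints 2.0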

Notice: The above reasoning used reciprocals of the information provided as input.
• If 100 records are processed in 200 milliseconds,
then it takes 200 milliseconds per 100 records, or 2 milliseconds per record.
• If a millisecond is 1/1000 second, then one second is 1000 milliseconds.
Thinking in reciprocal terms makes the reasoning easier.
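
The same answer falls out directly if the reciprocal quantities are computed first; a short sketch, again with illustrative names:

    ms_per_record = 200 / 100    # reciprocal of the given rate: 2 milliseconds per record
    ms_per_second = 1000         # reciprocal of "a millisecond is 1/1000 of a second"
    seconds = 1000 * ms_per_record / ms_per_second   # 1000 records -> 2.0 seconds
    print(seconds)               # prints 2.0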
