A computer can process 100 records of data in 200 milliseconds. If a millisecond is 1/1000 of a second, how many seconds does it take to process a file of 1000 records?
Solution 2 (using unit analysis)
Use the unit conversion technique presented later in this chapter.
• One millisecond is 1/1000 of a second, which gives two reciprocal conversion factors: 1 / 1000 (seconds / millisecond) and 1000 (milliseconds / second).
• 100 records processed in 200 milliseconds gives two reciprocal factors: 100 / 200 (records / millisecond) and 200 / 100 (milliseconds / record).
• 1000 records in the file gives two reciprocal factors: 1000 (records / file) and 1 / 1000 (files / record).
• Now, select the factors whose units cancel to yield the number of seconds needed to process the file:
x (seconds / file) = 1 / 1000 (seconds / millisecond)
* 200 / 100 (milliseconds / record)
* 1000 (records / file)
x (seconds / file) = (1 / 1000) * (200 / 100) * 1000 (seconds / file)
= 2 (seconds / file)
Thus, it takes 2 seconds to process the file.
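As a sanity check, here is a minimal Python sketch of the same chain of conversion factors (the variable names are illustrative, not from the text):

    # Conversion factors taken from the problem statement.
    seconds_per_millisecond = 1 / 1000    # one millisecond is 1/1000 of a second
    milliseconds_per_record = 200 / 100   # 100 records take 200 milliseconds
    records_per_file = 1000               # the file holds 1000 records

    # Multiply so the units cancel: (s/ms) * (ms/record) * (records/file) = s/file.
    seconds_per_file = (seconds_per_millisecond
                        * milliseconds_per_record
                        * records_per_file)
    print(seconds_per_file)  # 2.0

The point of the technique is that the arithmetic and the unit cancellation are carried out together, so a wrong choice of factor shows up as units that fail to cancel.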