Carcass wrote:
One computer can upload 100 megabytes worth of data in 6 seconds. Two computers, including this one, working together, can upload 1300 megabytes worth of data in 42 seconds. How long would it take for the second computer, working on its own, to upload 100 megabytes of data?
(A) 6
(B) 7
(C) 9
(D) 11
(E) 13
Computer A uploads \(\frac{100}{6}\) MB per second.
Working together, computers A and B upload \(\frac{1300}{42}\) MB per second, so if \(x\) is B's rate, \(\frac{100}{6} + x = \frac{1300}{42}\).
So B uploads \(\frac{1300}{42} - \frac{100}{6} = \frac{600}{42} = \frac{100}{7}\) MB per second.
Therefore, B needs \(100 \times \frac{7}{100} = 7\) seconds to upload 100 MB on its own.
The answer is (B).
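The rate arithmetic above can be double-checked with exact fractions; here is a quick sketch using Python's standard `fractions` module:

```python
from fractions import Fraction

# Rates as exact fractions (MB per second) to avoid rounding error
rate_a = Fraction(100, 6)           # computer A alone: 100 MB in 6 s
rate_combined = Fraction(1300, 42)  # both together: 1300 MB in 42 s

rate_b = rate_combined - rate_a     # B's rate alone
time_b = Fraction(100) / rate_b     # time for B to upload 100 MB

print(rate_b)   # 100/7 MB per second
print(time_b)   # 7 seconds
```

Using `Fraction` rather than floats keeps the subtraction exact, so the 7-second result is not an artifact of rounding.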