Hey Guys!! What do you think of this approach?
First we take the starting number from each of the 100 files and sort them, which takes 100*log(100) and can be treated as a constant. Along with each number we store the file descriptor of the file it was extracted from.
Now treat the sorted array as a queue. Starting from the minimum element, go to its file and extract all the numbers that are smaller than the next minimum, storing or printing them. Then remove the minimum from the queue, update the minimum and next minimum, and keep a count of the number of elements printed so far. Continue this process until the queue is empty or count = K. If the queue is empty and count < K, rebuild the queue with the numbers currently pointed to by the file pointers and repeat the process. A rough sketch in Python is below.
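Here is a minimal sketch of what I mean, assuming each file simply holds ascending numbers, one per line; the paths list and the read_next helper are only for illustration:

def k_smallest(paths, k):
    # Sketch of the approach: print the k smallest numbers across the files.
    # Assumes every file holds ascending numbers, one per line.
    files = [open(p) for p in paths]
    pushed_back = {}                  # file -> number read ahead but not yet printed
    printed = 0

    def read_next(f):
        if f in pushed_back:
            return pushed_back.pop(f)
        line = f.readline()
        return float(line) if line else None

    while printed < k:
        # (Re)build the queue from the numbers each file pointer currently
        # points to -- on the first pass these are the starting numbers.
        queue = []
        for f in files:
            num = read_next(f)
            if num is not None:
                queue.append((num, f))
        if not queue:
            break                     # every file is exhausted
        queue.sort(key=lambda t: t[0])   # 100*log(100), treated as a constant

        for i, (num, f) in enumerate(queue):
            next_min = queue[i + 1][0] if i + 1 < len(queue) else float('inf')
            # Print the current minimum, then keep reading contiguously from
            # the same file while its values stay below the next minimum.
            while num is not None and num <= next_min and printed < k:
                print(num)
                printed += 1
                num = read_next(f)
            if num is not None:
                pushed_back[f] = num  # remember the value read ahead for the next rebuild

    for f in files:
        f.close()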
Pros: since the OS does block caching while reading a file, printing contiguous elements from the same file is a major advantage.
In the worst case, when the numbers across the files are sorted, the time complexity would be k*100*log(100), which is also the case with the implementations above.
Please correct me if I am wrong.
Correction: the worst-case time complexity is k*log(100).
- vicky22291 May 09, 2014