Thanks, Walter and Steven, for the insight. I guess I will post my
question to the main Python mailing list and see if people have anything
to say.

-Abhi
On Mon, Mar 26, 2012 at 3:28 PM, Walter Prins wrote:
> Abhi,
>
> On 26 March 2012 19:05, Abhishek Pratap wrote:
>> I want to utilize the power of cores on my server and read big files
>> (> 50 GB) simultaneously by seeking to N locations. [...]
Steven wrote:

Abhishek Pratap wrote:
> Hi Guys
>
> I want to utilize the power of cores on my server and read big files
> (> 50 GB) simultaneously by seeking to N locations.

Yes, you have many cores on the server. But how many hard drives is each
file on? If all the files are on one disk, then you will *kill*
performance: the drive head will thrash back and forth between your N
seek positions instead of streaming the data sequentially.
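
If the bottleneck really is a single spinning disk, the usual workaround
is to keep the reads sequential and parallelize only the processing. A
minimal, untested sketch of that pattern, assuming the per-chunk work is
CPU-bound (process_chunk and big.dat are placeholders, not names from
this thread):

from multiprocessing import Pool

CHUNK_SIZE = 64 * 1024 * 1024  # 64 MB per read; tune to the workload

def process_chunk(data):
    # hypothetical placeholder for the real per-chunk computation
    return len(data)

def chunks(path):
    # one sequential reader: the disk head never has to jump around
    with open(path, "rb") as f:
        while True:
            data = f.read(CHUNK_SIZE)
            if not data:
                break
            yield data

if __name__ == "__main__":
    with Pool() as pool:
        # imap pulls chunks from the single reader while the worker
        # processes crunch earlier chunks in parallel
        for result in pool.imap(process_chunk, chunks("big.dat")):
            print(result)

Here only one process ever touches the disk, so the drive can stream at
full speed while the pool keeps all the cores busy.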
Abhishek Pratap wrote:

Hi Guys

I want to utilize the power of cores on my server and read big files
(> 50 GB) simultaneously by seeking to N locations. Process each
separate chunk and merge the output. Very similar to the MapReduce
concept.

What I want to know is the best way to read a file concurrently. I
have read about [...]
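
For reference, a rough, untested sketch of the seek-based approach being
asked about, assuming the file is split into a fixed number of byte-range
slices (read_chunk, big.dat and N are illustrative placeholders; real
code would also need to realign slice boundaries, e.g. to newlines):

import os
from multiprocessing import Pool

FILENAME = "big.dat"   # placeholder path
N = 4                  # number of slices / worker processes

def read_chunk(args):
    offset, length = args
    # each worker opens its own handle and seeks to its slice
    with open(FILENAME, "rb") as f:
        f.seek(offset)
        return f.read(length)

if __name__ == "__main__":
    size = os.path.getsize(FILENAME)
    step = size // N
    # the last slice absorbs any remainder
    ranges = [(i * step, step if i < N - 1 else size - i * step)
              for i in range(N)]
    with Pool(N) as pool:
        pieces = pool.map(read_chunk, ranges)   # the "map" step
    merged = b"".join(pieces)                   # the "merge" step

The pool.map call is the "map" step and the final join is the "merge",
mirroring the MapReduce shape described above. In practice each worker
would process its slice and return a small result rather than shipping
gigabytes of raw bytes back through the pool.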