I'm relatively certain it's possible, but then you have to deal with
locks, semaphores, synchronization, etc...
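
A minimal sketch of what that synchronization looks like (the file name
and chunk size here are made up); every thread has to take a lock before
touching the shared file handle, and the GIL serializes the parsing
anyway, so threads buy you very little:

    import threading

    lock = threading.Lock()
    fh = open('big.csv')

    def worker(out):
        while True:
            with lock:                      # guard the shared file handle
                lines = [fh.readline() for _ in range(1000)]
            lines = [l for l in lines if l]
            if not lines:
                return
            out.extend(float(x) for l in lines for x in l.split(','))

    results = [[] for _ in range(4)]
    threads = [threading.Thread(target=worker, args=(r,)) for r in results]
    for t in threads:
        t.start()
    for t in threads:
        t.join()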
On Thu, Jul 2, 2009 at 12:04 PM, Sebastian Haase wrote:
> On Thu, Jul 2, 2009 at 5:38 PM, Chris Colbert wrote:
>> Who are you quoting, Sebastian?
>>
>> Multiprocessing is a python package that spawns multiple python
>> processes, effectively side-stepping the GIL, and provides easy
>> mechanisms for IPC. Hence the need for serialization.
On Thu, Jul 2, 2009 at 5:38 PM, Chris Colbert wrote:
> Who are you quoting, Sebastian?
>
> Multiprocessing is a python package that spawns multiple python
> processes, effectively side-stepping the GIL, and provides easy
> mechanisms for IPC. Hence the need for serialization.
>
I was replying to the ...
Who are you quoting, Sebastian?
Multiprocessing is a python package that spawns multiple python
processes, effectively side-stepping the GIL, and provides easy
mechanisms for IPC. Hence the need for serialization.
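
A tiny sketch (mine, not from the thread) of that IPC cost: the child
pushes an array through a Queue, and multiprocessing pickles it on one
side and unpickles it on the other, which is exactly where the
serialization overhead comes from:

    import numpy as np
    from multiprocessing import Process, Queue

    def worker(q):
        # the array is pickled to cross the process boundary
        q.put(np.arange(1000000, dtype=np.float64))

    if __name__ == '__main__':
        q = Queue()
        p = Process(target=worker, args=(q,))
        p.start()
        a = q.get()      # unpickled back into this process
        p.join()
        print(a.shape)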
On Thu, Jul 2, 2009 at 11:30 AM, Sebastian Haase wrote:
> On Thu, Jul 2, 2009 at 5:14 PM, Chris Colbert wrote:
>> can you hold the entire file in memory as a single array with room to spare?
On Thu, Jul 2, 2009 at 5:14 PM, Chris Colbert wrote:
> can you hold the entire file in memory as a single array with room to spare?
> If so, you could use multiprocessing and load a bunch of smaller
> arrays, then join them all together.
>
> It won't be super fast, because serializing a numpy array is somewhat
> slow when using multiprocessing.
can you hold the entire file in memory as a single array with room to spare?
If so, you could use multiprocessing and load a bunch of smaller
arrays, then join them all together.
It won't be super fast, because serializing a numpy array is somewhat
slow when using multiprocessing. That said, it's still ...
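
A rough sketch of that chunk-and-join idea (file name, chunk size, and
pool size are all made up, and it assumes plain comma-separated floats;
loadtxt accepts a list of lines as well as a file):

    import numpy as np
    from multiprocessing import Pool
    from itertools import islice

    def parse(lines):
        # each worker turns its slab of text into a small 2-D array
        return np.loadtxt(lines, delimiter=',', ndmin=2)

    if __name__ == '__main__':
        f = open('big.csv')
        chunks = iter(lambda: list(islice(f, 100000)), [])
        arrays = Pool(4).map(parse, chunks)   # results pickled back
        f.close()
        data = np.concatenate(arrays)         # join them all together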
Is it possible to use loadtxt in a multi-threaded way? Basically, I want
to process a very large CSV file (100+ million records): instead of
loading the whole file at once, load a thousand elements into a buffer,
process them, then load another thousand and process those, and so on...
I was wondering if there is a technique where ...
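
Something along the lines of this sketch, perhaps (the file name is made
up and process() is just a placeholder for whatever has to happen to
each batch of records):

    import numpy as np
    from itertools import islice

    def process(block):
        pass   # hypothetical per-chunk processing

    f = open('big.csv')
    while True:
        lines = list(islice(f, 1000))   # pull the next thousand records
        if not lines:
            break
        block = np.loadtxt(lines, delimiter=',', ndmin=2)
        process(block)
    f.close()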