[Python-Dev] Do any of the 4 multiprocessing methods work using shared memory?

2010-07-26 Thread Kevin Ar18

Brief Summary:
Can I share Python objects between multiple processes using shared memory 
(using the multiprocessing module)?
In particular, do Queues or Pipes work using shared memory?

Details:
* I have several processes each on a separate CPU core (so they run in 
parallel).
* I want to share certain Python objects between the processes using shared 
memory (I can handle the locking myself -- only one process accesses a 
variable or object at a time, so there is no conflict anyway).
* I strongly prefer that it be shared memory because I do not want to incur the 
costs of having to copy data back and forth between the processes.

I am aware of the multiprocessing module.  It offers:
Queues
Pipes
Shared Memory Map
Server Process

So, let me ask a few questions about those 4 items:

Queues & Pipes
These handle Python objects and variables. However, do they use shared 
memory, or do they require copying the data (or some other method that is 
more costly than shared memory)?

Shared Memory Map
Does not support Python objects or variables -- thus no good to me.
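
For reference, this is the sort of thing I mean -- Value and Array do live in 
shared memory, but as far as I can tell they only take ctypes-style data (a C 
int, an array of C doubles, and so on), not arbitrary Python objects:

from multiprocessing import Process, Value, Array

def worker(counter, samples):
    with counter.get_lock():
        counter.value += 1     # visible to the parent without any copying
    samples[0] = 3.14

if __name__ == "__main__":
    counter = Value('i', 0)    # a single shared C int
    samples = Array('d', 4)    # a shared array of 4 C doubles
    p = Process(target=worker, args=(counter, samples))
    p.start()
    p.join()
    print(counter.value, samples[0])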

Server Process
This appears to require an extra process just to handle the data exchange -- 
which, if true, adds overhead I would rather avoid.
Does it use shared memory, or does it copy data back and forth between 
processes?
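
To make the question concrete, here is a minimal sketch using a Manager (the 
dict contents are just an example); my understanding is that the dict lives 
in the separate manager process and every access goes through a proxy over a 
connection:

from multiprocessing import Process, Manager

def worker(shared):
    shared["result"] = 42      # sent over a connection to the manager process

if __name__ == "__main__":
    manager = Manager()        # starts the extra server process
    shared = manager.dict()
    p = Process(target=worker, args=(shared,))
    p.start()
    p.join()
    print(shared["result"])    # fetched back from the manager process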
  


[Python-Dev] zipfile module can't handle large files

2007-08-29 Thread Kevin Ar18

Create a zip file containing a file of several GB (say 2 or 5 GB).  The 
zipfile module will fail when attempting to extract that large file.

The issue is near line 490 in zipfile.py.  It appears that read() (a single 
file operation) cannot handle such a large amount of data in one call.  I 
tried editing zipfile.py so that read would proceed piece by piece, but I 
just got a memory error.

Does anyone know how to fix this limitation in the zipfile module?
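
For illustration, here is a sketch of the chunked, piece-by-piece extraction 
I am after ('big.zip' and 'big_member.bin' are placeholder names; it assumes 
a zipfile version that provides ZipFile.open(), which I believe is in 2.6, so 
the member can be streamed instead of read into memory all at once):

import shutil
import zipfile

zf = zipfile.ZipFile('big.zip', 'r')
src = zf.open('big_member.bin')            # file-like object, streams the member
dst = open('big_member.bin', 'wb')
shutil.copyfileobj(src, dst, 64 * 1024)    # copy in 64 KB chunks
dst.close()
src.close()
zf.close()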