Re: [Numpy-discussion] fast numpy i/o

2011-06-21 Thread Simon Lyngby Kokkendorff
Hi, I have been using h5py a lot (both on Windows and Mac OS X) and can only recommend it; I haven't tried the other options, though. Cheers, Simon

On Tue, Jun 21, 2011 at 8:24 PM, Derek Homeier <de...@astro.physik.uni-goettingen.de> wrote:
> On 21.06.2011, at 7:58PM, Neal Becker wrote:
> …
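For readers skimming the archive: a minimal sketch of the h5py round trip recommended above (the file name and dataset name are illustrative assumptions, not from the thread):

    import numpy as np
    import h5py

    # Write a numpy array to an HDF5 file (names are illustrative).
    data = np.random.rand(1000, 3)
    with h5py.File("example.h5", "w") as f:
        f.create_dataset("measurements", data=data)

    # Read it back; h5py datasets support numpy-style slicing, so only
    # the requested rows are loaded into memory.
    with h5py.File("example.h5", "r") as f:
        first_rows = f["measurements"][:100]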

Re: [Numpy-discussion] What Requires C and what is just python

2011-03-21 Thread Simon Lyngby Kokkendorff
Hi Ben, It's very easy to package numpy (and most other modules) with py2exe, which, as Dan mentioned above, will include all the necessary (also non-Python) libraries in a dist folder. The folder to distribute can of course get quite large if you include a lot of libraries, but I think that onl…
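A minimal setup.py sketch of the py2exe approach described above (the script name and options are illustrative assumptions; listing numpy under "packages" helps py2exe pick up its compiled extension modules):

    # setup.py
    from distutils.core import setup
    import py2exe  # registers the "py2exe" command with distutils

    setup(
        console=["my_script.py"],  # the script that imports numpy
        options={"py2exe": {"packages": ["numpy"]}},
    )

Running "python setup.py py2exe" then produces the dist folder mentioned in the post.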

Re: [Numpy-discussion] How to limit the numpy.memmap's RAM usage?

2010-10-25 Thread Simon Lyngby Kokkendorff
Hi List, I had similar problems on Windows. I tried to use memmaps to buffer a large amount of data and process it in chunks, but whenever I tried this I always ended up filling RAM completely, which led to crashes of my Python script with a MemoryError. This led me to conside…
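For context, a sketch of the chunked-memmap pattern being described (the file name, shape, dtype, and reduction are illustrative assumptions; processing one slice at a time keeps the resident working set small and lets the OS page data in and out):

    import numpy as np

    n_rows, n_cols, chunk = 10**7, 4, 10**5
    mm = np.memmap("big_data.bin", dtype=np.float64, mode="r",
                   shape=(n_rows, n_cols))

    totals = np.zeros(n_cols)
    for start in range(0, n_rows, chunk):
        block = np.asarray(mm[start:start + chunk])  # copy slice into RAM
        totals += block.sum(axis=0)
    del mm  # drop the reference so the mapping can be released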

Re: [Numpy-discussion] Accessing data in a large file

2010-06-17 Thread Simon Lyngby Kokkendorff
…ing
> large amounts (~10Gb) of experimental data. Very fast, very convenient.
>
> Ciao
> Davide
>
> On Thu, 2010-06-17 at 08:33 -0400, greg whittier wrote:
> > On Thu, Jun 17, 2010 at 4:21 AM, Simon Lyngby Kokkendorff wrote:
> > > memory errors. Is…

[Numpy-discussion] Accessing data in a large file

2010-06-17 Thread Simon Lyngby Kokkendorff
Hi list, I am new to this list, so forgive me if this is a trivial problem; I would appreciate any help. I am using numpy to work with large amounts of data, sometimes too much to fit into memory. Therefore I want to be able to store data in binary files and use numpy to read chunks…
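One standard way to do this, sketched here with an illustrative file name, dtype, and chunk size (not from the original post): np.fromfile accepts an open file object and a count, and reads from the current file position, so a raw binary file can be consumed chunk by chunk without ever holding it all in memory.

    import numpy as np

    chunk_items = 10**6  # number of values to read per chunk
    with open("data.bin", "rb") as f:
        while True:
            chunk = np.fromfile(f, dtype=np.float64, count=chunk_items)
            if chunk.size == 0:
                break  # end of file reached
            print(chunk.mean())  # process the chunk here

np.memmap (discussed in the threads above) is the other common option, better suited when random access rather than sequential chunking is needed.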