Thanks for the references to these libraries - they seem to fix my problem!
Cheers,
Simon
On Thu, Jun 17, 2010 at 2:58 PM, davide wrote:
> You may have a look at the nice python-h5py module, which gives an OO
> interface to the underlying HDF5 file format. I'm using it for storing
> large amounts (~10 GB) of experimental data. Very fast, very convenient.
You may have a look at the nice python-h5py module, which gives an OO
interface to the underlying HDF5 file format. I'm using it for storing
large amounts (~10 GB) of experimental data. Very fast, very convenient.
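A rough, untested sketch of that kind of usage (the file name, dataset name, shapes and block size below are only placeholders):

import numpy as np
import h5py

# Write the dataset to the HDF5 file block by block, so the full
# array never has to sit in memory at once.
with h5py.File("experiment.h5", "w") as f:
    dset = f.create_dataset("data", shape=(100000, 100),
                            dtype="float64", chunks=True)
    for start in range(0, 100000, 10000):
        dset[start:start + 10000] = np.random.rand(10000, 100)

# Read back only the slice that is needed; h5py returns it as a
# plain numpy array.
with h5py.File("experiment.h5", "r") as f:
    block = f["data"][20000:30000]
    print(block.shape, block.dtype)

Writing and reading in fixed-size blocks like this keeps memory use bounded no matter how large the file on disk grows.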
Ciao
Davide
On Thu, 2010-06-17 at 08:33 -0400, greg whittier wrote:
On Thu, Jun 17, 2010 at 4:21 AM, Simon Lyngby Kokkendorff wrote:
> [...] memory errors. Is there a way to get numpy to do what I want, using an
> internal platform independent numpy-format like .npy, or do I have to wrap a
> custom file reader with something like ctypes?
You might give PyTables (http://www.pytables.org) a try.
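A rough sketch with the current PyTables API (older releases spell the same calls in camelCase, e.g. openFile/createEArray); the file name and node name are only placeholders:

import numpy as np
import tables

# Create an extendable array on disk and append blocks to it,
# keeping only one block in memory at a time.
with tables.open_file("experiment.h5", mode="w") as f:
    earr = f.create_earray(f.root, "data",
                           atom=tables.Float64Atom(),
                           shape=(0, 100))   # extendable along axis 0
    for _ in range(10):
        earr.append(np.random.rand(10000, 100))

# Slicing the EArray reads just that block from disk.
with tables.open_file("experiment.h5", mode="r") as f:
    block = f.root.data[20000:30000]
    print(block.shape)

An EArray is extendable along its first axis, so data can be appended block by block and later sliced straight from disk.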
Hi list,
I am new to this list, so forgive me if this is a trivial problem;
I would appreciate any help.
I am using numpy to work with large amounts of data - sometimes too much
to fit into memory. Therefore I want to be able to store data in binary
files and use numpy to read chunks of these files into memory.
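For what the question describes, numpy on its own can also memory-map .npy files, so that only the parts of the array that are actually touched get read from disk. A minimal sketch, with placeholder file name and shape:

import numpy as np

# Create a large .npy file on disk without holding it all in memory,
# filling it block by block through a memory map.
data = np.lib.format.open_memmap("data.npy", mode="w+",
                                 dtype="float64", shape=(100000, 100))
for start in range(0, 100000, 10000):
    data[start:start + 10000] = np.random.rand(10000, 100)
data.flush()
del data

# Re-open the file memory-mapped: only the pages that are actually
# accessed are read from disk.
chunk = np.load("data.npy", mmap_mode="r")[20000:30000]
print(chunk.shape, chunk.dtype)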