Re: [Numpy-discussion] saving incrementally numpy arrays

2009-08-12 Thread Juan Fiol
Thanks David, I'll look into it now. Regarding the allocation/deallocation times, I think that is not an issue for me. The chunks are generated by a Fortran routine that takes several minutes to run (I am collecting a few thousand points before saving to disk). They are approximately the same size[…]

Re: [Numpy-discussion] saving incrementally numpy arrays

2009-08-12 Thread David Warde-Farley
On 12-Aug-09, at 7:11 PM, Juan Fiol wrote:
> Hi, I finally decided on the PyTables approach because it will be
> easier later to work with the data. Now, I know this is not the right
> place, but maybe I can get some quick pointers. I've calculated a
> numpy array of about 20 columns and a few thousand[…]
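A minimal sketch of what an appendable 20-column array could look like in PyTables (only the ~20 columns come from the post; the filename, float atom, and chunk sizes are my assumptions):

```python
import numpy as np
import tables

# An EArray is extendable along the dimension given as 0 in `shape`.
with tables.open_file('results.h5', mode='w') as h5:
    ea = h5.create_earray(h5.root, 'data',
                          atom=tables.Float64Atom(),
                          shape=(0, 20))            # rows grow on append
    for _ in range(4):                              # e.g. one chunk per run
        ea.append(np.random.rand(500, 20))

with tables.open_file('results.h5') as h5:
    final_shape = h5.root.data.shape                # all chunks, one array
```

Each `append` writes the chunk straight to disk, so memory use stays bounded by the chunk size rather than the full dataset.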

Re: [Numpy-discussion] saving incrementally numpy arrays

2009-08-12 Thread Juan Fiol
> From: Citi, Luca
> Subject: Re: [Numpy-discussion] saving incrementally numpy arrays
> To: "Discussion of Numerical Python"
> Date: Tuesday, August 11, 2009, 9:26 PM
> You can do something a bit tricky but possibly working.
> I made the assumption of a C-ordered[…]

Re: [Numpy-discussion] saving incrementally numpy arrays

2009-08-11 Thread Citi, Luca
You can do something a bit tricky but possibly working. I made the assumption of a C-ordered 1d vector.

import numpy as np
import numpy.lib.format as fmt
# example of chunks
chunks = [np.arange(l) for l in range(5,10)]
# at the beginning
fp = open('myfile.npy', 'wb')
d = dict(descr=[…]
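The excerpt cuts off mid-dict; here is a runnable reconstruction of the same trick (the placeholder shape and the final seek-and-rewrite step are how I'd complete it, not necessarily Luca's exact code):

```python
import numpy as np
import numpy.lib.format as fmt

# example of chunks, as in the post
chunks = [np.arange(l) for l in range(5, 10)]

fp = open('myfile.npy', 'wb')
# Write a header with a placeholder shape. The .npy header is padded to
# an alignment boundary, so rewriting it later with the true length
# (a string of similar size) does not shift the data that follows.
d = dict(descr=fmt.dtype_to_descr(chunks[0].dtype),
         fortran_order=False,
         shape=(2**30,))                  # placeholder length (assumption)
fmt.write_array_header_1_0(fp, d)
header_len = fp.tell()

total = 0
for chunk in chunks:
    fp.write(chunk.tobytes())             # raw C-ordered data, no per-chunk header
    total += chunk.size

fp.seek(0)                                # go back and fix the shape
d['shape'] = (total,)
fmt.write_array_header_1_0(fp, d)
assert fp.tell() == header_len            # header size must not have changed
fp.close()

merged = np.load('myfile.npy')            # one array holding all chunks
```

The trick only works for a fixed dtype and C order, and the rewritten header must pad to the same length as the placeholder one, which the assertion checks.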

Re: [Numpy-discussion] saving incrementally numpy arrays

2009-08-11 Thread Juan Fiol
[…] = np.load(fi)
y1 = np.load(fi)
fi.close()
#---

--- On Tue, 8/11/09, Juan Fiol wrote:
> From: Juan Fiol
> Subject: Re: [Numpy-discussion] saving incrementally numpy arrays
> To: "Discussion of Numerical Python"
> Date: Tuesday, August 11, 2009, 8:28 PM
> Hi, th[…]
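The fragment above is the reading half; a complete round trip of the simplest approach looks like this (several np.save calls into one open file, read back with matching np.load calls; the filename and array contents are my assumptions):

```python
import numpy as np

x1 = np.arange(3)
y1 = np.linspace(0.0, 1.0, 4)

fo = open('incremental.npy', 'wb')
np.save(fo, x1)        # each save writes its own self-describing header,
np.save(fo, y1)        # so arrays can simply be appended back to back
fo.close()

fi = open('incremental.npy', 'rb')
a1 = np.load(fi)       # reads arrays back in the order they were saved
b1 = np.load(fi)
fi.close()
```

The cost relative to the header-rewriting trick is that you get many small arrays back rather than one merged array, so the reader must loop and concatenate.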

Re: [Numpy-discussion] saving incrementally numpy arrays

2009-08-11 Thread Juan Fiol
[…]les. Thanks and best regards, Juan

--- On Tue, 8/11/09, Keith Goodman wrote:
> From: Keith Goodman
> Subject: Re: [Numpy-discussion] saving incrementally numpy arrays
> To: "Discussion of Numerical Python"
> Date: Tuesday, August 11, 2009, 7:46 PM
> On Tue, Aug 11, 2009 at[…]

Re: [Numpy-discussion] saving incrementally numpy arrays

2009-08-11 Thread Keith Goodman
On Tue, Aug 11, 2009 at 11:05 AM, Robert Kern wrote:
> On Mon, Aug 10, 2009 at 22:29, Juan Fiol wrote:
>> Hi, I am creating numpy arrays in chunks and I want to save the chunks while
>> my program creates them. I tried to use numpy.save but it failed (because it
>> is not intended to append data)[…]

Re: [Numpy-discussion] saving incrementally numpy arrays

2009-08-11 Thread Robert Kern
On Mon, Aug 10, 2009 at 22:29, Juan Fiol wrote:
> Hi, I am creating numpy arrays in chunks and I want to save the chunks while
> my program creates them. I tried to use numpy.save but it failed (because it
> is not intended to append data). I'd like to know what is, in your opinion,
> the best way[…]

Re: [Numpy-discussion] saving incrementally numpy arrays

2009-08-10 Thread Kim Hansen
I have had some similar challenges in my work, and here appending the numpy arrays to HDF5 files using PyTables has been the solution for me - that, used in combination with LZO compression/decompression, has led to very high read/write performance in my application with low memory consumption. Y[…]
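A sketch of the append-with-compression setup Kim describes. I use zlib, which PyTables always ships with; swap in complib='lzo' when the optional LZO libraries are installed. The filename, atom type, and compression level are my assumptions:

```python
import numpy as np
import tables

# Filters apply transparently to everything written through this node.
filters = tables.Filters(complib='zlib', complevel=5, shuffle=True)

with tables.open_file('compressed.h5', mode='w') as h5:
    ea = h5.create_earray(h5.root, 'data', atom=tables.Int64Atom(),
                          shape=(0,), filters=filters)
    for start in range(0, 3000, 1000):
        ea.append(np.arange(start, start + 1000))   # append chunk by chunk

with tables.open_file('compressed.h5') as h5:
    total = int(h5.root.data[:].sum())              # decompression is implicit
```

Because compression happens per HDF5 chunk, appends stay cheap and reads decompress only the slices actually requested.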

Re: [Numpy-discussion] saving incrementally numpy arrays

2009-08-10 Thread David Warde-Farley
On 10-Aug-09, at 11:29 PM, Juan Fiol wrote:
> Hi, I am creating numpy arrays in chunks and I want to save the
> chunks while my program creates them. I tried to use numpy.save but
> it failed (because it is not intended to append data). I'd like to
> know what is, in your opinion, the best[…]

[Numpy-discussion] saving incrementally numpy arrays

2009-08-10 Thread Juan Fiol
Hi, I am creating numpy arrays in chunks and I want to save the chunks while my program creates them. I tried to use numpy.save but it failed (because it is not intended to append data). I'd like to know what is, in your opinion, the best way to go. I will put a few thousand every time but buil[…]