On Tue, Sep 8, 2009 at 6:41 PM, Charles R Harris wrote:
>
> More precisely, 2GB for windows and 3GB for (non-PAE enabled) linux.

And just to further clarify, even with PAE enabled on linux, any
individual process has about a 3 GB address limit (there are hacks to
raise that to 3.5 or 4 GB, but ...
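For readers skimming the digest, a rough sketch (not code from the thread; the shape below is made up for the example) of why a 32-bit process can raise MemoryError well before physical RAM runs out:

```python
# Illustration only: the per-process virtual address space on a 32-bit build
# is roughly 2 GB on Windows and 3 GB on Linux, and a numpy array also needs
# one contiguous block inside it. The shape here is a hypothetical example.
import numpy as np

shape = (256, 2_000_000)                      # hypothetical acquisition block
nbytes = shape[0] * shape[1] * np.dtype(np.int16).itemsize
print(f"one block needs {nbytes / 2**30:.2f} GiB")

try:
    data1 = np.zeros(shape, dtype=np.int16)
    data2 = np.zeros(shape, dtype=np.int16)  # a second block of the same size
except MemoryError:
    # On a 32-bit interpreter the second allocation (or even the first, with
    # a fragmented address space) commonly fails.
    print("MemoryError: per-process address space exhausted")
```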
Kim Hansen wrote:
>
> On 9-Sep-09, at 4:48 AM, Francesc Alted wrote:
>
> > Yes, the latter is supported in PyTables as long as the underlying
> > filesystem supports files > 2 GB, which is very common in modern
> > operating systems.
>
> I think the OP said he was on Win32 ...
On 9-Sep-09, at 4:48 AM, Francesc Alted wrote:

> Yes, the latter is supported in PyTables as long as the underlying
> filesystem supports files > 2 GB, which is very common in modern
> operating systems.

I think the OP said he was on Win32, in which case it should be noted:
FAT32 has its upper limit on file size (4 GB per file) ...
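To make the PyTables suggestion concrete, here is a minimal sketch (not from the thread; the file name, node name, and shapes are invented), assuming a reasonably recent PyTables where the calls are spelled open_file/create_earray (the 2009-era API used openFile/createEArray). The on-disk array can grow far past 2 GB one chunk at a time:

```python
# An extendable on-disk array: only each appended block has to fit in memory.
import numpy as np
import tables

with tables.open_file("measurement.h5", mode="w") as h5:
    # A 0 in the first dimension marks the axis the array may grow along.
    earr = h5.create_earray(h5.root, "data",
                            atom=tables.Int16Atom(),
                            shape=(0, 200),
                            expectedrows=1_000_000)
    for _ in range(10):
        block = np.zeros((256, 200), dtype=np.int16)  # stand-in for one acquisition
        earr.append(block)

# Later, slices come back as ordinary numpy arrays without loading the file:
with tables.open_file("measurement.h5", mode="r") as h5:
    first_rows = h5.root.data[:256]        # numpy.ndarray of shape (256, 200)
```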
On Wednesday 09 September 2009 10:48:48, Francesc Alted wrote:
> OTOH, having the possibility to manage compressed data buffers
> transparently in NumPy would help here, but we are not there yet ;-)

Now that I think about it, in case the data is compressible, Daniel could try
to define a compressed PyTables array ...
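A sketch of what that could look like in practice (again not from the thread; the compressor and its parameters are arbitrary choices, zlib being one that PyTables has always shipped):

```python
# Store the int16 blocks in a chunked, zlib-compressed EArray so compressible
# data takes much less disk space and I/O. complevel/complib are example values.
import numpy as np
import tables

filters = tables.Filters(complevel=5, complib="zlib")   # shuffle filter is on by default
with tables.open_file("measurement_compressed.h5", mode="w") as h5:
    earr = h5.create_earray(h5.root, "data",
                            atom=tables.Int16Atom(),
                            shape=(0, 200),
                            filters=filters)
    earr.append(np.zeros((256, 200), dtype=np.int16))   # zeros compress extremely well
```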
On Wednesday 09 September 2009 07:22:33, David Cournapeau wrote:
> On Wed, Sep 9, 2009 at 2:10 PM, Sebastian Haase wrote:
> > Hi,
> > you can probably use PyTables for this. Even though it's meant to
> > save/load data to/from disk (in HDF5 format) as far as I understand,
> > it can be used to make your task solvable - even on a 32bit system !!
On Wed, Sep 9, 2009 at 2:10 PM, Sebastian Haase wrote:
> Hi,
> you can probably use PyTables for this. Even though it's meant to
> save/load data to/from disk (in HDF5 format) as far as I understand,
> it can be used to make your task solvable - even on a 32bit system !!
> It's free (pytables.org) -- so maybe you can try it out and tell me if
> I'm right ...
Hi,
you can probably use PyTables for this. Even though it's meant to
save/load data to/from disk (in HDF5 format) as far as I understand,
it can be used to make your task solvable - even on a 32bit system !!
It's free (pytables.org) -- so maybe you can try it out and tell me if
I'm right ...
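As an illustration of the "solvable even on a 32-bit system" claim, a hedged sketch (the file and node names carry over from the earlier sketches and are invented) that processes the on-disk array in slices, so no single allocation comes anywhere near the address-space limit:

```python
# Out-of-core processing: the full dataset stays on disk and only one slice
# at a time is materialized as a numpy array.
import numpy as np
import tables

with tables.open_file("measurement.h5", mode="r") as h5:
    data = h5.root.data                    # on-disk EArray, not loaded into RAM
    total = 0.0
    for start in range(0, data.nrows, 4096):
        chunk = data[start:start + 4096]   # only this slice becomes a numpy array
        total += chunk.sum(dtype=np.float64)
    print("sum of all samples:", total)
```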
Daniel Platz wrote:
> data1 = numpy.zeros((256,200), dtype=numpy.int16)
> data2 = numpy.zeros((256,200), dtype=numpy.int16)
>
> This works for the first array data1. However, it returns with a
> memory error for array data2. I have read somewhere that there is a
> 2GB limit for numpy arrays on a 32 bit machine ...
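For completeness, a small sketch (not from the thread) showing how to check array footprints directly with numpy; the shapes mirror the quoted snippet, and the point is that every live array and temporary must fit into the same per-process budget:

```python
# numpy reports exactly how much memory each array holds, which makes it easy
# to see how close the process is getting to the ~2 GB ceiling discussed above.
import numpy as np

data1 = np.zeros((256, 200), dtype=np.int16)
data2 = np.zeros((256, 200), dtype=np.int16)

for name, arr in (("data1", data1), ("data2", data2)):
    print(f"{name}: {arr.nbytes} bytes ({arr.nbytes / 2**20:.2f} MiB)")

# The relevant number is the running total across all live arrays:
print("total:", data1.nbytes + data2.nbytes, "bytes")
```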
On Tue, Sep 8, 2009 at 7:30 PM, Daniel Platz
<mail.to.daniel.pl...@googlemail.com> wrote:
> Hi,
>
> I have a numpy newbie question. I want to store a huge amount of data
> in an array. This data comes from a measurement setup, and I want to
> write it to disk later since there is nearly no time for this during
> the measurement ...
On Wed, Sep 9, 2009 at 9:30 AM, Daniel Platz wrote:
> Hi,
>
> I have a numpy newbie question. I want to store a huge amount of data
> in an array. This data comes from a measurement setup, and I want to
> write it to disk later since there is nearly no time for this during
> the measurement. To put some numbers up: ...
Hi,
I have a numpy newbie question. I want to store a huge amount of data
in an array. This data comes from a measurement setup, and I want to
write it to disk later since there is nearly no time for this during
the measurement. To put some numbers up: I have 2*256*200 int16
numbers which I want to ...
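A final hedged sketch of the pattern the question describes (this is not an answer given in these previews; the names, shapes, and the choice of numpy's .npy format are assumptions): acquire into memory while the measurement runs, and write to disk only afterwards:

```python
# Keep acquisition itself in memory, where it is fast, and defer the disk
# write until after the measurement. All names and shapes are invented.
import numpy as np

N_BLOCKS, BLOCK_SHAPE = 100, (256, 200)

def acquire_block(shape):
    """Placeholder for one read from the measurement hardware."""
    return np.zeros(shape, dtype=np.int16)

# During the measurement: only cheap in-memory appends.
blocks = [acquire_block(BLOCK_SHAPE) for _ in range(N_BLOCKS)]

# After the measurement: a single write to disk (numpy's own .npy format here;
# the PyTables/HDF5 route from the earlier sketches works the same way).
np.save("measurement_run.npy", np.stack(blocks))
```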