Martin Spacek wrote:
> Would it be better to load the file one
> frame at a time, generating nframes arrays of shape (height, width),
> and sticking them consecutively in a python list?
I just tried this, and it works. Looks like it's all in physical RAM (no
disk thrashing on the 2GB machine),
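A minimal sketch of that frame-at-a-time approach, assuming raw uint8
frames packed back to back in the file (the filename and dimensions here
are illustrative, not from the original post):

    import numpy as np

    # Illustrative dimensions -- the real movie file defines these.
    nframes, height, width = 6500, 480, 640
    framesize = height * width  # bytes per uint8 frame

    frames = []
    with open("movie.bin", "rb") as f:
        for _ in range(nframes):
            # np.fromfile advances the file position, so each call
            # reads exactly one frame's worth of bytes.
            frame = np.fromfile(f, dtype=np.uint8, count=framesize)
            frames.append(frame.reshape(height, width))

Each list element is an ordinary (height, width) array, so no single
1.3 GB contiguous allocation is ever requested.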
Christopher Barker wrote:
> Robert Kern wrote:
>> "Official" binaries intended for distribution from scipy.org or scipy.sf.net
>> should not be linked against FFTW or UMFPACK since they are GPLed.
>
> Does that apply to binaries put up on pythonmac.org? It would be nice to
> have a "complete" ve
Robert Kern wrote:
> I would prefer that we use the Python binary from www.python.org. That should
> work on 10.3.9+.
+1 -- there have always been enough issues with Apple's python that it's
best to just use another version -- one binary for > 10.3.9 is the way
to go.
> "Official" binaries inte
Kurt Smith wrote:
> You might try numpy.memmap -- others have had success with it for
> large files (32 bit should be able to handle a 1.3 GB file, AFAIK).
Yeah, I looked into numpy.memmap. Two issues with that. I need to
eliminate as much disk access as possible while my app is running. I'm
d
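For reference, a sketch of what the memmap approach being weighed here
would look like (filename and dimensions are illustrative):

    import numpy as np

    nframes, height, width = 6500, 480, 640  # illustrative dimensions

    # mode="r" maps the existing file read-only; nothing is read from
    # disk until a slice is actually touched, so opening is cheap but
    # each first access costs a disk read -- exactly the run-time disk
    # access the poster wants to avoid.
    data = np.memmap("movie.bin", dtype=np.uint8, mode="r",
                     shape=(nframes, height, width))

    frame = np.array(data[1000])  # force a copy of one frame into RAM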
On 11/30/07, Robert Kern <[EMAIL PROTECTED]> wrote:
> Barry Wark wrote:
>
> > Some remaining issues:
> > - which SDK to build against. Leopard ships with a Python build
> > against the 10.5 SDK. It would be much easier, at least initially, for
> > us to produce binaries against the Leopard Python 2.5.
Barry Wark wrote:
> Some remaining issues:
> - which SDK to build against. Leopard ships with a Python build
> against the 10.5 SDK. It would be much easier, at least initially, for
> us to produce binaries against the Leopard Python 2.5.
I would prefer that we use the Python binary from www.python.org. That should
work on 10.3.9+.
>
> Well, one thing you could do is dump your data into a PyTables_
> ``CArray`` dataset, which you may afterwards access as if it were a
> NumPy array to get slices which are actually NumPy arrays. PyTables
> datasets have no problem working with datasets exceeding memory size.
> For instance
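A sketch of that workflow using the current PyTables API; the file name,
node name, and dimensions are illustrative, and the zero-filled write
loop stands in for real frame data:

    import numpy as np
    import tables

    nframes, height, width = 6500, 480, 640  # illustrative

    # Dump the data into a chunked CArray; chunks are read on demand,
    # so the dataset can be far larger than RAM.
    with tables.open_file("movie.h5", mode="w") as h5:
        carr = h5.create_carray(h5.root, "frames",
                                atom=tables.UInt8Atom(),
                                shape=(nframes, height, width))
        for i in range(nframes):
            carr[i] = np.zeros((height, width), dtype=np.uint8)

    # Slicing a CArray returns a real in-memory NumPy array.
    with tables.open_file("movie.h5", mode="r") as h5:
        frame = h5.root.frames[1000]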
I was misinformed about the status of numdisplay's pages. The package
is available as both part of stsci_python and independently, and its
(up-to-date) home page is here:
http://stsdas.stsci.edu/numdisplay/
Googling numdisplay finds that page.
My apologies to those inconvenienced by my prior post.
Martin Spacek (on 2007-11-30 at 00:47:41 -0800) said::
>[...]
> I find that if I load the file in two pieces into two arrays, say 1GB
> and 0.3GB respectively, I can avoid the memory error. So it seems that
> it's not that windows can't allocate the memory, just that it can't
> allocate enough
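A sketch of that two-piece workaround. It helps because each ndarray
needs one contiguous block of address space, and two smaller blocks are
easier to find in a fragmented 32-bit address space than a single
1.3 GB one (filename and split size are illustrative):

    import numpy as np

    gig = 1024**3
    with open("movie.bin", "rb") as f:
        part1 = np.fromfile(f, dtype=np.uint8, count=gig)  # first ~1 GB
        part2 = np.fromfile(f, dtype=np.uint8, count=-1)   # the rest (~0.3 GB)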
On Nov 30, 2007 2:47 AM, Martin Spacek <[EMAIL PROTECTED]> wrote:
> I need to load a 1.3GB binary file entirely into a single numpy.uint8
> array. I've been using numpy.fromfile(), but for files > 1.2GB on my
> win32 machine, I get a memory error. Actually, since I have several
> other python modules imported at the same time, including pygame, I get
> a "pygame parachute"
On 11/29/07, David Cournapeau <[EMAIL PROTECTED]> wrote:
> Barry Wark wrote:
> > Using the gfortran from http://r.research.att.com/tools/, it's trivial
> > to build a universal build from source. The instructions on scipy.org
> > won't lead you astray.
> >
> > I will ask around at work. Perhaps we
The numpy array is created using PyArray_SimpleNewFromData(). From
the Guide to NumPy, "[y]ou should ensure that the provided memory is
not freed while the returned array is in existence." That is, numpy
does not try to deallocate the memory when the ndarray object is
destroyed, but it al
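The same ownership contract is visible from pure Python with
numpy.frombuffer, which likewise wraps memory the array does not own; a
small illustration (not the poster's code):

    import numpy as np

    buf = bytearray(b"\x00" * 16)            # memory numpy does not own
    arr = np.frombuffer(buf, dtype=np.uint8)

    # The array is a view: numpy records the buffer as arr.base rather
    # than copying or taking ownership of the memory.
    assert arr.base is not None

    # At the Python level arr.base keeps the buffer alive automatically.
    # With PyArray_SimpleNewFromData at the C level no such reference is
    # recorded: the caller must guarantee the memory outlives the array.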
Even just a build of the last stable version will do it. Most people
(especially those who don't want to go through the hassle of compiling)
are going to be perfectly happy with a binary of the latest release.
Thanks!
Barry Wark wrote:
> Using the gfortran from http://r.research.att.com/tools/,
On Nov 30, 2007 5:16 AM, Bill Spotz <[EMAIL PROTECTED]> wrote:
> I have just committed the latest version of numpy.i (a swig interface
> file for bridging between C arrays and numerical python) to the numpy
> svn repository. There are three relatively new features that are now
> supported:
>
> * I
On Fri, 30 Nov 2007 09:48:09 +0100
Robert Cimrman <[EMAIL PROTECTED]> wrote:
> Nils Wagner wrote:
>> Thank you for your note. It works fine for me with
>> python2.5. However python2.3 results in
>>
>> ./gendocs.py -m 'scipy.linsolve.umfpack'
>> Traceback (most recent call last):
>>   File "./gendocs.py", line 261, in ?
Nils Wagner wrote:
> Thank you for your note. It works fine for me with
> python2.5. However python2.3 results in
>
> ./gendocs.py -m 'scipy.linsolve.umfpack'
> Traceback (most recent call last):
>   File "./gendocs.py", line 261, in ?
>     main()
>   File "./gendocs.py", line 207, in main
>
I need to load a 1.3GB binary file entirely into a single numpy.uint8
array. I've been using numpy.fromfile(), but for files > 1.2GB on my
win32 machine, I get a memory error. Actually, since I have several
other python modules imported at the same time, including pygame, I get
a "pygame parachute"