On Sun, 1 Mar 2009 14:29:54 -0500, Michael Gilbert wrote:
> i will send the current version to the list tomorrow when i have access
> to the system that it is on.
attached is my current version of loadtxt. like i said, it's slower
for small data sets (because it reads through the whole data file
once up front to count rows before allocating the output array).
Gideon Simpson wrote:
> So I have some data sets of about 10^6 floating point numbers stored
> in text files. I find that loadtxt is rather slow. Is this to be
> expected? Would it be faster if it were loading binary data?
Depending on the format you may be able to use numpy.fromfile, which
should be substantially faster than loadtxt.
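For example, a sketch of both uses (file names are hypothetical; fromfile with a `sep` argument parses simple separated text, but with none of loadtxt's comment or column handling):

```python
import numpy as np

data = np.arange(5, dtype=np.float64)

# Simple whitespace-separated text: fromfile can parse it directly,
# skipping most of loadtxt's per-line Python overhead.
data.tofile("data.txt", sep=" ")
from_text = np.fromfile("data.txt", sep=" ")

# Raw binary via ndarray.tofile: fastest, but the file carries no
# shape or dtype metadata, so both must be known when reading back.
data.tofile("data.bin")
from_bin = np.fromfile("data.bin", dtype=np.float64)
```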
On Sun, Mar 1, 2009 at 11:29 AM, Michael Gilbert
wrote:
> On Sun, 1 Mar 2009 16:12:14 -0500 Gideon Simpson wrote:
>
>> So I have some data sets of about 10^6 floating point numbers stored
>> in text files. I find that loadtxt is rather slow. Is this to be
>> expected? Would it be faster if it were loading binary data?
On Sun, 1 Mar 2009 14:29:54 -0500 Michael Gilbert wrote:
> i have rewritten loadtxt to be smarter about allocating memory, but
> it is slower overall and doesn't support all of the original
> arguments/options (yet).
i had meant to say that my version is slower for smaller data sets (when
you are reading a small file, the extra pass through the data costs more
than it saves).
On Sun, 1 Mar 2009 16:12:14 -0500 Gideon Simpson wrote:
> So I have some data sets of about 10^6 floating point numbers stored
> in text files. I find that loadtxt is rather slow. Is this to be
> expected? Would it be faster if it were loading binary data?
i have run into this as well.
On Sun, Mar 1, 2009 at 15:12, Gideon Simpson wrote:
> So I have some data sets of about 10^6 floating point numbers stored
> in text files. I find that loadtxt is rather slow. Is this to be
> expected?
Probably. You don't say exactly what you mean by "slow", so it's
difficult to tell. But text parsing is inherently expensive, so yes,
loading binary data would be faster.
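To put a number on "slow", a quick timing sketch (file names hypothetical; 10^5 elements here to keep the demo fast, versus the ~10^6 in the data sets described):

```python
import time
import numpy as np

# Build a sample data set: text copy and binary .npy copy of the same array.
data = np.random.rand(100_000)
np.savetxt("sample.txt", data)
np.save("sample.npy", data)

t0 = time.perf_counter()
from_text = np.loadtxt("sample.txt")   # parses every value from text
t1 = time.perf_counter()
from_npy = np.load("sample.npy")       # reads raw binary, no parsing
t2 = time.perf_counter()

print(f"loadtxt: {t1 - t0:.3f}s   np.load: {t2 - t1:.3f}s")
```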
So I have some data sets of about 10^6 floating point numbers stored
in text files. I find that loadtxt is rather slow. Is this to be
expected? Would it be faster if it were loading binary data?
-gideon
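One way to answer the binary question above is a one-time conversion (file names hypothetical): parse the text file once, save it in NumPy's .npy format, and load that on every subsequent run.

```python
import numpy as np

# One-time setup for the sketch: a text file of floats.
np.savetxt("data.txt", np.random.rand(1000))

data = np.loadtxt("data.txt")   # slow: parses text every time
np.save("data.npy", data)       # .npy stores shape and dtype too

fast = np.load("data.npy")      # fast: no text parsing at all
```

Unlike raw tofile/fromfile binary, the .npy format is self-describing, so the reader does not need to know the shape or dtype in advance.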
___
Numpy-discussion mailing list