Visual Studio C, but this gives undefined symbols from 'libnpymath.a'
> like this:
>
This is not really supported. You should avoid mixing compilers when
building C extensions using the numpy C API: either all mingw, or all MSVC.
David
>
> npymath.lib(npy_math.o) : error LNK2
://github.com/numpy/numpy/blob/master/numpy/core/setup.py#L638
David
On Sat, Jan 31, 2015 at 9:53 PM, Sebastien Gouezel <
sebastien.goue...@univ-rennes1.fr> wrote:
> Dear all,
>
> I tried to use numpy (version 1.9.1, installed by `pip install numpy`)
> on cygwin64. I encount
copy?
Thanks,
David
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
Sebastian Berg sipsolutions.net> writes:
>
> Python has a mechanism both for getting an item and for setting an item.
> The latter will end up doing this (python already does this for us):
> x[:,d,:,d] = x[:,d,:,d] + 1
> so there is an item assignment going on (__setitem__ not __getitem__)
>
> -
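The get-then-set sequence described above can be observed with a small pure-Python sketch (the `Tracker` class is purely illustrative, not numpy code):

```python
# Minimal sketch: x[i] = x[i] + 1 evaluates the right-hand side first
# (__getitem__), then performs the assignment (__setitem__).
class Tracker:
    def __init__(self):
        self.data = [0, 0, 0]
        self.calls = []

    def __getitem__(self, index):
        self.calls.append("get")
        return self.data[index]

    def __setitem__(self, index, value):
        self.calls.append("set")
        self.data[index] = value

x = Tracker()
x[1] = x[1] + 1
print(x.calls)   # the get happens before the set
print(x.data)
```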
I'll be there as well, though I am still figuring out when exactly.
On Wed, Feb 18, 2015 at 1:07 AM, Nathaniel Smith wrote:
> Hi all,
>
> It looks like I'll be at PyCon this year. Anyone else? Any interest in
> organizing a numpy sprint?
>
> -n
>
> --
> Nathaniel J. Smith -- http://vorpus.org
>
>
I concur on that: for the 350+ packages we support at Enthought, cmake has
been a higher pain point than any other build tool (including custom
ones). And we only support mainstream platforms.
But the real question for me is what Visual Studio support means. Does
it really mean solution files?
David
on how much it lets us simplify things, I guess. Would we get to
> remove all the no-export attributes on everything?
>
No, the whole point of the no-export is to support the separate compilation
use case.
David
> On Apr 3, 2015 8:01 PM, "Charles R Harris"
> wrote:
>
>
A is C-style and B fortran-style.
>
Does your implementation use BLAS, or is it just a wrapper around einsum?
David
IMO, this really raises the question of whether we still want to use
SourceForge at all. At this point I just don't trust the service
anymore.
Could we use some resources (e.g. Rackspace?) to host those files? Do we
know how much traffic they get, to estimate the cost?
David
On Thu
In any case I've always been surprised that NumPy is distributed
> through SourceForge, which has been sketchy for years now. Could it
> simply be hosted on PyPI?
>
They don't accept arbitrary binaries like SF does, and some of our
installer formats can't be uploade
Sorry if that's obvious, but do you have Visual Studio 2010 installed?
On Thu, Aug 6, 2015 at 11:17 PM, Charles R Harris wrote:
> Anyone know how to fix this? I've run into it before and never got it
> figured out.
>
> [192.168.121.189:22] out: File
> "C:\Python34\lib\distutils\msvc9compiler.
Charles R Harris <
> charlesr.har...@gmail.com> wrote:
>
>>
>>
>> On Thu, Aug 6, 2015 at 4:22 PM, David Cournapeau
>> wrote:
>>
>>> Sorry if that's obvious, but do you have Visual Studio 2010 installed ?
>>>
>>> On Th
end ?
David
On Tue, Aug 18, 2015 at 9:07 PM, Charles R Harris wrote:
> Hi All,
> I'm bringing up this topic again on account of the discussion at
> https://github.com/numpy/numpy/pull/6199. The proposal is to stop
> (trying) to support the Bento build system for Numpy
On Wed, Aug 19, 2015 at 1:22 AM, Nathaniel Smith wrote:
> On Tue, Aug 18, 2015 at 4:15 PM, David Cournapeau
> wrote:
> > If everybody wants to remove bento, we should remove it.
>
> FWIW, I don't really have an opinion either way on bento versus
> distutils, I j
team to
improve this? Or was that considered acceptable with current Cython for
numpy? I am convinced that cleanly separating the low-level parts from the
Python C API plumbing would be the single most important thing one could do
to make the codebase more approachable.
David
On Tue, Aug 25, 2015 at 9
There are < 60 methods in the table, and most of them should be fairly
straightforward to cythonize. At worst, we could just keep them as is
outside Cython and just "export" them in Cython.
Does that sound like an acceptable plan?
If so, I will star
On Tue, Sep 1, 2015 at 8:16 AM, Nathaniel Smith wrote:
> On Sun, Aug 30, 2015 at 2:44 PM, David Cournapeau
> wrote:
> > Hi there,
> >
> > Reading Nathaniel summary from the numpy dev meeting, it looks like
> there is
> > a consensus on using cython in
e sources for a given tag is
confusing.
David
>
> Cheers,
>
> Matthew
a python with all libraries static linked. Here is the
environment:
iOS 8.0+
Python 3.4
PyQt 5.5
Qt 5.5
pyqtdeploy
Any help getting NumPy compiled into the iOS app?
Thank you,
David
iculable, concrete points
> of concern? Otherwise this just seems like idle, non-constructive
> speculation (at best).
>
There is ample precedent for such things happening in OSS, so I think
that's a fair concern, even if it has not happened to numpy yet.
David
onsider along the way is separating numpy.multiarray
> and friends into an actual library plus a module. That way the new numpy
> api would be exposed in the library rather than by importing an array of
> pointers from the module.
>
Agreed.
This would help the cythonizing process a
cannot fully, nor do you
> have to, I will let it stand as is after this and let others take over
> from here (after this, probably whatever Chuck says is good). [1]
>
> More to the point of the actual members:
>
> So to say, I feel the council members have to try to be *directly
ommon functionalities (aka the
'core' library) into its own static library. The API would be considered
private to numpy (no stability guaranteed outside numpy), and every
exported symbol from that library would be decorated appropriately to avoid
potential clashes (e.g. '_npy_i
On Tue, Oct 6, 2015 at 12:07 PM, Antoine Pitrou wrote:
> On Tue, 6 Oct 2015 11:00:30 +0100
> David Cournapeau wrote:
> >
> > Assuming one of the rumour is related to some comments I made some time
> > (years ?) earlier, the context was the ability to hide exported sy
On Tue, Oct 6, 2015 at 5:44 PM, Nathaniel Smith wrote:
> On Tue, Oct 6, 2015 at 4:46 AM, David Cournapeau
> wrote:
> > The npy_ functions in npymath were designed to be exported. Those would
> stay
> > that way.
>
> If we want to export these then I vote that we ei
On Tue, Oct 6, 2015 at 5:51 PM, David Cournapeau wrote:
>
>
> On Tue, Oct 6, 2015 at 5:44 PM, Nathaniel Smith wrote:
>
>> On Tue, Oct 6, 2015 at 4:46 AM, David Cournapeau
>> wrote:
>> > The npy_ functions in npymath were designed to be exported. Those would
&
On Tue, Oct 6, 2015 at 5:58 PM, Nathaniel Smith wrote:
> On Tue, Oct 6, 2015 at 9:51 AM, David Cournapeau
> wrote:
> >
> > On Tue, Oct 6, 2015 at 5:44 PM, Nathaniel Smith wrote:
> >>
> >> On Tue, Oct 6, 2015 at 4:46 AM, David Cournapeau
> >> wro
has worked
fairly well and has been used in at least scipy since the 1.4/1.5 days IIRC
(including windows).
David
>
> > And, of course, we would also benefit from the CBLAS functions (or any
> > kind of C wrappers around them) :-)
> > https://github.com/numpy/numpy/issues/63
On Tue, Oct 6, 2015 at 6:14 PM, Nathaniel Smith wrote:
> On Tue, Oct 6, 2015 at 10:10 AM, David Cournapeau
> wrote:
> >
> >
> > On Tue, Oct 6, 2015 at 6:07 PM, Nathaniel Smith wrote:
> >>
> >> On Tue, Oct 6, 2015 at 10:00 AM, Antoine Pitrou
> >&
On Tue, Oct 6, 2015 at 6:18 PM, David Cournapeau wrote:
>
>
> On Tue, Oct 6, 2015 at 6:14 PM, Nathaniel Smith wrote:
>
>> On Tue, Oct 6, 2015 at 10:10 AM, David Cournapeau
>> wrote:
>> >
>> >
>> > On Tue, Oct 6, 2015 at 6:07 PM, Nathaniel Smi
On Tue, Oct 6, 2015 at 7:30 PM, Nathaniel Smith wrote:
> [splitting this off into a new thread]
>
> On Tue, Oct 6, 2015 at 3:00 AM, David Cournapeau
> wrote:
> [...]
> > I also agree the current situation is not sustainable -- as we discussed
> > privately before, cyt
On Tue, Oct 6, 2015 at 8:04 PM, Nathaniel Smith wrote:
> On Tue, Oct 6, 2015 at 11:52 AM, David Cournapeau
> wrote:
> >
> >
> > On Tue, Oct 6, 2015 at 7:30 PM, Nathaniel Smith wrote:
> >>
> >> [splitting this off into a new thread]
> >>
&g
On Thu, Oct 8, 2015 at 8:47 PM, Nathaniel Smith wrote:
> On Oct 8, 2015 06:30, "David Cournapeau" wrote:
> >
> [...]
> >
> > Separating the pure C code into static lib is the simple way of
> achieving the same goal. Essentially, you write:
&
ds, but for us at Enthought,
windows 32 bits is in the same ballpark as OS X and Linux (64 bits) in
terms of proportion, windows 64 bits being significantly more popular.
Linux 32 bits and OS X 32 bits have been in the 1 % range each of our
downloads for a while (we recently stopped support for bo
s. You will need a way to install things when
building a conda package in any case
David
> Is there any particular reason for not using it?
>
> On Tue, Oct 27, 2015 at 11:48 AM, James E.H. Turner
> wrote:
>
>> Apparently it is not well known that if you have a Python project
king into structured arrays. In case it is relevant: Are you
using 1.10? Structured arrays are apparently a LOT slower there than in
1.9.3, an issue which will be fixed in a future version.
David
f90wrap [1] extends the functionality of f2py, and can automatically
generate sensible wrappers for certain cases.
[1] https://github.com/jameskermode/f90wrap
On 15 July 2015 at 03:45, Sturla Molden wrote:
> Eric Firing wrote:
>
> > I'm curious: has anyone been looking into what it would take t
pywafo/blob/pipinstall/setup.py
[3] http://docs.scipy.org/doc/numpy/reference/distutils.html
Regards,
David
I would be in favour of dropping 3.3, but not 2.6 until it becomes too
cumbersome to support.
As a data point, as of April, 2.6 was downloaded more than all Python 3.x
versions together, according to PyPI numbers:
https://caremad.io/2015/04/a-year-of-pypi-downloads/
David
On Thu, Dec 3, 2015
On Fri, Dec 4, 2015 at 11:06 AM, Nathaniel Smith wrote:
> On Fri, Dec 4, 2015 at 1:27 AM, David Cournapeau
> wrote:
> > I would be in favour of dropping 3.3, but not 2.6 until it becomes too
> > cumbersome to support.
> >
> > As a data point, as of april, 2.6 was m
Thanks a lot for providing the example Sturla, that is exactly what we are
looking for!
On 4 December 2015 at 11:34, Sturla Molden wrote:
> On 03/12/15 22:07, David Verelst wrote:
>
> Can this workflow be incorporated into |setuptools|/|numpy.distutils|?
>> Something alo
that may have
been fixed since then.
David
> Anne
>
> On Fri, Dec 11, 2015, 16:46 Charles R Harris
> wrote:
>
>> On Fri, Dec 11, 2015 at 6:25 AM, Thomas Baruchel
>> wrote:
>>
>>> From time to time it is asked on forums how to extend precision of
>>>
how many people are running 32 bit Python on Windows these
> days??
>
I don't claim we are representative of the whole community, but as far as
canopy is concerned, it is still a significant platform. That's the only 32
bit platform we still support (both linu
ubt are directly
> used by the packages.
>
It is also a common problem when building packages without using a "clean"
build environment, as it is too easy to pick up dependencies accidentally,
especially for autotools-based packages (unless one uses pbuilder or
similar tools).
David
inux systems are now transitioning to C++11 which is binary
> incompatible in parts to the old standard. There a lot of testing is
> necessary to check if we are affected.
> How does Anaconda deal with C++11?
>
For canopy packages, we use the RH devtoolset w/ gcc 4.8.X, and staticall
On Mon, Jan 11, 2016 at 6:25 PM, Chris Barker wrote:
> On Fri, Jan 8, 2016 at 7:13 PM, Nathaniel Smith wrote:
>
>> > that this would potentially be able to let packages like numpy serve
>> their
>> > linux
>> > users better without risking too much junk being uploaded to PyPI.
>>
>> That will ne
uld) do, such as writing the expected
metadata in site-packages (PEP 376). Currently, conda does not recognize
packages installed by pip (because it does not implement PEP 376 and co),
so if you do a "pip install ." of a package, it will likely break existing
package if present.
David
what's used by distutils to build C extensions.
This is only valid on Unix/cygwin, if you are on windows, the process is
completely different.
David
>
> Thanks for your help,
>
> Christian
>
>
>
> This email and any attachments are intended solely for the use of the
&
gt;> well as just having a talk on Numpy at a PyData conference. In general
>> there are too few (if any) talks on Numpy and other core libraries at
>> PyData and Scipy confs I think.
>>
>
> +1.
>
> It would
ebra.
>
I would not worry too much about this: at worst, this gives us back the
situation we were in with the so-called superpacks, which have been
successful in the past at spreading numpy use on windows.
My main worry is whether this locks us into ATLAS for a long time because
of package dependin
problem dumping the array.
I am using NumPy v1.9.3
Any ideas on why this might be happening?
Thank you,
David
o in scipy, but it was a PITA to maintain. Contrary
to BLAS/LAPACK, FFT does not have a standard API, hence exposing a
consistent API in Python, including data layout, involved quite a bit of
work.
It is better to expose those through 3rd-party APIs.
David
> Sturla
>
>
What are the best ways to turn a JSON object into a CSV file or a pandas
DataFrame?
Looking forward to hearing from you.
Regards.
David
y (you can access numpy arrays directly as C arrays), very python like
syntax and amazing performance.
Good luck,
David
+1 from me.
If we really need some distribution on top of github/pypi, note that
bintray (https://bintray.com/) is free for OSS projects, and is a much
better experience than sourceforge.
David
On Sun, Oct 2, 2016 at 12:02 AM, Charles R Harris wrote:
> Hi All,
>
> Ralf has suggested
binary compatibility with centos 5.X
and above, though I am not sure about the impact on speed.
It has been quite some time already that building numpy/scipy with gcc 4.1
causes troubles with errors and even crashes anyway, so you definitely want
to use a more recent compiler in any case.
David
I am a bit suspicious about the whole thing, as neither conda's nor Gohlke's
wheel crashed. Has anybody else encountered this?
David
Indeed. I wrongly assumed that since Gohlke's wheels did not crash, they
did not run into that issue.
That sounds like an ABI issue, since I suspect the Intel math library
supports C99 complex numbers. I will add info to that issue then,
David
On Mon, Jan 23, 2017 at 11:46 AM, Evgeni Bur
;, or will it still significantly change (this requiring a
different, more stable one) ?
cheers,
David
Pauli Virtanen wrote:
> Mon, 2010-03-29 at 19:13 +0900, David Cournapeau wrote:
>> I have worked on porting scipy to py3k, and it is mostly working. One
>> thing which would be useful is to install something similar to
>> npy_3kcompat.h in numpy, so that every scipy exte
the -j option). It
would also avoid duplication in scipy.
cheers,
David
>> Is anyone able to reproduce this? I don't get 'nan' in either 1.4.0 or
>> 2.0.0.dev8313 (32 bit Mac OSX). In an earlier email T J reported using
>> 1.5.0.dev8106.
Having the config.h as well as the compilation options would be most
useful, to determine which function
logaddexp2(-1.5849625007211563, -53.584962500721154)
>> Out[1]: -1.5849625007211561
>>
>> In [2]: np.logaddexp2(-0.5849625007211563, -53.584962500721154)
hines, and we can try compiler gymnastics.
Yes, we could do that. I note that on glibc, the function called is an
intrinsic for log1p (FYL2XP1) if x is sufficiently small.
> Clearly the optimizing compiler is inserting the DRDB (drain David's
> battery) opcode.
:)
David
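The log1p route mentioned above can be sketched in pure Python (a naive reimplementation for illustration only; numpy's actual C implementation differs in detail):

```python
import math

def logaddexp2_sketch(x, y):
    """Compute log2(2**x + 2**y) without overflow, via log1p.

    Illustrative sketch of the technique discussed above, not numpy's code.
    """
    hi, lo = (x, y) if x >= y else (y, x)
    # 2**(lo - hi) <= 1, and log1p stays accurate when the difference is
    # large and 2**(lo - hi) is tiny, where log(1 + eps) would lose bits.
    return hi + math.log1p(2.0 ** (lo - hi)) / math.log(2.0)

print(logaddexp2_sketch(-1.5849625007211563, -53.584962500721154))
```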
Anne Archibald wrote:
> On 1 April 2010 03:15, David Cournapeau wrote:
>> Anne Archibald wrote:
>>
>>> Particularly given the comments in the boost source code, I'm leery of
>>> this fix; who knows what an optimizing compiler will do with it?
>> But
>>> np.version.version
'1.4.0'
>>> c = np.polynomial.chebyshev.Chebyshev(1)
>>> c.deriv(1.0)
Chebyshev([ 0.], [-1., 1.])
>>> c.integ(1.0)
Traceback (most recent call last):
File "", line 1, in
File "", line 441, in integ
File "C:\Python26\lib\site-packages\numpy\polynomial\chebyshev.py", li
n implementation,
David
On Thu, Apr 1, 2010 at 6:42 PM, David Goldsmith wrote:
> >>> np.version.version
> '1.4.0'
> >>> c = np.polynomial.chebyshev.Chebyshev(1)
> >>> c.deriv(1.0)
> Chebyshev([ 0.], [-1., 1.])
> >>> c.integ(1.0)
> Traceback (most re
On Fri, Apr 2, 2010 at 10:42 AM, Charles R Harris wrote:
>
> On Thu, Apr 1, 2010 at 7:42 PM, David Goldsmith
> wrote:
>
>> >>> np.version.version
>> '1.4.0'
>> >>> c = np.polynomial.chebyshev.Chebyshev(1)
>> >>> c.deriv(1.
On Fri, Apr 2, 2010 at 10:46 AM, Charles R Harris wrote:
> On Fri, Apr 2, 2010 at 11:27 AM, David Goldsmith
> wrote:
>
>> Also:
>>
>> >>> c.deriv(0)
>> Chebyshev([ 1.], [-1., 1.])
>> >>> c.integ(0)
>>
>> Traceback (most re
d.
I understand defmatrix was moved from core to matrixlib? Is there some
workaround
I could use? I might have to move my data between machines with either
version of
numpy installed in the future as well... I already tried some renaming
tricks, but to
no avail.
Thanks
David
The University of Edin
On Mon, Apr 5, 2010 at 8:40 AM, Charles R Harris
wrote:
> Hi All,
>
> David Cournapeau has mentioned that he would like to have a numpy math
> library that would supply missing functions and I'm wondering how we should
> organise the source code. Should we put a mathlib direc
sn't enough.
Cheers
David
On Sat, Apr 3, 2010 at 8:13 PM, David Reichert wrote:
> Hi,
>
> After some work I got an optimized numpy compiled on a machine where I
> don't
> have root access, but I had to use numpy 1.4.0 to make it work. Now I have
> the problem that I
>> >>
>> >> On Mon, Apr 5, 2010 at 10:56, Charles R Harris
>> >> wrote:
>> >> >
>> >> >
>> >> > On Mon, Apr 5, 2010 at 9:43 AM, Robert Kern
>> >> > wrote:
>> >> >>
>> >&
k it is much more worthwhile to think about reorganizing the
rest of numpy.core C code, the npymath library is very low hanging
fruit in comparison, if only by size.
David
ul (a
backtrace from gdb much more),
cheers,
David
numpy or python bug.
> Is it the problem with libc library?
Very unlikely, this looks like a ref count bug,
cheers,
David
umPy library code and the
> Python-specific interface to it? What other re-organization thoughts
> are you having David?
This is mainly it: reorganizing the code for clearer boundaries
between boilerplate (Python C API) and actual computational code.
Besides helping other python implementation
the problem. So I would prefer to see what happens with the same numpy
version built against the Red Hat python (2.6.1) before looking into
numpy proper. Given how simple your example is, it is quite unlikely
that there is a ref count bug that nobody encountered
x 1 output
I need to do this so I can iteratively build a matrix by adding new
columns. The problem is that sparse matrix constructors don't seem to
expect "0" as input for a dimension.
Thank you,
/David
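One common workaround is to avoid the zero-sized starting matrix entirely: collect the columns in a plain Python list and stack them once at the end. A sketch using dense numpy arrays (the original question concerned scipy.sparse; shapes here are made up):

```python
import numpy as np

# Sketch of a workaround: instead of growing an (n, 0) matrix column by
# column, accumulate the columns in a list and stack once at the end.
columns = []
for i in range(3):
    columns.append(np.arange(5) * i)   # each new column, shape (5,)

matrix = np.column_stack(columns)      # final shape (5, 3)
print(matrix.shape)
```

The same pattern works for sparse matrices with `scipy.sparse.hstack` on a list of column vectors.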
0.0.3 is hook support - the goal is to have an API robust enough
so that future toydist code will be based on hooks, as well as basic
support for waf- or scons-based C builds.
cheers,
David
unc support, although I am not sure how to plug this with
the C math library for math functions (cos, log, etc...)
Especially for indexing and broadcasting, even if your project
"fails", having a pure, reference python implementation would be
tremendously useful - in partic
this, please compare gcc -v against the version
reported by python (for example at the python prompt).
cheers,
David
mg script, but I would like
to know what became so big that numpy is now > 10 MB.
David
omes with issues which are difficult to track down
(stale files for entry points which are not removed by uninstall -u,
etc.). Those are non-issues for experienced users, but in my
experience a pain for beginners.
The easy and reliable solution for non-root installs is PYTHONPATH f
> for each package, and make sure you're careful about running more than
> one version of Python.
The beauty of --user is that you don't need PYTHONPATH, and it is
interpreter specific (at least if the package is correctly done).
PYTHONPATH is becoming a p
Hi Matt,
I don't think the memmap code supports this. However, you can stack memmaps
just as easily as arrays, so if you define individual memmaps for each slice
and stack them (numpy.vstack), the resulting array will behave as a regular
3D array.
HTH,
David H.
On Wed, Apr 21, 2010 at 3:
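A sketch of the idea (using `np.stack` here to get the extra axis; filenames and shapes are made up; note that stacking copies the data into a regular in-memory array):

```python
import os
import tempfile
import numpy as np

# Sketch: one memmap per 2-D slice, then stack into a 3-D array.
tmpdir = tempfile.mkdtemp()
slices = []
for i in range(4):
    path = os.path.join(tmpdir, "slice_%d.dat" % i)
    mm = np.memmap(path, dtype=np.float64, mode="w+", shape=(3, 5))
    mm[:] = i                      # fill each slice with a marker value
    slices.append(mm)

volume = np.stack(slices)          # regular (4, 3, 5) ndarray, copied
print(volume.shape)
```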
n stone yet, so comments are welcomed there,
cheers,
David
could see
are: using an enviroment geared toward IEEE 754 compliance (CPU
emulation), simply use one of the existing package to run code on
GPU, or use software-implemented FPU. The latter meaning that you
cannot use linear algebra and so on, at least not with just
numpy/scip
y this is? The
> dialog gave no indication. Is an uninstall log with details generated
> anywhere?
There should be one in C:\Python*, something like numpy-*-wininst.log
> Perhaps it is some shared DLL, but I have no idea which!
The numpy installer does not have any sha
rkaround to share? I really just want to track a
> few arrays in a friendly way from within Python (I am aware of the
> existance of C-level profilers).
I think heapy has some hooks so that you can add support for
extensions. Maybe we could provide a C API in numpy
python -c "import sys; print sys.path"
You can check the path of the package after import:
python -c "import numpy; print numpy.__file__"
David
avoid getting
PyArray_API defined, but I don't understand why.
PY_ARRAY_UNIQUE_SYMBOL should only be used when you want to split your
extension into separate compilation units (object files).
David
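For reference, the usual multi-file pattern looks roughly like this (the module and symbol names are made up; the two files are shown together in one fragment, not meant to compile as a single unit):

```c
/* main module file (e.g. mymodule.c): defines the PyArray_API table;
 * the module init function must also call import_array(). */
#define PY_ARRAY_UNIQUE_SYMBOL mymodule_ARRAY_API
#include <numpy/arrayobject.h>

/* every other compilation unit (e.g. helpers.c): same unique symbol,
 * plus NO_IMPORT_ARRAY so PyArray_API is declared extern, not defined. */
#define NO_IMPORT_ARRAY
#define PY_ARRAY_UNIQUE_SYMBOL mymodule_ARRAY_API
#include <numpy/arrayobject.h>
```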
On Thu, Apr 29, 2010 at 12:30 PM, Pauli Virtanen wrote:
> Wed, 28 Apr 2010 14:12:07 -0400, Alan G Isaac wrote:
> [clip]
> > Here is a related ticket that proposes a more explicit alternative:
> > adding a ``dot`` method to ndarray.
> > http://projects.scipy.org/numpy/ticket/1456
>
> I kind of lik
On Sun, May 9, 2010 at 4:49 AM, wrote:
> On Sun, May 9, 2010 at 1:01 AM, T J wrote:
> > The docstring for np.pareto says:
> >
> >This is a simplified version of the Generalized Pareto distribution
> >(available in SciPy), with the scale set to one and the location set
> to
> >zero. M
On Mon, May 10, 2010 at 11:14 AM, T J wrote:
> On Sun, May 9, 2010 at 4:49 AM, wrote:
> >
> > I think this is the same point, I was trying to make last year.
> >
> > Instead of renormalizing, my conclusion was the following,
> > (copied from the mailinglist August last year)
> >
> > """
> > my
On Tue, May 11, 2010 at 12:23 AM, T J wrote:
> On Mon, May 10, 2010 at 8:37 PM, wrote:
> >
> > I went googling and found a new interpretation
> >
> > numpy.random.pareto is actually the Lomax distribution also known as
> Pareto 2,
> > Pareto (II) or Pareto Second Kind distribution
> >
>
> Great
Hi Allen,
If you google on "python user input" you already have your answer...
for instance: http://en.wikibooks.org/wiki/Python_Programming/Input_and_output
Hope this helps,
David
> Hi all,
> Am creating a script to do least square adjustment of levelling data. How do
> I
Charles H.: is this happening because he's calling the old version of
polyfit?
William: try using numpy.polynomial.polyfit instead, see if that works.
DG
On Wed, May 19, 2010 at 11:03 AM, William Carithers wrote:
> I'm trying to do a simple 2nd degree polynomial fit to two arrays of 5
> entries
The polynomial module definitely postdates 1.2.1; I echo Josef's recommendation that
you update if possible.
On Wed, May 19, 2010 at 1:24 PM, William Carithers wrote:
> Hi Josef,
> > I didn't know numpy will use the scipy version of linalg for this.
>
Right, that's what told me he must be using an old (an