[Numpy-discussion] welcome Andrew Nelson to the NumPy maintainers team

2023-08-21 Thread Ralf Gommers
Hi all,

On behalf of the steering council, I am very happy to announce that Andrew
is joining the Maintainers team. Andrew has been contributing to our CI
setup in particular for the past year, and has contributed for example the
Cirrus CI setup and the musllinux builds:
https://github.com/numpy/numpy/pulls/andyfaff.

Welcome Andrew, I'm looking forward to working with you more!

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: NEP 55 - Add a UTF-8 Variable-Width String DType to NumPy

2023-08-30 Thread Ralf Gommers
On Tue, Aug 29, 2023 at 4:08 PM Nathan  wrote:

> The NEP was merged in draft form, see below.
>
> https://numpy.org/neps/nep-0055-string_dtype.html
>

This is a really nice NEP, thanks Nathan! I see that questions and
constructive feedback are still coming in on GitHub, but for now it seems
like everyone is pretty happy with moving forward with implementing this
new dtype in NumPy.
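
As a quick illustration of the memory overhead the NEP is after (a sketch of
current NumPy behavior, not of the proposed dtype):

    import numpy as np

    # Fixed-width unicode: every element is padded to the longest string,
    # at 4 bytes (UCS-4) per codepoint.
    a = np.array(["a", "hello"])
    print(a.dtype)   # dtype('<U5') - width inferred from the longest element
    print(a.nbytes)  # 40 bytes to store 6 characters of actual data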

Cheers,
Ralf




>
> On Mon, Aug 21, 2023 at 2:36 PM Nathan  wrote:
>
>> Hello all,
>>
>> I just opened a pull request to add NEP 55, see
>> https://github.com/numpy/numpy/pull/24483.
>>
>> Per NEP 0, I've copied everything up to the "detailed description"
>> section below.
>>
>> I'm looking forward to your feedback on this.
>>
>> -Nathan Goldbaum
>>
>> ==========================================================
>> NEP 55 — Add a UTF-8 Variable-Width String DType to NumPy
>> ==========================================================
>>
>> :Author: Nathan Goldbaum 
>> :Status: Draft
>> :Type: Standards Track
>> :Created: 2023-06-29
>>
>>
>> Abstract
>> --------
>>
>> We propose adding a new string data type to NumPy where each item in the
>> array
>> is an arbitrary length UTF-8 encoded string. This will enable performance,
>> memory usage, and usability improvements for NumPy users, including:
>>
>> * Memory savings for workflows that currently use fixed-width strings and
>> store
>> primarily ASCII data or a mix of short and long strings in a single NumPy
>> array.
>>
>> * Downstream libraries and users will be able to move away from object
>> arrays
>> currently used as a substitute for variable-length string arrays,
>> unlocking
>> performance improvements by avoiding passes over the data outside of
>> NumPy.
>>
>> * A more intuitive user-facing API for working with arrays of Python
>> strings,
>> without a need to think about the in-memory array representation.
>>
>> Motivation and Scope
>> --------------------
>>
>> First, we will describe how the current state of support for string or
>> string-like data in NumPy arose. Next, we will summarize the last major
>> previous
>> discussion about this topic. Finally, we will describe the scope of the
>> proposed
>> changes to NumPy as well as changes that are explicitly out of scope of
>> this
>> proposal.
>>
>> History of String Support in NumPy
>> **********************************
>>
>> Support in NumPy for textual data evolved organically in response to
>> early user
>> needs and then changes in the Python ecosystem.
>>
>> Support for strings was added to NumPy to support users of the NumArray
>> ``chararray`` type. Remnants of this are still visible in the NumPy API:
>> string-related functionality lives in ``np.char``, to support the
>> obsolete
>> ``np.char.chararray`` class, deprecated since NumPy 1.4 in favor of
>> string
>> DTypes.
>>
>> NumPy's ``bytes_`` DType was originally used to represent the Python 2
>> ``str``
>> type before Python 3 support was added to NumPy. The bytes DType makes
>> the most
>> sense when it is used to represent Python 2 strings or other
>> null-terminated
>> byte sequences. However, ignoring data after the first null character
>> means the
>> ``bytes_`` DType is only suitable for bytestreams that do not contain
>> nulls, so
>> it is a poor match for generic bytestreams.
>>
>> The ``unicode`` DType was added to support the Python 2 ``unicode``
>> type. It
>> stores data in 32-bit UCS-4 codepoints (e.g. a UTF-32 encoding), which
>> makes for
>> a straightforward implementation, but is inefficient for storing text
>> that can
>> be represented well using a one-byte ASCII or Latin-1 encoding. This was
>> not a
>> problem in Python 2, where ASCII or mostly-ASCII text could use the
>> Python 2
>> ``str`` DType (the current ``bytes_`` DType).
>>
>> With the arrival of Python 3 support in NumPy, the string DTypes were
>> largely
>> left alone due to backward compatibility concerns, although the unicode
>> DType
>> became the default DType for ``str`` data and the old ``string`` DType
>> was
>> renamed the ``bytes_`` DType. This change left NumPy with the sub-optimal
>> situation of shipping a data type originally intended for null-terminated
>> bytestrings as the data type for *all* python ``bytes`` data, and a
>> default
>> string type with an in-memory representation that consumes four times as
>> much
>> memory as needed for ASCII or mostly-ASCII data.
>>
>> Problems with Fixed-Width Strings
>> *********************************
>>
>> Both existing string DTypes represent fixed-width sequences, allowing
>> storage of
>> the string data in the array buffer. This avoids adding out-of-band
>> storage to
>> NumPy; however, it makes for an awkward user interface. In particular, the
>> maximum string size must be inferred by NumPy or estimated by the user
>> before
>> loading the data into a NumPy array or selecting an output DType for
>> string
>> operations. In the worst case, this requires an expensive pass over the
>> full
>> dataset to calculate the

[Numpy-discussion] Re: reminder: put an upper bound on setuptools if you use numpy.distutils!

2023-08-31 Thread Ralf Gommers
On Thu, Aug 31, 2023 at 6:39 PM Kevin Sheppard 
wrote:

>
>
> On Sun, Aug 21, 2022 at 6:36 PM Ralf Gommers 
> wrote:
>
>> Hi all,
>>
>> After setuptools 65.0 was released a few days ago, all users of
>> numpy.distutils had their builds broken. This is already fixed in
>> setuptools 65.0.2 because the breakage was particularly bad. However, the
>> next breakage may not be fixed anymore (and more breakages *are* expected).
>> So this is a good time to remind you all that you should put an upper bound
>> on the setuptools version you allow in the releases of your package - to
>> the last version that is known to work with your package.
>>
>> Our official stance here is that setuptools versions >=60 are not
>> supported - see the "deprecating numpy.distutils" thread:
>> https://mail.python.org/archives/list/numpy-discussion@python.org/message/PMU4P4YRP2FZA2Z6Z6Z74ZFYD6PCRXQ5/.
>> Newer versions may work for you, depending on what features you use. They
>> don't for NumPy and for SciPy; both projects pin to 59.X to avoid problems.
>>
>> For the recent issue with setuptools 65.0.0, see
>> https://github.com/numpy/numpy/issues/22135. We have also made the
>> warnings about this topic in our docs more explicit, see
>> https://github.com/numpy/numpy/pull/22154.
>>
>> Cheers,
>> Ralf
>>
>>
>> ___
>> NumPy-Discussion mailing list -- numpy-discussion@python.org
>> To unsubscribe send an email to numpy-discussion-le...@python.org
>> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
>> Member address: kevin.k.shepp...@gmail.com
>
>
> Is there a good way to pin this to <2 when using oldest-supported-numpy?
>

oldest-supported-numpy gives you build-time `==` constraints. So you
already have fixed (<2.0) versions there. What you are asking about is
related to runtime upper bounds I think, not build-time? The way to do that
is to use

  dependencies = ["numpy<2.0"]  # or: "numpy>=1.22.4,<2.0"

in your pyproject.toml.
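
For example, a minimal sketch (the package name, versions, and bounds here
are placeholders, not a recommendation):

    [build-system]
    # build-time pins; oldest-supported-numpy provides the == numpy constraints
    requires = ["setuptools<60", "wheel", "oldest-supported-numpy"]
    build-backend = "setuptools.build_meta"

    [project]
    name = "mypkg"
    version = "0.1.0"
    # runtime bound, independent of the build-time pins
    dependencies = ["numpy>=1.22.4,<2.0"]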

This question is unrelated to an upper bound on setuptools, right?

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: [SciPy-Dev] Re: NumPy 1.26.0rc1 released

2023-09-07 Thread Ralf Gommers
On Thu, Sep 7, 2023 at 1:08 AM Charles R Harris 
wrote:

>
>
> On Wed, Sep 6, 2023 at 4:53 PM Charles R Harris 
> wrote:
>
>> Hi All,
>>
>> On behalf of the NumPy team, I'm pleased to announce the release of NumPy
>> 1.26.0rc1. The NumPy 1.26.0 release is a continuation of the 1.25.x release
>> cycle with the addition of Python 3.12.0 support. Python 3.12 dropped
>> distutils, consequently supporting it required finding a replacement for
>> the setup.py/distutils based build system NumPy was using. We have
>> chosen to use the Meson build system instead, and this is the first NumPy
>> release supporting it. This is also the first release that supports Cython
>> 3.0 in addition to retaining 0.29.X compatibility. Supporting those two
>> upgrades was a large project, over 100 files have been touched in this
>> release. The changelog doesn't capture the full extent of the work; special
>> thanks to Ralf Gommers, Sayed Adel, Stéfan van der Walt, and Matti Picus
>> who did much of the work in the main development branch.
>>
>> 
>

Thanks for doing the release Chuck!


> Two observations:
>
> 1. There are still no 32 bit wheels for Windows.
>

This shouldn't actually be difficult to do. We already have a
regular CI job that shows how to configure MSVC for 32-bit builds:
https://github.com/numpy/numpy/blob/main/.github/workflows/windows.yml#L88.
The only thing that job doesn't do is pull in OpenBLAS, which is always fiddly.
That's why I haven't gotten around to dealing with this yet. I'll have a
look at updating the wheel builds so we have them for the final 1.26.0
release.


> 2. Do we want to build wheels with the new Accelerate?
>

No, we unfortunately can't do that. The `packaging` library, and hence
`pip`, is unable to detect minor macOS versions, so we can't build wheels
for macOS >=13.3 (the first version with the new Accelerate) that will
install correctly.

It'd be a large change to make anyway; the plan is to do this in NumPy 2.0
for macOS >=14.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Proposal to accept NEP 52 (Python API cleanup for 2.0)

2023-09-15 Thread Ralf Gommers
Hi all,

A lot of work has been happening to implement NEP 52 (
https://numpy.org/neps/nep-0052-python-api-cleanup.html) over the past 1.5
months - mostly work by Mateusz Sokol, and review effort from Sebastian,
Nathan, and myself. The majority of API changes have been made. There's more
to do of course, and there are pending PRs for a good fraction of that.
These two tracking issues cover a lot of ground and the discussion around
decisions on individual APIs:

- main namespace: https://github.com/numpy/numpy/issues/24306
- numpy.lib namespace: https://github.com/numpy/numpy/issues/24507

This PR with a migration guide will give a good sense of what has been
removed or changed so far: https://github.com/numpy/numpy/pull/24693.

In https://github.com/numpy/numpy/pull/24620 the NEP itself is being
updated for changes that have been made. And it will mark the NEP as
Accepted, which seems about time given that a lot of the work has already
been merged.

If there are no substantive objections within 7 days from this email, then
the NEP will be accepted; see NEP 0 for more details.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Proposal to accept NEP 52 (Python API cleanup for 2.0)

2023-09-15 Thread Ralf Gommers
On Fri, Sep 15, 2023 at 8:22 PM Dom Grigonis  wrote:

> Hello,
>
> I have a couple of questions:
> 1. What is equivalent of np.byte_bounds? I have recently started using
> this.
>

The migration guide says: Now it's available under
``np.lib.array_utils.byte_bounds``.
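
A migration sketch (assuming NumPy 2.0's new location; `np.byte_bounds`
itself keeps working on 1.x):

    import numpy as np
    from numpy.lib.array_utils import byte_bounds

    a = np.arange(10)
    low, high = byte_bounds(a)     # start and one-past-end addresses of the data
    assert high - low == a.nbytes  # holds for a contiguous array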

> 2. Why are you removing business day functionality? Are there faster
> methods in python space for it? As far as I remember, for performance
> critical applications I have always resorted to numpy, including its
> business day functionality.
>

This change was abandoned because it was too much work. That is explained
in the PR that updates the NEP (https://github.com/numpy/numpy/pull/24620).

Cheers,
Ralf



> On 15 Sep 2023, at 21:12, Ralf Gommers  wrote:
>
> Hi all,
>
> A lot of work has been happening to implement NEP 52 (
> https://numpy.org/neps/nep-0052-python-api-cleanup.html) over the past
> 1.5 months - mostly work by Mateusz Sokol, and review effort from Sebastian,
> Nathan, and myself. The majority of API changes have been made. There's more
> to do of course, and there are pending PRs for a good fraction of that.
> These two tracking issues cover a lot of ground and the discussion around
> decisions on individual APIs:
>
> - main namespace: https://github.com/numpy/numpy/issues/24306
> - numpy.lib namespace: https://github.com/numpy/numpy/issues/24507
>
> This PR with a migration guide will give a good sense of what has been
> removed or changed so far: https://github.com/numpy/numpy/pull/24693.
>
> In https://github.com/numpy/numpy/pull/24620 the NEP itself is being
> updated for changes that have been made. And it will mark the NEP as
> Accepted, which seems about time given that a lot of the work has already
> been merged.
>
> If there are no substantive objections within 7 days from this email, then
> the NEP will be accepted; see NEP 0 for more details.
>
> Cheers,
> Ralf
>
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: dom.grigo...@gmail.com
>
>
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@googlemail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: NEP 55 - Add a UTF-8 Variable-Width String DType to NumPy

2023-09-20 Thread Ralf Gommers
On Wed, Sep 20, 2023 at 8:26 AM Warren Weckesser 
wrote:

>
>
> On Fri, Sep 15, 2023 at 3:18 PM Warren Weckesser <
> warren.weckes...@gmail.com> wrote:
> >
> >
> >
> > On Mon, Sep 11, 2023 at 12:25 PM Nathan 
> wrote:
> >>
> >>
> >>
> >> On Sun, Sep 3, 2023 at 10:54 AM Warren Weckesser <
> warren.weckes...@gmail.com> wrote:
> >>>
> >>>
> >>>
> >>> On Tue, Aug 29, 2023 at 10:09 AM Nathan 
> wrote:
> >>> >
> >>> > The NEP was merged in draft form, see below.
> >>> >
> >>> > https://numpy.org/neps/nep-0055-string_dtype.html
> >>> >
> >>> > On Mon, Aug 21, 2023 at 2:36 PM Nathan 
> wrote:
> >>> >>
> >>> >> Hello all,
> >>> >>
> >>> >> I just opened a pull request to add NEP 55, see
> https://github.com/numpy/numpy/pull/24483.
> >>> >>
> >>> >> Per NEP 0, I've copied everything up to the "detailed description"
> section below.
> >>> >>
> >>> >> I'm looking forward to your feedback on this.
> >>> >>
> >>> >> -Nathan Goldbaum
> >>> >>
> >>>
> >>> This will be a nice addition to NumPy, and matches a suggestion by
> >>> @rkern (and probably others) made in the 2017 mailing list thread;
> >>> see the last bullet of
> >>>
> >>>
> https://mail.python.org/pipermail/numpy-discussion/2017-April/076681.html
> >>>
> >>> So +1 for the enhancement!
> >>>
> >>> Now for some nitty-gritty review...
> >>
> >>
> >> Thanks for the nitty-gritty review! I was on vacation last week and
> haven't had a chance to look over this in detail yet, but at first glance
> this seems like a really nice improvement.
> >>
> >> I'm going to try to integrate your proposed design into the dtype
> prototype this week. If that works, I'd like to include some of the text
> from the README in your repo in the NEP and add you as an author, would
> that be alright?
> >
> >
> >
> > Sure, that would be fine.
> >
> > I have a few more comments and questions about the NEP that I'll finish
> up and send this weekend.
> >
>
> One more comment on the NEP...
>
> My first impression of the missing data API design is that
> it is more complicated than necessary. An alternative that
> is simpler--and is consistent with the pattern established for
> floats and datetimes--is to define a "not a string" value, say
> `np.nastring` or something similar, just like we have `nan` for
> floats and `nat` for datetimes. Its behavior could be what
> you called "nan-like".
>

Float `np.nan` and the datetime missing value sentinel are not all that
similar, and the latter was always a bit questionable (at least partially
it's a left-over from trying to introduce generic missing value support, I
believe). `nan` is a float and part of the C/C++ standards with well-defined
numerical behavior. In contrast, there is no `np.nat`; you can only retrieve
a sentinel value with `np.datetime64("NaT")`. I'm not sure if it's
possible to generate a NaT value with a regular operation on a datetime
array a la `np.array([1.5]) / 0.0`.
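
A quick sketch of that asymmetry (current NumPy behavior):

    import numpy as np

    # nan arises from ordinary float arithmetic (IEEE 754 semantics):
    np.array([0.0]) / 0.0    # -> array([nan]), with a RuntimeWarning

    # there is no np.nat; NaT has to be retrieved via the sentinel string:
    np.datetime64("NaT")     # -> numpy.datetime64('NaT')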

The handling of `np.nastring` would be an intrinsic part of the
> dtype, so there would be no need for the `na_object` parameter
> of `StringDType`. All `StringDType`s would handle `np.nastring`
> in the same consistent manner.
>
> The use-case for the string sentinel does not seem very
> compelling (but maybe I just don't understand the use-cases).
> If there is a real need here that is not covered by
> `np.nastring`, perhaps just a flag to control the repr of
> `np.nastring` for each StringDType instance would be enough?
>

My understanding is that the NEP provides the necessary but limited support
to allow Pandas to adopt the new dtype. The scope section of the NEP lists
this as explicitly out of scope: "Fully agreeing on the semantics of a
missing data sentinels or adding a missing data sentinel to NumPy itself.".
And then further down:
"By only supporting user-provided missing data sentinels, we avoid
resolving exactly how NumPy itself should support missing data and the
correct semantics of the missing data object, leaving that up to users to
decide"

I agree with that general approach; it's a large can of worms and not the
main purpose of this NEP. Nathan may have more thoughts about what, if
anything, from your suggestions could be adopted, but the general "let's
introduce a missing value thing" is a path we should not go down here imho.



>
> If there is an objection to a potential proliferation of
> "not a thing" special values, one for each type that can
> handle them, then perhaps a generic "not a value" (say
> `np.navalue`) could be created that, when assigned to an
> element of an array, results in the appropriate "not a thing"
> value actually being assigned. In a sense, I guess this NEP is
> proposing that, but it is reusing the floating point object
> `np.nan` as the generic "not a thing" value
>

It is explicitly not using `np.nan` but instead allowing the user to
provide their preferred sentinel. You're probably referring to the example
with `na_object=np.nan`, but that example would work with another sentinel
value too.
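
For reference, a sketch of the opt-in sentinel API the NEP proposes (it's a
draft, so names and behavior may still change):

    import numpy as np
    from numpy.dtypes import StringDType

    # the user picks the sentinel; np.nan is just one possible choice
    dt = StringDType(na_object=np.nan)
    arr = np.array(["this", np.nan, "that"], dtype=dt)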

Cheers,
Ralf



> , and my preference
> is that, *if* we go with such 

[Numpy-discussion] Re: Assessment of the difficulty in porting CPU architecture for numpy

2023-11-17 Thread Ralf Gommers
Hi Xuanbao,

On Thu, Nov 16, 2023 at 2:59 PM xuanbao via NumPy-Discussion <
numpy-discussion@python.org> wrote:

> Hello everyone! I am working on implementing a tool to assess the
> complexity of CPU architecture porting. It primarily focuses on RISC-V
> architecture porting. In fact, the tool may give an average estimate of
> various architecture porting efforts. My focus is on the overall workload
> and difficulty of porting, past and future, even if a project
> has already been ported. As part of my dataset, I have collected the
> **numpy** project. **I would like to gather community opinions to support
> my assessment. I appreciate your help and response!** Based on scanning
> tools, the porting complexity is determined to be moderate leaning towards
> simple, with a moderate amount of code related to the CPU architecture in
> the project. Is this assessment accurate?


"moderate" sounds reasonable. It depends though if you want to only get
everything to work, without CPU-specific optimizations, or also support
SIMD code. If the former, I'd say moderate - you still have to deal with a
lot of C code and platform-specific build stuff. If the latter, the effort
is high.


> Do you have any opinions on personnel allocation and the time required?


That's very difficult to estimate. It depends on both the experience of the
developer and how many platform-specific issues there will be. I'd pick a
single person who can deal with C/C++ related issues, and go from there.
Could be done in a day or two, or could take weeks to months.

Cheers,
Ralf



> I look forward to your help and response.
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@googlemail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Meson - C extension - Finding numpy includes in virtual env

2023-11-27 Thread Ralf Gommers
On Sun, Nov 26, 2023 at 9:06 PM Nathan  wrote:

> I want to caution about using `pip install -e .` to get a development
> install of numpy. This will work fine working on numpy itself, but won’t be
> useful if you need to use the development version of numpy to build another
> library. This doesn’t work because in-place installs don’t install the
> numpy headers (arguably it was a bug that the old setuptools install did)
> into the git repo, so the include paths `np.get_include()` reports won’t be
> correct.
>

This sounds about right - editable installs are only useful for working on
`numpy` itself, and perhaps basic pure Python packages on top. But editable
installs are fundamentally not complete installs, and should not be used
for developing a set of packages together that depend on each other. I'll
note that there's an active discussion (PEP 735 draft) which touched on a
"workspaces" concept for this.

Also, please do not ever use editable installs without
`--no-build-isolation`; that may lead to weird issues.
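
Concretely, a sketch of both workflows (assuming a recent pip and
meson-python; `build-dir` is meson-python's persistent-build-directory
config setting):

    # editable install, for working on numpy itself:
    pip install -e . --no-build-isolation

    # non-editable install with a persistent build directory, for fast
    # rebuilds plus a complete install:
    pip install . --no-build-isolation --config-settings=build-dir=build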

Cheers,
Ralf



>
> See this meson-python issue:
> https://github.com/mesonbuild/meson-python/issues/429
>
> For my work I tend to use a persistent build directory with build
> isolation disabled as discussed in the meson-python docs. This gives me
> fast rebuilds without using an in-place build. It does mean there’s a build
> and install step when you edit python code in numpy that would otherwise be
> unnecessary and sometimes the cache can go stale for reasons that aren’t
> totally obvious.
>
> In principle numpy could fix this by ensuring the headers get generated in
> the git repo in the place they’re supposed to be installed. I have no idea
> how hard it would be beyond that it would definitely require messing with
> the codegen scripts.
>
> On Sun, Nov 26, 2023 at 10:53 AM Stefan van der Walt via NumPy-Discussion <
> numpy-discussion@python.org> wrote:
>
>> Hi Doug,
>>
>> On Sun, Nov 26, 2023, at 06:29, Doug Turnbull wrote:
>>
>> To debug, I ran `pip install . --no-build-isolation` it worked (using
>> venv's numpy)
>>
>>
>> When developing NumPy, we typically build in the existing environment.
>> This is done either via `pip install -e .` (which installs hooks to trigger
>> a re-compile upon import), or via the spin tool (
>> https://github.com/scientific-python/spin), which have meson commands
>> pre-bundled:
>>
>> pip install spin
>> spin  # lists commands available
>>
>> Best regards,
>> Stéfan
>>
>> ___
>> NumPy-Discussion mailing list -- numpy-discussion@python.org
>> To unsubscribe send an email to numpy-discussion-le...@python.org
>> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
>> Member address: nathan12...@gmail.com
>>
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@gmail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Telling meson build which CBLAS/LAPACK (LAPACKE?) to use via pkgconfig module

2023-11-27 Thread Ralf Gommers
On Mon, Nov 27, 2023 at 2:10 PM Dr. Thomas Orgis <
thomas.or...@uni-hamburg.de> wrote:

> Hi,
>
> I'm involved in packaging NumPy for http://pkgsrc.org/. We install a
> set of possible BLAS/CBLAS/LAPACK/LAPACKE packages side-by-side in the
> same prefix. This includes multiple variants of OpenBLAS with regard to
> multithreading (and indexing). For this purpose, we point software to
> use the build-time chosen BLAS implementation via BLAS_LIBS and similar
> variables, or, as seems to be appropriate for the new meson build of
> NumPy, via .pc files.
>
> A package depends on the generic BLAS library family and central user
> configuration chooses which one the packages should use during build.
>
> What would be the correct way to force the NumPy build to just use our
> BLAS choice, avoiding any automatisms that might surprise us?
>
> How agnostic is NumPy regarding the offered BLAS? Does it have to know
> that it is using OpenMP-parallelized OpenBLAS vs. the serial one (I'd
> imagine just setting OMP_NUM_THREADS handles parallelism)


The NumPy build does not know anything about this. It will just build, and
it will simply call the OpenBLAS functionality - whether those execute
under the hood in parallel or not, or with OpenMP or pthreads, is unknown.
When a user or downstream library wants to control that parallelism, they
can use an environment variable or https://github.com/joblib/threadpoolctl.
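
For instance, a sketch using threadpoolctl:

    import numpy as np
    from threadpoolctl import threadpool_limits

    # cap the number of threads the BLAS library uses within this block:
    with threadpool_limits(limits=1, user_api="blas"):
        np.linalg.svd(np.random.rand(500, 500))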

> or MKL, Netlib, … ?


By default, the NumPy build will try all of those, in the order given by
the `blas-order` and `lapack-order` build options (see `meson_options.txt`
in the root of the repo).

> It doesn't scale to have to tell it "openblas" or "netlib",
> as there is no universal vocabulary to name the variants, and NumPy
> doesn't even know openblas_openmp from serial openblas or
> openblas_pthread (right?).
>
> Basically, I want to do
>
> meson setup -Dcblas_pc=$CBLAS_PC
>
> with CBLAS_PC being the module name of one of
>
> $prefix/lib/pkgconfig/cblas.pc
> $prefix/lib/pkgconfig/openblas_pthread.pc
> $prefix/lib/pkgconfig/openblas_openmp.pc
> $prefix/lib/pkgconfig/openblas64_openmp.pc
>
> so that pkg-config does its thing without the NumPy build guessing
> around. Is that feasible already? Is it easily supportable with some
> changes to the build? I dream of a world where package build scripts
> don't have to add dozens of idiosyncratic lines to detect these libs.
>

Yes, that is possible. You should be building with a build frontend (pip or
pypa/build) and then the invocation will include `-C-Dblas=<library name>
-C-Dlapack=<library name>`. See
http://scipy.github.io/devdocs/building/blas_lapack.html for more guidance.
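
For example, a sketch with pypa/build (the library names must match the .pc
files installed on your system):

    python -m build --no-isolation \
      -Csetup-args=-Dblas=openblas_openmp \
      -Csetup-args=-Dlapack=openblas_openmp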


> I'd like things to work like for CMake's FindBLAS with
> -DBLA_PREFER_PKGCONFIG and -DBLA_PKGCONFIG_BLAS=$BLAS_PC (see
>
> https://cmake.org/cmake/help/latest/module/FindBLAS.html
>
> since version 3.25).
>
> Can we have that?
>

Yes, that is implemented since NumPy 1.26.2 and in the main branch.


>
> And: NumPy needs CBLAS … does it also need LAPACKE instead of LAPACK?
>

No need for LAPACKE.


> These are differing libraries, possibly coming in differing binaries,
> even if your OpenBLAS builds also combine them. So I guess it should be
> -Dcblas_pc and -Dlapacke_pc, both being possibly identical. A build of
> the reference Netlib implementation provides four distinct libraries
> and .pc files:
>
> $prefix/lib/pkgconfig/cblas.pc
> $prefix/lib/pkgconfig/blas.pc
> $prefix/lib/pkgconfig/lapacke.pc
> $prefix/lib/pkgconfig/lapack.pc
>
> We do support installing openblas64 and friends alongside the others
> and I imagine just setting an ILP64 option and repective symbol suffix
> (none as of yet, as it's not a settled thing upstream) for the NumPy
> build if a 64 variant is chosen by the user. I wonder a bit if there
> are possible pitfalls combining other libraries with Python and
> indirectly some incompatible BLAS variant via NumPy … but one point of
> our user choice is that they could ensure that all packages really use
> the same BLAS.
>

You have to opt in to ILP64, via a `-Duse-ilp64` flag. It will not work to
craft a blas.pc which points at a 64-bit BLAS.

Cheers,
Ralf



>
>
> Alrighty then,
>
> Thomas
>
> --
> Dr. Thomas Orgis
> HPC @ Universität Hamburg
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@googlemail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Telling meson build which CBLAS/LAPACK (LAPACKE?) to use via pkgconfig module

2023-11-27 Thread Ralf Gommers
On Mon, Nov 27, 2023 at 6:51 PM Dr. Thomas Orgis <
thomas.or...@uni-hamburg.de> wrote:

> On Mon, 27 Nov 2023 14:58:45 +0100,
> Ralf Gommers wrote:
>
> > The NumPy build does not know anything about this. It will just build,
> and
> > it will simply call the OpenBLAS functionality
>
>
> Great!
>
> > Yes, that is possible. You should be building with a build frontend (pip
> > or pypa/build) and then the invocation will include `-C-Dblas=<library name>
> > -C-Dlapack=<library name>`.
>
> I'm confused about these frontends, I must say. I imagined that if
> you're using meson, one could just call meson setup/build? That being
> said: I am not sure now myself how the pkgsrc build actually works
> right now. There's common machinery to 'build python stuff' and the
> part about meson-based packages is rather fresh and not documented yet.
>

You have to go through a "build frontend" to produce a wheel, which then
gets installed/repackaged for your distro. If you call meson/ninja
directly, you will not get the Python package metadata that meson-python
produces. And you do need that, otherwise there are some things that will
break (e.g., using `importlib` APIs for introspecting Python packages). So
what your machinery should be doing is building with `pip install .
--no-build-isolation` or `python -m build --no-isolation`.


>
> The build output starts with
>
> * Building wheel...
> + /data/pkg/bin/python3.11
> /data/projekte/pkgsrc/work/math/py-numpy/work/numpy-1.26.2/vendored-meson/meson/meson.py
> setup /data/projekte/pkgsrc/work/math/py-numpy/work/numpy-1.26.2
> /data/projekte/pkgsrc/work/math/py-numpy/work/numpy-1.26.2/.mesonpy-_lv
>
> … so some wrapped call to a vendored copy of meson that NumPy ships.
>

Yes, we need that for extra BLAS/LAPACK and SIMD related functionality that
is still in the process of being upstreamed into Meson.


> Adding -Dblas=$CBLAS_PC to that command should do the trick, no?
> (however that is effected)
>

Sounds like it, assuming CBLAS_PC is the name of a library.


>
> > > And: NumPy needs CBLAS … does it also need LAPACKE instead of LAPACK?
> > >
> >
> > No need for LAPACKE.
>
> Good, if also somewhat weird;-) I'm curious, though: When you need the
> CBLAS API, why is the dependency called blas and not cblas? In
> practice, most accelerated libraries offer all APIs in one binary and
> -Dlapack is already redundant, but when we use the netlib reference,
> blas, cblas, lapack, and lapacke are distinct entities. Calling cblas
> just blas where lapack _does_ mean the Fortran one, is rather confusing.
>

Partly a matter of history since we always did it like this, but I think
there's more to it. The two libraries are called BLAS and LAPACK, those
offer distinct functionality. CBLAS and LAPACKE are basically much less
important implementation details, and typically shipped in the same library
because they're interfaces to the exact same functionality. We're not
"calling CBLAS just BLAS" here, but rather: BLAS is the main name and has
the functionality you want. CBLAS is an optional interface to that, and if
you want it you have to ask for it with (in Meson):

dependency('blas', modules: ['cblas'])

It doesn't make much sense for us to expose CBLAS (or LAPACKE) as a
separate thing in our own build interface.


> > You have to opt in to ILP64, via a `-Duse-ilp64` flag. It will not work
> to
> > craft a blas.pc which points at a 64-bit BLAS.
>
> So -Dblas=openblas64 -Dlapack=openblas64 -Duse-ilp64 would do it, right?
>

Exactly.



>
>
> Alrighty then,
>
> Thomas
>
> PS: You might want to fix that one:
>
> ../../numpy/meson.build:124: WARNING: Project targets '>=1.2.99' but uses
> feature introduced in '1.3.0': dep 'blas' custom lookup.
>

Yeah, that'll go away when we update the vendored copy; that will be done in
the next few days.

Cheers,
Ralf


> --
> Dr. Thomas Orgis
> HPC @ Universität Hamburg
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@gmail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Telling meson build which CBLAS/LAPACK (LAPACKE?) to use via pkgconfig module

2023-12-06 Thread Ralf Gommers
On Wed, Dec 6, 2023 at 1:32 AM Dr. Thomas Orgis 
wrote:

> On Sun, 3 Dec 2023 19:54:10 +0100,
> "Dr. Thomas Orgis" wrote:
>
> > > You have to go through a "build frontend" to produce a wheel, which
> then
> > > gets installed/repackaged for your distro.
> >
> > This is obviously happening in pkgsrc.
>
> >
> > I'll do some testing tomorrow, at least.
>
> Well, now is another day. Pkgsrc uses python -m build and I added
>
> -Csetup-args=-Dblas=${CBLAS_PC} -Csetup-args=-Dlapack=${LAPACK_PC}
>
> which seems to work out fine using cblas.pc and lapack.pc in the case
> of the netlib install. In fact, most linking is done only to libblas.so
> instead of libcblas.so, as the linker is smart enough to throw away the
> unused lib.
>

Great, thanks for confirming!

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] updated 2.0-dev nightlies, including macOS Accelerate wheels

2023-12-22 Thread Ralf Gommers
Hi all,

We had some issues with nightlies: the macOS, Linux aarch64, and PyPy ones
were about a month out of date.
supported platforms are up on
https://anaconda.org/scientific-python-nightly-wheels/numpy.

Note that a lot changed in `main` over the last month, so if you see lots
of failures in pre-release CI jobs of downstream packages, getting all of
those changes at the same time will be why. The 2.0 migration guide should
be mostly up to date, and covers the most important changes:
https://numpy.org/devdocs/numpy_2_0_migration_guide.html.

In addition, we have new wheels for arm64 macOS >=14.0 which use the new
Apple Accelerate rather than OpenBLAS. They're ~3x smaller and linalg
operations are significantly faster as a result.
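
To pull these into a downstream test job, something like this should work (a
sketch; this is the standard index URL for that nightly channel):

    pip install --pre --upgrade \
      --extra-index-url https://pypi.anaconda.org/scientific-python-nightly-wheels/simple \
      numpy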

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] NEP 54 - SIMD infrastructure evolution to C++ and adopting Google Highway

2023-12-31 Thread Ralf Gommers
Hi all,

We have just merged NEP 54, "SIMD infrastructure evolution: adopting Google
Highway when moving to C++?", with Draft status, after a long review at
https://github.com/numpy/numpy/pull/24138. It looks like it wasn't sent to
this list before.

Please see https://numpy.org/neps/nep-0054-simd-cpp-highway.html for the
rendered version (complete text below).

This is a complex topic, and the NEP captures more of a discussion on the
pros and cons of moving to Highway, and in what form. Most folks active in
working on SIMD code in NumPy have weighed in in one of several calls, in
the community meeting and the 3-weekly meeting of the recently formed NumPy
Optimization Team. I think we can summarize the current status as follows:

- Google Highway is now included in the main repo as a git submodule
- We are +1 on using Highway for high-level operations where possible given
accuracy constraints, and are already doing so for sorting functionality.
- We are -1 on using Highway's dynamic dispatch, we prefer to stay with the
current dynamic dispatch support via build system support, which has worked
well for us for ~4 years now.
- We are +0 to +0.5 on using Highway's form of 'universal intrinsics', in
preference of moving our own universal intrinsics from C to C++. Both would
be a major improvement on the current state of our C implementation.
- For that latter decision, there isn't complete consensus on it, and also
Highway is missing a few things that NumPy does have that we'd like to see
it gain. In particular, a way to prototype and test new SIMD intrinsics
from Python (see
https://numpy.org/neps/nep-0054-simd-cpp-highway.html#the-simd-unit-testing-module
).

Cheers,
Ralf


full text of the NEP:

===============================================================================
NEP 54 — SIMD infrastructure evolution: adopting Google Highway when moving
to C++?
===============================================================================

:Author: Sayed Adel, Jan Wassenberg, Matti Picus, Ralf Gommers, Chris
Sidebottom
:Status: Draft
:Type: Standards Track
:Created: 2023-07-06
:Resolution: TODO


Abstract
--------

We are moving the SIMD intrinsic framework, Universal Intrinsics, from C to
C++. We have also moved to Meson as the build system. The Google Highway
intrinsics project is proposing we use Highway instead of our Universal
Intrinsics as described in `NEP 38`_. This is a complex and multi-faceted
decision - this NEP is an attempt to describe the trade-offs involved and
what would need to be done.

Motivation and Scope
--------------------

We want to refactor the C-based Universal Intrinsics (see :ref:`NEP 38
`) to C++. This work was ongoing for some time, and Google's Highway
was suggested as an alternative, which was already written in C++ and had
support for scalable SVE and other reusable components (such as VQSort).

The move from C to C++ is motivated by (a) code readability and ease of
development, (b) the need to add support for sizeless SIMD instructions
(e.g.,
ARM's SVE, RISC-V's RVV).

As an example of the readability improvement, here is a typical line of C
code
from our current C universal intrinsics framework:

.. code::

   // The @name@ is the numpy-specific templating in .c.src files
   npyv_@sfx@  a5 = npyv_load_@sfx@(src1 + npyv_nlanes_@sfx@ * 4);

This will change (as implemented in PR `gh-21057`_) to:

.. code:: C++

   auto a5 = Load(src1 + nlanes * 4);

If the above C++ code were to use Highway under the hood it would look quite
similar, it uses similarly understandable names as ``Load`` for individual
portable intrinsics.

The ``@sfx@`` in the C version above is the template variable for type
identifiers, e.g.: ``#sfx = u8, s8, u16, s16, u32, s32, u64, s64, f32,
f64#``.
Explicit use of bitsize-encoded types like this won't work for sizeless SIMD
instruction sets. With C++ this is easier to handle; PR `gh-21057`_ shows
how
and contains more complete examples of what the C++ code will look like.

The scope of this NEP includes discussing most relevant aspects of adopting
Google Highway to replace our current Universal Intrinsics framework,
including
but not limited to:

- Maintainability, domain expertise availability, ease of onboarding new
  contributor, and other social aspects,
- Key technical differences and constraints that may impact NumPy's internal
  design or performance,
- Build system related aspects,
- Release timing related aspects.

Out of scope (at least for now) is revisiting other aspects of our current
SIMD
support strategy:

- accuracy vs. performance trade-offs when adding SIMD support to a function
- use of SVML and x86-simd-sort (and possibly its equivalents for aarch64)
- pulling in individual bits or algorithms of Highway (as in `gh-24018`_) or
  SLEEF (as discussed in that same PR)


Usage and Impact
----------------

N/A - there will be no significant user-visible changes.


Backward 

[Numpy-discussion] Fwd: incomplete BLAS/CBLAS linking (Telling meson build which CBLAS/LAPACK (LAPACKE?) to use via pkgconfig module)

2023-12-31 Thread Ralf Gommers
(I took this off-list unintentionally, so I'm forward each email to the
list now)



-- Forwarded message -
From: Ralf Gommers 
Date: Thu, Dec 28, 2023 at 8:51 PM
Subject: Re: incomplete BLAS/CBLAS linking (Telling meson build which
CBLAS/LAPACK (LAPACKE?) to use via pkgconfig module)
To: Dr. Thomas Orgis 




On Mon, Dec 25, 2023 at 10:37 AM Dr. Thomas Orgis <
thomas.or...@uni-hamburg.de> wrote:

> Hapy holidays … but I have an issue still that hopefully can be
> addressed with the meson blas detection you are upstreaming(?).
>

Happy holidays to you too. And yes, I hope so:)


>
> On Wed, 6 Dec 2023 18:06:01 +0100,
> Ralf Gommers wrote:
>
> > > Well, now is another day. Pkgsrc uses python -m build and I added
> > >
> > > -Csetup-args=-Dblas=${CBLAS_PC}
> -Csetup-args=-Dlapack=${LAPACK_PC}
> > >
> > > which seems to work out fine using cblas.pc and lapack.pc in the case
> > > of the netlib install. In fact, most linking is done only to libblas.so
> > > instead of libcblas.so, as the linker is smart enough to throw away the
> > > unused lib.
> > >
> >
> > Great, thanks for confirming!
>
> This works for numpy and also installs scipy nicely, but this produces
> a broken scipy install when using netlib reference libraries from
> pkgsrc. These come as
>
> libblas.so
> liblapack.so (NEEDing libblas.so)
> libcblas.so (NEEDing libblas.so)
> liblapacke.so (NEEDing liblapack.so, hence libblas.so)
>
> and their respective .pc files. This is the natural order that occurs to
> me when building from netlib upstream.


This should work fine. It's auto-detected in NumPy already, and will be in
SciPy in the future. For now, using `-Dblas=blas -Dlapack=lapack` in the
SciPy build should work.


> This also means that one could
> just replace BLAS and put stock LAPACK on top, which is what optimized BLAS
> libs usually start out with.


This is indeed possible, if unusual. It's supported and one reason for why
we have separate `blas` and `lapack` flags. I'd discourage distros from
shipping something like that by default though, since it tends to lead to
problems. Arch Linux used to do this, shipping an OpenBLAS without LAPACK
symbols. Luckily they finally fixed that. Shipping non-default build
configs like that is invariably a bad idea, and should only be done if
there's a pressing need.

> Only that they tend to pack all symbols into
> one common library, which project builds like numpy then rely on.
>
> Telling the meson build that BLAS is libcblas works as long as actually
> CBLAS symbols are used.


Please never do this. The library is BLAS, so you should use `-Dblas=blas`
for NumPy. It will find `cblas` just fine that way.


> If not — I presume now, as I didn't yet see the
> actual build lines that are triggered via the python -m build and meson
> indirections — the linker might discard the -lcblas and leave symbols
> unresolved (--as-needed but no --no-undefined).
>
> This happens with scipy:
>
> $ LANG=C readelf -d
> /data/pkg/lib/python3.11/site-packages/scipy/sparse/linalg/_dsolve/_superlu.so
> |grep NEEDED
>  0x0001 (NEEDED) Shared library: [libm.so.6]
>  0x0001 (NEEDED) Shared library: [libc.so.6]
>

This is probably a bug in SciPy. The build target depends on both `blas`
and `lapack`, and sets `DUSE_VENDOR_BLAS=1`. However, it looks like it
should depend on `cblas`. If you add `cblas` to this line, I think it'll
fix the issue:
https://github.com/scipy/scipy/blob/6452a48c9611d16140b160091de6cf5299fadd9f/scipy/sparse/linalg/_dsolve/meson.build#L208
.
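
That is, hypothetically something like this in that meson.build (the variable
names here are assumptions for illustration, not SciPy's actual ones):

    py3.extension_module('_superlu',
      _superlu_sources,
      dependencies: [blas_dep, cblas_dep, lapack_dep],  # cblas added
      install: true,
    )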


> It would link against libopenblas_openmp.so if that had been the CBLAS
> (and LAPACK) choice and all would be fine, but here, it should link
> with libcblas.so, or directly to libblas.so, just like our regular
> install of superlu:
>
> $ LANG=C readelf -d /data/pkg/lib/libsuperlu.so|grep NEEDED
>  0x0001 (NEEDED) Shared library: [libblas.so.3]
>  0x0001 (NEEDED) Shared library: [libm.so.6]
>  0x0001 (NEEDED) Shared library: [libc.so.6]
>
> Of course, just not vendoring superlu would be one solution for scipy,
> but I think the deeper issue with the meson BLAS support should be
> solved: The 4 parts of the BLAS canon (not talking about SCALAPACK etc.
> yet) need to be handled explicitly.
>
> It is confusing, though, as meson prints this:
>
> Run-time dependency blas found: YES 3.11.0
> Run-time dependency cblas found: YES 3.11.0
> Run-time dependency lapack found: YES 3.11.0
>
> It suggests that it looked for and found 3 libraries, but actually, it
> only cared for -llapack and -lcblas. It needs to find -lblas dire

[Numpy-discussion] Fwd: incomplete BLAS/CBLAS linking (Telling meson build which CBLAS/LAPACK (LAPACKE?) to use via pkgconfig module)

2023-12-31 Thread Ralf Gommers
-- Forwarded message -
From: Dr. Thomas Orgis 
Date: Fri, Dec 29, 2023 at 12:00 AM
Subject: Re: incomplete BLAS/CBLAS linking (Telling meson build which
CBLAS/LAPACK (LAPACKE?) to use via pkgconfig module)
To: Ralf Gommers 


On Thu, 28 Dec 2023 20:51:27 +0100,
Ralf Gommers wrote:

> > libblas.so
> > liblapack.so (NEEDing libblas.so)
> > libcblas.so (NEEDing libblas.so)
> > liblapacke.so (NEEDing liblapack.so, hence libblas.so)
> >
> > and their respective .pc files. This is the natural order that occurs to
> > me when building from netlib upstream.
>
>
> This should work fine. It's auto-detected in NumPy already, and will be in
> SciPy in the future. For now, using `-Dblas=blas -Dlapack=lapack` in the
> SciPy build should work.

I noticed that with -Dblas=blas, which is in pkgsrc now. The detection
code sets cblas and finds libcblas by dark magic / defaults that happen
to match. But what if my setup uses -Dblas=netlib_blas? Then the
internal guesswork would fail.

Please consider a mode where the user specifies separate names for all
4 components. For package builds, we do not want any guess work,
including assuming that libblas.so is accompanied by libcblas.so with
that exact name.

So I'd like

-Dblas=$BLAS_PACKAGE -Dcblas=$CBLAS_PACKAGE \
-Dlapack=$LAPACK_PACKAGE -Dlapacke=$LAPACKE_PACKAGE

where the values may all be the same or not. If I fail to provide one
of those, feel free to guess for the rest (for example, assuming/trying
that all of those are openblas if I say -Dblas=openblas).

I also realized that including LAPACK in OpenBLAS is needed, but any
new BLAS code could start out just replacing the netlib piece by piece.
The partitioning is there and it is probably good for managing the
complexity, limiting scope of the individual libraries.

> > Telling the meson build that BLAS is libcblas works as long as actually
> > CBLAS symbols are used.
>
>
> Please never do this. The library is BLAS, so you should use `-Dblas=blas`
> for NumPy. It will find `cblas` just fine that way.

Oh. As I wrote before, we now have

-Csetup-args=-Dblas=${CBLAS_PC}
-Csetup-args=-Dlapack=${LAPACK_PC}

for math/py-numpy. That's CBLAS_PC, not BLAS_PC. And this works.

> This is probably a bug in SciPy.

Well, apparently its just a miscommunication between us two. Scipy is fine
with

-Csetup-args=-Dblas=${BLAS_PC}
-Csetup-args=-Dlapack=${LAPACK_PC}

locating libcblas by inferring it from libblas, and finding cblas in
openblas_foobar, apparently. It prints those lines:

Run-time dependency blas found: YES 3.11.0
Run-time dependency cblas found: YES 3.11.0
Run-time dependency lapack found: YES 3.11.0
blas: blas
lapack  : lapack

While the numpy build does this:

Run-time dependency cblas found: YES 3.11.0
Message: BLAS symbol suffix:
Run-time dependency lapack found: YES 3.11.0
blas: cblas
lapack  : lapack

This looks similar to the case of openblas_openmp for -Dblas and -Dlapack:

Run-time dependency openblas_openmp found: YES 0.3.24
Message: BLAS symbol suffix:
Run-time dependency openblas_openmp found: YES 0.3.24
blas: openblas_openmp
lapack  : openblas_openmp

So scipy locates cblas based on the name blas, but doesn't really use
cblas. Numpy is happy with libcblas bringing libblas in and calls it
blas, but really uses the cblas interface. This looks a bit confusing.


I guess it makes more sense to continue that discussion on the meson
PRs for this functionality … as it transcends NumPy, anyway. I hope we
can settle on something that works for autodetection and prescription
of all parts.

And I need to ponder if I leave it at -Dblas=$CBLAS_PC for pkgsrc now.
It's somewhat wrong, but also more correct, as NumPy _really_ means to
use the CBLAS API, not BLAS.


Alrighty then,

Thomas

-- 
Dr. Thomas Orgis
HPC @ Universität Hamburg
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: incomplete BLAS/CBLAS linking (Telling meson build which CBLAS/LAPACK (LAPACKE?) to use via pkgconfig module)

2023-12-31 Thread Ralf Gommers
(re-sending to list)

On Fri, Dec 29, 2023 at 11:34 AM Ralf Gommers 
wrote:

>
>
> On Fri, Dec 29, 2023 at 12:00 AM Dr. Thomas Orgis <
> thomas.or...@uni-hamburg.de> wrote:
>
>> On Thu, 28 Dec 2023 20:51:27 +0100,
>> Ralf Gommers wrote:
>>
>> > > libblas.so
>> > > liblapack.so (NEEDing libblas.so)
>> > > libcblas.so (NEEDing libblas.so)
>> > > libpapacke.so (NEEDing liblapack.so, hence libblas.so)
>> > >
>> > > and their respective .pc files. This is the natural order that occus
>> to
>> > > me when building from netlib upstream.
>> >
>> >
>> > This should work fine. It's auto-detected in NumPy already, and will be
>> in
>> > SciPy in the future. For now, using `-Dblas=blas -Dlapack=lapack` in the
>> > SciPy build should work.
>>
>> I noticed that with -Dblas=blas, which is in pkgsrc now. The detection
>> code sets cblas and finds libcblas by dark magic / defaults that happen
>> to match. But what if my setup uses -Dblas=netlib_blas? Then the
>> internal guesswork would fail.
>>
>
> If the library name is libcblas.so it will still be found. If it's also a
> nonstandard name, then yes it's going to fail. I'd say though that (a) this
> isn't a real-world situation as far as we know, (b) just don't do this as a
> packager, and (c) if you really must, you can still make it work by
> providing a custom `cblas.pc` (see
> http://scipy.github.io/devdocs/building/blas_lapack.html#using-pkg-config-to-detect-libraries-in-a-nonstandard-location
> ).
>
> Please consider a mode where the user specifies separate names for all
>> 4 components. For package builds, we do not want any guess work,
>> including assuming that libblas.so is accompanied by libcblas.so with
>> that exact name.
>>
>> So I'd like
>>
>> -Dblas=$BLAS_PACKAGE -Dcblas=$CBLAS_PACKAGE \
>> -Dlapack=$LAPACK_PACKAGE -Dlapacke=$LAPACKE_PACKAGE
>>
>> where the values may all be the same or not. If I fail to provide one
>> of those, feel free to guess for the rest (for example, assuming/trying
>> that all of those are openblas if I say -Dblas=openblas).
>
>
> We don't use LAPACKE, so that one can be ignored. For CBLAS, I'd honestly
> rather get a bug report than add new CLI flags for a situation that seems
> to be purely hypothetical. Things work on all known distributions I
> believe, and this design isn't new but was the same that numpy.distutils
> uses. We can consider a new `-Dcblas` flag at any point, there is nothing
> in the design preventing us from adding it later. But I'd rather only do so
> if there's a real need.
>
>
>> I also realized that including LAPACK in OpenBLAS is needed, but any
>> new BLAS code could start out just replacing the netlib piece by piece.
>> The partitioning is there and it is probably good for managing the
>> complexity, limiting scope of the individual libraries.
>>
>> > > Telling the meson build that BLAS is libcblas works as long as
>> actually
>> > > CBLAS symbols are used.
>> >
>> >
>> > Please never do this. The library is BLAS, so you should use
>> `-Dblas=blas`
>> > for NumPy. It will find `cblas` just fine that way.
>>
>> Oh. As I wrote before, we now have
>>
>> -Csetup-args=-Dblas=${CBLAS_PC}
>> -Csetup-args=-Dlapack=${LAPACK_PC}
>>
>> for math/py-numpy. That's CBLAS_PC, not BLAS_PC. And this works.
>>
>
> I assume that it also passes if you'd pass in BLAS_PC?
>
>
>>
>> > This is probably a bug in SciPy.
>>
>> Well, apparently it's just a miscommunication between us two. Scipy is
>> fine with
>>
>
> Phew:) I also just confirmed by writing a new SciPy CI job for the split
> Netlib BLAS situation, based on how OpenSUSE packages it. And that passes.
>
>
>>
>> -Csetup-args=-Dblas=${BLAS_PC}
>> -Csetup-args=-Dlapack=${LAPACK_PC}
>>
>> locating licblas by inferring it from libblas, and finding cblas in
>> openblas_foobar, apparently. It prints those lines:
>>
>> Run-time dependency blas found: YES 3.11.0
>> Run-time dependency cblas found: YES 3.11.0
>> Run-time dependency lapack found: YES 3.11.0
>> blas: blas
>> lapack  : lapack
>>
>> While the numpy build does this:
>>
>> Run-time dependency cblas found: YES 3.11.0
>> Message: BLAS symbol suffix:
>> Run-time dependency lapack found: YES 3.11.0
>> blas 

[Numpy-discussion] Re: incomplete BLAS/CBLAS linking (Telling meson build which CBLAS/LAPACK (LAPACKE?) to use via pkgconfig module)

2023-12-31 Thread Ralf Gommers
On Sat, Dec 30, 2023 at 1:57 PM Dr. Thomas Orgis <
thomas.or...@uni-hamburg.de> wrote:

>
> On Fri, 29 Dec 2023 11:34:04 +0100,
> Ralf Gommers wrote:
>
> > If the library name is libcblas.so it will still be found. If it's also a
> > nonstandard name, then yes it's going to fail. I'd say though that (a)
> this
> > isn't a real-world situation as far as we know,
>
> It can be more funny. I just noticed on an Ubuntu system (following
> Debian for sure, here) that there are both
>
> /usr/lib/x86_64-linux-gnu/libblas.so.3
> /usr/lib/x86_64-linux-gnu/libcblas.so.3
>
> but those belong to different packages. The first contains the BLAS and
> CBLAS APIs and is installed from netlib code.
>
> $ readelf -d -s /usr/lib/x86_64-linux-gnu/libblas.so.3  | grep cblas_ | wc
> -l
> 184
>
> The second is installed alongside ATLAS.
>
> $ readelf -d -s /usr/lib/x86_64-linux-gnu/libcblas.so.3  | grep cblas_ |
> wc -l
> 154
>
> The symbol lists differ in that there are functions unique to each.
>
> $ ldd /usr/lib/x86_64-linux-gnu/libcblas.so.3
> linux-vdso.so.1 (0x7ffcb572)
> libatlas.so.3 => /lib/x86_64-linux-gnu/libatlas.so.3
> (0x7fd9b27ee000)
> libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x7fd9b25c6000)
> libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x7fd9b24df000)
> /lib64/ld-linux-x86-64.so.2 (0x7fd9b2bae000)
>
> I _guess_ this situation would be mostly fine since libblas has enough
> of the CBLAS symbols to prevent the meson search from locating the wrong
> libcblas next to it.
>
> Quick followup regarding netlib splits. Debian only recently folded
> libcblas into libblas, as
>
> https://lists.debian.org/debian-devel/2019/10/msg00273.html
>
> notes. Not that long ago … esp. considering stable debian. Not sure
> when this appeared. And of course numpy is the point where things were
> broken:
>
> https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=913567
>
> I'm now looking into how Debian actually produces a combined BLAS+CBLAS
> from netlib, as we're using the CMake build system and I do not see an
> option to do that. The upstream build produces separate libraries, so I
> assumed that is a case that one should handle.


Yes, Debian made quite a mess there. We do have a CI job for Netlib on
Debian though in NumPy, and indeed it works fine because of the CBLAS
symbols already being found inside libblas.so.


> But it is a demonstration that any guess that libcblas belongs to
> libblas just from the name may be wrong in real-world installations.
>

Letting this sink in some more, I realized the more fundamental reason for
treating them together: when we express dependencies, we do so for a
*package* (i.e., a packaged version of some project), not for a specific
build output like a shared library or a header file. In this case it's a
little obscured by BLAS being an interface and the libblas/libcblas mix,
but it's still the case that we're looking for multiple installed things
from a single package. So we want "MKL" or "Netlib BLAS", where MKL is not
only a shared library (or set of them), but for example also the
corresponding header file (mkl_cblas.h rather than cblas.h). The situation
you are worrying about is basically that of an unknown package with a set
of shared libraries and headers that have non-standard names. I'd say that
that's then simply a non-supported package, until someone comes to report
the situation and we can add support for it (or file a bug against that
package and convince the authors not to make such a mess).

I think this point is actually important, and I hope you can appreciate it
as a packager - we need to depend on packages (things that have URLs to
source repos, maintainers, etc.), not random library names.


>
> Here, it might be a strange installation remnant.
>
> $ dpkg -L libatlas3-base
> /.
> /usr
> /usr/lib
> /usr/lib/x86_64-linux-gnu
> /usr/lib/x86_64-linux-gnu/atlas
> /usr/lib/x86_64-linux-gnu/atlas/libblas.so.3.10.3
> /usr/lib/x86_64-linux-gnu/atlas/liblapack.so.3.10.3
> /usr/lib/x86_64-linux-gnu/libatlas.so.3.10.3
> /usr/lib/x86_64-linux-gnu/libcblas.so.3.10.3
> /usr/lib/x86_64-linux-gnu/libf77blas.so.3.10.3
> /usr/lib/x86_64-linux-gnu/liblapack_atlas.so.3.10.3
> /usr/share
> /usr/share/doc
> /usr/share/doc/libatlas3-base
> /usr/share/doc/libatlas3-base/README.Debian
> /usr/share/doc/libatlas3-base/changelog.Debian.gz
> /usr/share/doc/libatlas3-base/copyright
> /usr/lib/x86_64-linux-gnu/atlas/libblas.so.3
> /usr/lib/x86_64-linux-gnu/atlas/liblapack.so.3
> /usr/lib/x86_64-linux-gnu/libatlas.so.3
> /usr/lib/x86_64-linux-gnu/libcblas.so.3
>

[Numpy-discussion] NEP 56: array API standard support in the main numpy namespace

2024-01-07 Thread Ralf Gommers
Hi all,

Here is what is probably the second-to-last NEP for NumPy 2.0 (the last one
being the informational summary NEP of all major changes):
https://github.com/numpy/numpy/pull/25542. Full text below.

A lot of the work has been under discussion since the 2.0 developer meeting
in April and has been merged. A few PRs that didn't make sense as
standalone changes without this NEP are still open (see the "NumPy 2.0 API
Changes" label), and there are a couple more that still need to be opened.

For editorial comments on the text, please comment on GitHub. For
significant conceptual/design comments, please post them on this thread.

Cheers,
Ralf



=============================================================
NEP 56 — Array API standard support in NumPy's main namespace
=============================================================

:Author: Ralf Gommers 
:Author: Mateusz Sokół 
:Author: Nathan Goldbaum 
:Status: Draft
:Type: Standards Track
:Created: 2023-12-19
:Resolution: TODO mailing list link


Abstract


This NEP proposes adding full support for the 2022.12 version of the array
API standard in NumPy's main namespace for the 2.0 release.

Motivation and scope


.. note::

The main changes proposed in this NEP were presented in the NumPy 2.0
Developer Meeting in April 2023 (see `here
<https://github.com/numpy/archive/blob/main/2.0_developer_meeting/NumPy_2.0_devmeeting_array_API_adoption.pdf>`__
for presentations from that meeting) and given a thumbs up there. The
majority of the implementation work for NumPy 2.0 has already been merged.
For the rest, PRs are ready - those are mainly the items that are specific
to array API support and we'd probably not consider for inclusion in NumPy
without that context. This NEP will focus on those APIs and PRs in a bit
more detail.

:ref:`NEP47` contains the motivation for adding array API support to NumPy.
This NEP expands on and supersedes NEP 47. The main reason NEP 47 aimed for a
separate ``numpy.array_api`` submodule rather than the main namespace is that
casting rules differed too much. With value-based casting being removed
(:ref:`NEP50`), that will be resolved in NumPy 2.0. Having NumPy be a
superset of the array API standard will be a significant improvement for code
portability to other libraries (CuPy, JAX, PyTorch, etc.) and thereby address
one of the top user requests from the 2020 NumPy user survey [4]_ (GPU
support). See `the numpy.array_api API docs (1.26.x)
<https://numpy.org/doc/1.26/reference/array_api.html#table-of-differences-between-numpy-array-api-and-numpy>`__
for an overview of differences between it and the main namespace (note that
the "strictness" ones are not applicable).

Experiences with ``numpy.array_api``, which is still marked as experimental,
have shown that the separate strict implementation and separate array object
are mostly good for testing purposes, but not for regular usage in downstream
libraries. Having support in the main namespace resolves this issue. Hence
this NEP supersedes NEP 47. The ``numpy.array_api`` module will be moved to a
standalone package, to facilitate easier updates not tied to a NumPy release
cycle.

Some of the key design rules from the array API standard (e.g., output dtypes
predictable from input dtypes, no polymorphic APIs with varying number of
returns controlled by keywords) will also be applied to NumPy functions that
are not part of the array API standard, because those design rules are now
understood to be good practice in general. Those two design rules in
particular make it easier for Numba and other JIT compilers to support NumPy
or NumPy-compatible APIs. We'll note that making existing arguments
positional-only and keyword-only is a good idea for functions added to NumPy
in the future, but will not be done for existing functions since each such
change is a backwards compatibility break and it's not necessary for writing
code that is portable across libraries supporting the standard. An additional
reason to apply those design rules to all functions in the main namespace now
is that it then becomes much easier to deal with potential standardization of
new functions already present in NumPy - those could otherwise be blocked or
forced to use alternative function names due to the need for backwards
compatibility.
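
As a concrete illustration of the "no polymorphic APIs" rule (``np.unique``
is my choice of example here, not one named in the text above): the number of
arrays returned by ``np.unique`` depends on keyword arguments, whereas the
standard defines separate functions with a fixed return, which NumPy 2.0
adds:

import numpy as np

x = np.array([1, 2, 2, 3, 3, 3])

# polymorphic: a keyword changes how many values are returned
values, counts = np.unique(x, return_counts=True)

# array API style: one function per output combination, fixed return
res = np.unique_counts(x)
print(res.values, res.counts)  # [1 2 3] [1 2 3]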

It is important that new functions added to the main namespace integrate well
with the rest of NumPy. So they should for example follow broadcasting and
other rules as expected, and work with all NumPy's dtypes rather than only
the ones in the standard. The same goes for backwards-incompatible changes
(e.g., linear algebra functions need to all support batching in the same way,
and consider the last two axes as matrices). As a result, NumPy should become
more rather than less consistent.

We'll note that one remaining incompatibility will 

[Numpy-discussion] Re: Proposal to accept NEP 55: Add a UTF-8 variable-width string DType to NumPy

2024-01-24 Thread Ralf Gommers
On Wed, Jan 24, 2024 at 10:43 AM Sebastian Berg 
wrote:

> On Mon, 2024-01-22 at 17:08 -0700, Nathan wrote:
> > Hi all,
> >
> > I propose we accept NEP 55 and merge PR #25347 implementing the NEP
> > in time
> > for the NumPy 2.0 RC:
>
>
> I really like this work and I think it is a big improvement!  At this
> point we probably have to expect some things to be still buggy, but
> that is also a reason to get it in (testing is hard if it isn't shipped
> first-class unfortunately).
>

+1 to this. It's seen a ton of hard and careful work for about a year now,
and seems close to as ready as it's going to get pre-merging. So +1 to
accepting the NEP now and hitting the green button on your main PR.

Cheers,
Ralf


Nathan summarized the things I might have brought up very well.  The
> support of missing values is the one thing that to me may end up a bit
> more in flux.
> But I am happy to hope that this is in a way that pandas will not be
> affected and, honestly, without deep integration testing we won't make
> progress in figuring out whether there is some change needed or not.
>
> Thanks for the great work!
>
> - Sebastian
>
>
> >
> > https://numpy.org/neps/nep-0055-string_dtype.html
> > https://github.com/numpy/numpy/pull/25347
> >
> > The most controversial aspect of the NEP was support for missing
> > strings
> > via a user-supplied sentinel object. In the previous discussion on
> > the
> > mailing list, Warren Weckesser argued for shipping a missing data
> > sentinel
> > with NumPy for use with the DType, while in code review and the PR
> > for the
> > NEP, Sebestian expressed concern about the additional complexity of
> > including missing data support at all.
> >
> > I found that supporting missing data is key to efficiently supporting
> > the
> > new DType in Pandas. I think that argues that we need some level of
> > missing
> > data support to fully replace object string arrays. I believe the
> > compromise proposal in the NEP is sufficient for downstream libraries
> > while
> > limiting additional complexity elsewhere in NumPy.
> >
> > Concerns raised in previous discussions about concretely specifying
> > the C
> > API to be made public, preventing use-after-free errors in a
> > multithreaded
> > context, and uncertainty around the arena allocator implementation
> > have
> > been resolved in the latest version of the NEP and the open PR.
> > Additionally, due to some excellent and timely work by Lysandros
> > Nikolaou,
> > we now have a number of string ufuncs in NumPy and a straightforward
> > plan
> > to add more. Loops have been implemented for all the ufuncs added in
> > the
> > NumPy 2.0 dev cycle so far.
> >
> > I would like to see us ship the DType in NumPy 2.0. This will allow
> > us to
> > advertise a major new feature, will spur efforts to support new
> > DTypes in
> > downstream libraries, and will allow us to get feedback from the
> > community
> > that would be difficult to obtain without releasing the code into the
> > wild.
> > Additionally, I am funded via a NASA ROSES grant for work related to
> > this
> > effort until the end of 2024, so including the DType in NumPy 2.0
> > will more
> > efficiently use my funded time to fix issues.
> >
> > If there are no substantive objections to this email, then the NEP
> > will be
> > considered accepted; see NEP 0 for more details:
> > https://numpy.org/neps/nep-0000.html
> > ___
> > NumPy-Discussion mailing list -- numpy-discussion@python.org
> > To unsubscribe send an email to numpy-discussion-le...@python.org
> > https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> > Member address: sebast...@sipsolutions.net
>
>
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@googlemail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] welcome Raghuveer, Chris, Mateusz and Matt to the NumPy maintainers team

2024-01-26 Thread Ralf Gommers
Hi all,

We've got four new NumPy maintainers! Welcome to the team, and
congratulations to:

- Raghuveer Devulapalli (https://github.com/r-devulap)
- Chris Sidebottom (https://github.com/mousius)
- Mateusz Sokół (https://github.com/mtsokol/)
- Matt Haberland (https://github.com/mdhaber)

Raghuveer and Chris have been contributing to the effort on SIMD and
performance optimizations for quite a while now. Mateusz has done a lot of
the heavy lifting on the Python API improvements for NumPy 2.0. And Matt has
been contributing to the test infrastructure and docs.

Thanks to all four of you for the great work to date!

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: donation from the Bloomberg FOSS Contributor Fund

2024-01-31 Thread Ralf Gommers
On Wed, Jan 31, 2024 at 1:34 PM Inessa Pawson  wrote:

> The NumPy Steering Council is excited to announce that we have received a
> $10k grant from the Bloomberg FOSS Contributor Fund. We appreciate
> Bloomberg’s commitment to sustaining critical digital infrastructure and
> thank all the Bloomberg employees who nominated our project for this grant.
>

This is great news! Thanks to Bloomberg for the donation, and thank you
Inessa for leading the communications around accepting and announcing this
donation.

We don't currently have a concrete purpose for this particular donation by
itself, but it certainly helps the project to have funds to spend in a way
we think is most effective. Such funding enabled the creation of a fellowship
last year, and it makes it much easier to make decisions like spending some
money on keeping Cirrus CI running on platforms for which there aren't any
free options.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: ENH: Introducing a pipe Method for Numpy arrays

2024-02-16 Thread Ralf Gommers
On Fri, Feb 16, 2024 at 12:40 AM Marten van Kerkwijk 
wrote:

> > From my experience, calling methods is generally faster than
> > functions. I figure it is due to having less overhead figuring out the
> > input. Maybe it is not significant for large data, but it does make a
> > difference even when working with medium-sized arrays - say float size
> > 5000.
> >
> > %timeit a.sum()
> > 3.17 µs
> > %timeit np.sum(a)
> > 5.18 µs
>
> It is more that np.sum checks if there is a .sum() method and if so
> calls that.  And then `ndarray.sum()` calls `np.add.reduce(array)`.
>

Also note that np.sum does a bunch of work *in pure Python*. Some of that
Python code is really bad too, using `_wrapreduction` which has weird
semantics (trying `getattr(x, 'sum')` for any object) that we could/should
remove and that currently make the function even slower.

The large gap in performance has little to do with functions vs. methods as
such; it's more that the method is implemented in C and doesn't have to defer
to the function, rather than the other way around.
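
To make that dispatch concrete, here is a rough sketch of the pattern (not
NumPy's actual source; ``my_sum`` is a made-up name):

import numpy as np

def my_sum(x):
    # np.sum-style dispatch: probe for a .sum() method on any object,
    # as _wrapreduction's getattr(x, 'sum') check does
    method = getattr(x, 'sum', None)
    if method is not None:
        return method()
    # ndarray.sum() itself ends up calling np.add.reduce
    return np.add.reduce(np.asarray(x))

print(my_sum(np.arange(5000.)))  # 12497500.0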

Cheers,
Ralf



> In [2]: a = np.arange(5000.)
>
> In [3]: %timeit np.sum(a)
> 3.89 µs ± 411 ns per loop (mean ± std. dev. of 7 runs, 100,000 loops each)
>
> In [4]: %timeit a.sum()
> 2.43 µs ± 42 ns per loop (mean ± std. dev. of 7 runs, 100,000 loops each)
>
> In [5]: %timeit np.add.reduce(a)
> 2.33 µs ± 31 ns per loop (mean ± std. dev. of 7 runs, 100,000 loops each)
>
> Though I must admit I'm a bit surprised the excess is *that* large for
> using np.sum...  There may be a little micro-optimization to be found...
>
> -- Marten
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@gmail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Numpy wheels: who maintains the builds and who pays for it?

2024-02-26 Thread Ralf Gommers
On Thu, Feb 22, 2024 at 7:03 PM  wrote:

> Hi folks,
>
> My name is Sean and I'm the author of several GIS packages using Numpy:
> Fiona, Rasterio, and Shapely.


Hi Sean, thanks for this very good question, and for all your work on GIS
packages.


> I've followed Numpy's trail when it comes to wheel building for many years
> and now I'm seeking advice on how to prioritize platforms to support and
> how to pay for the labor and computing that it takes to build wheels and
> maintain the infrastructure over time.


I'm probably best placed to answer your questions, because I've both been
involved in NumPy build & packaging for a long time and am responsible for
overseeing a significant fraction of the funded work on NumPy as well as
coordinating unrestricted funding coming in (mostly via Tidelift, as you
can see at https://opencollective.com/numpy). I'll do my best to accurately
represent the situation for NumPy. Your questions are challenging though,
so if you want a higher-bandwidth conversation I'd be happy to chat. Or we
can use part of a community meeting for this, since I imagine other folks
may be interested in this topic as well.


> Fiona and Rasterio have an order of magnitude more C library dependencies
> than Numpy, via GDAL (https://gdal.org/), which is almost more of an OS
> than a library.
>

Dealing with NumPy's BLAS dependency is already a large amount of work, so
I don't envy your task. PyPI really isn't well-suited to that many C
libraries (as I'm sure you know); for a long time the geospatial stack was
only usable from conda-forge, where packaging is a much easier task. I'm
not sure that was a terrible situation - there are a couple of domains like
that where things just get too challenging. So if you want to do something
much more restricted than NumPy for platforms to support with wheels, that
seems perfectly okay.


> I found a thread in the archive about adding musllinux wheels, but it
> wasn't clear to me how the work gets done, who does, and how it gets paid
> for.



Of all work on NumPy, the funded part has increased steadily. Until ~2016
that fraction was zero, and now a lot of the heavy lifting is funded work -
9 of the top 10 committers over the past 1.5 years get paid for
at least a part of their time spent on NumPy. This is supported in several
ways (partially documented at https://numpy.org/about, but that's a bit out
of date):

1. a number of grants received over the years, from: Moore and Sloan
Foundations (>$1M), the Chan Zuckerberg Initiative (>$1M), and NASA (~$400k)
2. maintainers employed by companies who allow those maintainers to spend
part of their day job time on NumPy:
- Quansight (Matti, Nathan, Rohit, Mateusz, Melissa, me)
- NVIDIA (Sebastian - long-time maintainer, now ~2 years at NVIDIA)
- Intel (Raghuveer, contributor for several years, just gained commit
rights)
- Arm (Chris, contributor for ~1 year, just gained commit rights)
- I'm not sure if I should list Berkeley here too; folks at Berkeley
contributed a lot in the past, not sure if that was all grant-funded or if
there was unrestricted BIDS money to support NumPy.
3. unrestricted project funds, obtained from individual and corporate
(Tidelift (>$100k), Bloomberg ($10k)) donations, which support Sayed's
Developer in Residence position:
https://blog.scientific-python.org/numpy/fellowship-program/.
4. contracts for work on NumPy from clients of Quansight (and maybe other
companies, that is hard to know) that aligned with the NumPy project
roadmap. Noteworthy mentions here for the Sovereign Tech Fund, which
supported packaging-related work (
https://www.sovereigntechfund.de/tech/openblas), and the D. E. Shaw group,
which supported recent work on string ufuncs.

That said, *funding for packaging work is still quite challenging*. While
the above is an impressive list of funding, the vast majority of funders do
care about what they fund, and "keep the package installable" or "do
general maintenance work" typically doesn't do well in grant applications.
Funders have improved in this regard, and the ones mentioned in (1) above
do allow a general maintenance bucket which is some percentage of an
overall grant.

The people who are doing most of the work on packaging and wheel build CI
jobs for NumPy are Matti Picus, Andrew Nelson and myself. Andrew's time is
unfunded, for Matti and me I'd say a significant fraction of the time
working on this topic is also unfunded.

Does NumFOCUS support pay a maintainer to do it?


No, NumFOCUS does the admin for our project funds, but doesn't supply
funding to NumPy. It also doesn't structurally support any other open
source projects with direct funding, with the exception of its Small
Development Grant program - which is meant for smaller one-off projects
(amounts in the $2k - $10k range) rather than part-time or full-time
employment.


> Are Numpy maintainers adding new platform builds as part of their day
> jobs?


In general, no. This has alway

[Numpy-discussion] Re: Need help in numpy building from source on windows.

2024-02-26 Thread Ralf Gommers
On Mon, Feb 26, 2024 at 3:26 PM rajoraganesh--- via NumPy-Discussion <
numpy-discussion@python.org> wrote:

> The detailed problem can be found at -
> https://stackoverflow.com/questions/78059816/issues-in-buildingnumpy-from-source-on-windows


Quick answers:
1. Please don't build 1.26.0, but 1.26.4. The CBLAS detection that you are
hitting an issue with was made more robust in 1.26.2/1.26.3, so the problem
should go away.
2. Please feel free to open an issue on
https://github.com/numpy/numpy/issues for build issues like this. That's
better than the mailing list.

Cheers,
Ralf




> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@gmail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: JSON format for multi-dimensional data

2024-02-27 Thread Ralf Gommers
On Sun, Feb 25, 2024 at 12:34 AM  wrote:

>
> > Perhaps, like the Pandas package, it should live outside NumPy for a
> > while until some wider consensus could emerge.
>
> Regarding this initial remark, this is indeed a possible option but it
> depends on the answer to the question:
>
>   - does Numpy want to have a neutral JSON exchange format to exchange
> data with other frameworks (tabular, multidimensional or other)?
>

I'd say it's unlikely. There are a lot of data storage formats; NumPy has
support for almost none of them, and for the few that we do have support
for (e.g. CSV) the reason for having that inside of NumPy is mostly
historical. There are packages like Zarr, h5py, PyTables, scipy.io that
implement support for reading and writing NumPy arrays in a large number of
I/O formats. Typically there is no reason for such code to live inside
NumPy. I'd expect the same to be true for JSON.

That isn't to say that a new JSON-based storage format wouldn't be of
interest to NumPy users - they may very well need it. We do have docs that
mention popular I/O formats, and if yours gets popular we may want to add
it to those docs:
https://numpy.org/devdocs/user/how-to-io.html#write-or-read-large-arrays
(that could use more detail too).

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: builing numpy on windows

2024-02-27 Thread Ralf Gommers
On Wed, Feb 28, 2024 at 6:54 AM Ganesh Rajora via NumPy-Discussion <
numpy-discussion@python.org> wrote:

> Hi Team,
>
> I am Ganesh, working for an MNC here in India, and I am working on a
> customised Python where I build a set of Python modules with Python.
>
> I do not install them directly from the web; I build everything from
> source code. This is because of security concerns at the organisation.
>
> Along similar lines, I want to get some help from you on building numpy on
> Windows, as I did not find any direct reference on how to do that anywhere
> on numpy's official website. I am facing lots of issues building it. If you
> could help me out with it, or could give me the right point of contact to
> discuss the issue, that would be a great help.
>
> I have posted the issue here on stackoverflow -
>
> Issues in building numpy from source on Windows
> 
>

Hi Ganesh. I answered your previous email two days ago, please have a look
at
https://mail.python.org/archives/list/numpy-discussion@python.org/thread/MVAS7DNLOLF6KKUI3WJJXLYEUMLWBZ7N/
.

Beyond that particular issue, the docs at
https://numpy.org/devdocs/building/index.html are up to date and have
specific info on Windows. You can also look at our Windows CI jobs for logs
to see exactly what tools and versions are used:
https://github.com/numpy/numpy/blob/main/.github/workflows/windows.yml.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: JSON format for multi-dimensional data

2024-02-28 Thread Ralf Gommers
On Tue, Feb 27, 2024 at 3:45 PM  wrote:

> Thanks Ralf,
>
> This answers my question about the absence of I/O Numpy format.
>
> There are three other points related to this format proposal:
>
> - integration of a semantic level above the number / character formats as
> for datetime (e.g. units, point / polygon, URI, email, IP, encoding...),
> - neutral format (platform independent) for multidimensional data
> including multi-variables, axes, indexes and metadata,
> - finally the conversion of tabular data into multi-dimensional data
> (dimension greater than 2) via a neutral format.
>
> Do these points interest Numpy or would this rather concern applications
> built on a Numpy base?
>

I think the latter.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Improved 2DFFT Approach

2024-02-29 Thread Ralf Gommers
On Wed, Feb 28, 2024 at 6:59 AM camrymanjr--- via NumPy-Discussion <
numpy-discussion@python.org> wrote:

> Good day!
>
> My name is Alexander Levin.
>
> My colleague and I did a project on optimisation of the two-dimensional
> Fourier transform algorithm six months ago. We took your implementation of
> numpy fft2d as our quality benchmark.
>

Thanks for sharing Alexander.


> In the course of our research we found out that initially mathematically
> the method uses a far from optimal algorithm. As you may know, your
> operation goes first by rows then by columns applying a one-dimensional
> transformation. After spending some time researching mathematical papers on
> this topic, we found and implemented a new approach, the so-called
> Cooley-Tukey butterfly, optimised by Russian mathematicians in 2016.
>
> The output is a completely new approach, where we immediately apply a
> two-dimensional operation and save a serious proportion of time on it. As a
> result, we wrote a C++ package for Python, using Cython as a wrapper. The
> result was the package and an article on Medium describing the process. On
> tests for matrices ranging in size from 2048x512 to 8192x8192, our
> algorithm outperformed the NumPy transformation by an average of 50% in
> time.
>

Did you also benchmark against `scipy.fft` by any chance? SciPy uses a
newer C++ version of PocketFFT. In NumPy we just switched over from the
older C code of PocketFFT to the newer C++ code:
https://github.com/numpy/numpy/pull/25536. That PR also solves the
upcasting that your Medium article touches on; we now have proper
float32/complex64 support. Note that SciPy also has a multi-threaded
version, which you can enable with the `workers` keyword.
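
For anyone wanting to run that comparison, a quick sketch (array size, dtype
and worker count are arbitrary choices of mine, not from this thread):

import numpy as np
import scipy.fft
from timeit import timeit

x = np.random.default_rng(0).random((2048, 2048), dtype=np.float32)

t_numpy = timeit(lambda: np.fft.fft2(x), number=5) / 5
t_scipy = timeit(lambda: scipy.fft.fft2(x, workers=4), number=5) / 5
print(f"numpy.fft.fft2:             {t_numpy:.3f} s")
print(f"scipy.fft.fft2 (4 workers): {t_scipy:.3f} s")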

It'd also be interesting perhaps to benchmark memory usage, since your
article touches on eliminating temporary arrays.

After discussing this matter with my colleague with whom we did the above
> development, we came to a common desire to share our results with you. Your
> company has been making a huge contribution to the IT community for almost
> 20 years on a pro bono basis. We share your philosophy and so we want to
> make the CS world a better place by providing you with our code to optimise
> the approach in an existing operation in your package.
>

Great to hear that. Note that we're not a company - more a loosely
organized team of mostly volunteers and some folks who get to work on NumPy
as part of their day job at a company or university.


> We would like to offer you our development version, though it'll need some
> minor improvements, we'd love to finish them collaboratively with you and
> hear your thoughts regarding what we have discovered and done so far. We'd
> be honored to help you and become NumPy contributors.
>

Performance improvements are always of interest, so if this is an
algorithmic improvement without loss of accuracy, then that sounds very
relevant. And more contributors are always welcome!

Cheers,
Ralf

I trust that our vision will resonate with you and that you will agree with
> us. I invite you to read our quick article about the process, which I have
> linked below, and our fully functioning package and share your views on our
> contribution to NumPy. We are willing to edit the code to fit your
> standards, as we believe that the best use of our work will be to
> contribute to the development of technology in the world.
>
> Thank you for your time. We look forward to your response and opinion.
>
> Medium Article about the process:
> https://medium.com/p/7963c3b2f3c9
> GitHub Source code Repository:
> https://github.com/2D-FFT-Project/2d-fft
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@googlemail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Updating NEP statuses, and accepting NEP 56

2024-03-03 Thread Ralf Gommers
Hi all,

Each NEP has a status, which should be indicative of the state that the
proposal is in, and that determines in what category it's shown on
https://numpy.org/neps/. We have been neglecting to update status on a fair
number of NEPs for a long time. I thought I'd fix that - see
https://github.com/numpy/numpy/pull/25920.

The first commit marks NEPs 30, 31, 37, and 47 as superseded (by NEP 56),
and NEP 56 as accepted, since it's done and 99.x% implemented. NEP 56,
which itself included the proposal to mark these other NEPs as superseded,
got one reply with a thumbs up for this at
https://mail.python.org/archives/list/numpy-discussion@python.org/thread/IM4E5JHVPA3QSZORKSVPPMLHU52OXEGB/#Z6AA5CL47NHBNEPTFWYOTSUVSRDGHYPN
.

The second commit updates the status of active process NEPs, marks NEP 21
as deferred, and NEP 38 as final:

- We marked process NEPs as Accepted/Final so far, but the intent of
policy/process NEPs that still apply and can be updated when we update our
development processes or project policies is that they are marked as
Active, so they show up under "Meta-NEPs".
- NEP 21 (oindex/vindex) is still considered perhaps a good idea, but
hasn't been touched in many years (it's from 2015), so mark it as Deferred.
- NEP 38 (SIMD) was finished several years ago, so mark it as Final.

Overall this should give a better picture of the current state of
development and thinking. There are only four NEPs left under Draft, and
NEP 50 (type promotion) and NEP 53 (C API changes for 2.0) should be moving to
Accepted soon as well. I chose not to touch those, since for NEP 50 there
is still a fair bit in flight, and NEP 53 seems to need some updates for
what was actually implemented (at a high level, for example a compat header
rather than a separate library). But both are close to being completely
implemented as well, and will ship with NumPy 2.0, so we're pretty sure
that they're accepted by now.

Please comment here or on the PR if you have feedback on this.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: numpy 2.0.x has been branched.

2024-03-09 Thread Ralf Gommers
On Sat, Mar 9, 2024 at 2:03 AM Oscar Benjamin 
wrote:

> On Sat, 9 Mar 2024 at 00:44, Charles R Harris 
> wrote:
> >
> > About a month from now.
>
> What will happen about a month from now? It might seem obvious to you
> but I can interpret this in different ways.
>
> To be clear numpy 2.0 is expected to be released in full to the public
> in about one month's time from today?
>

Let me give the optimistic and pessimistic timelines. Optimistic:

- 2.0.0b1 later today
- 2.0.0rc1 (ABI stable) in 7-10 days
- 2.0.0 final release in 1 month

Pessimistic:

- 2.0.0b1 within a few days
- 2.0.0rc1 (ABI stable) in 2 weeks
- 2.0.0rc2 in 4 weeks
- 2.0.0rc3 in 6 weeks
- 2.0.0 final release in 8 weeks

For projects which have nontrivial usage of the NumPy API (and especially
if they also use the C API), I'd recommend:
1. Check whether things work with 2.0.0b1, ideally asap so if there is
anything we missed we can catch it before rc1. Perhaps do a pre-release of
your own package
2. Do a final release after 2.0.0rc1 - ideally as soon as possible after,
and definitely before the final 2.0.0 release

For (2), note that there are a ton of packages that do not have correct
upper bounds, so if you haven't done your own new release that is
compatible with both 2.0.0 and 1.26.x *before* 2.0.0 comes out, the users
of your project are likely to have a hard time.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Automatic Clipping of array to upper / lower bounds of dtype

2024-03-10 Thread Ralf Gommers
On Sat, Mar 9, 2024 at 11:23 PM Dom Grigonis  wrote:

> Hello,
>
> Can't find answer to this anywhere.
>
> What I would like is to automatically clip the values if they breach the
> bounds.
>
> I have done a simple clipping, and overwritten __iadd__, __isub__,
> __setitem__, …
>
> But I am wandering if there is a specified way to do this. Or maybe at
> least a centralised place exists to do such thing? E.g. Only 1 method to
> override?
>

That centralized method is `__array_wrap__`; a subclass that implements
`__array_wrap__` by applying `np.clip` and then returning self should do
this I think.
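
A minimal sketch of that idea (bounds hardcoded for illustration; the
``return_scalar`` argument is only passed by NumPy 2.x, while older versions
call ``__array_wrap__`` with fewer arguments, which the defaults cover):

import numpy as np

class ClippedArray(np.ndarray):
    lo, hi = 0.0, 1.0  # illustrative bounds, e.g. dtype limits

    def __array_wrap__(self, arr, context=None, return_scalar=False):
        # clip every ufunc result in place, then present it as this class
        np.clip(arr, self.lo, self.hi, out=arr)
        return arr.view(type(self))

a = np.linspace(0.0, 1.0, 5).view(ClippedArray)
print(a + 10)  # every value clipped to hi == 1.0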

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Automatic Clipping of array to upper / lower bounds of dtype

2024-03-10 Thread Ralf Gommers
On Sun, Mar 10, 2024 at 9:14 AM Dom Grigonis  wrote:

> Much thanks!
>
> Another related question while I am at it. It says clip is supposed to be
> faster than np.maximum(np.minimum(arr, max), min). However:
>
> a = np.arange(100)
> %timeit a.clip(4, 20)                       # 8.48 µs
> %timeit np.maximum(np.minimum(a, 20), 4)    # 2.09 µs
>
> Is this expected?
>

Make sure that you're not benchmarking with very small arrays (2 us is on
the order of function call overhead) and that the timings are reproducible.
`clip` is more efficient:

>>> %timeit np.clip(a, 4, 20)
70 µs ± 304 ns per loop (mean ± std. dev. of 7 runs, 10,000 loops each)
>>> %timeit np.clip(a, 4, 20)
72.8 µs ± 161 ns per loop (mean ± std. dev. of 7 runs, 10,000 loops each)
>>> %timeit np.maximum(np.minimum(a, 20), 4)
742 µs ± 8.45 µs per loop (mean ± std. dev. of 7 runs, 1,000 loops each)
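
For reference, a repeatable version of that comparison outside IPython
(array size and bounds are my choices, not from the message above):

import numpy as np
from timeit import repeat

a = np.random.default_rng(0).random(1_000_000)

t_clip = min(repeat(lambda: np.clip(a, 0.25, 0.75), number=100, repeat=5))
t_minmax = min(repeat(lambda: np.maximum(np.minimum(a, 0.75), 0.25),
                      number=100, repeat=5))
print(f"clip:    {t_clip:.3f} s per 100 calls")
print(f"min/max: {t_minmax:.3f} s per 100 calls")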

Ralf



>
> Regards,
> dg
>
>
> On 10 Mar 2024, at 09:59, Ralf Gommers  wrote:
>
>
>
> On Sat, Mar 9, 2024 at 11:23 PM Dom Grigonis 
> wrote:
>
>> Hello,
>>
>> Can't find answer to this anywhere.
>>
>> What I would like is to automatically clip the values if they breach the
>> bounds.
>>
>> I have done a simple clipping, and overwritten __iadd__, __isub__,
>> __setitem__, …
>>
>> But I am wandering if there is a specified way to do this. Or maybe at
>> least a centralised place exists to do such thing? E.g. Only 1 method to
>> override?
>>
>
> That centralized method is `__array_wrap__`; a subclass that implements
> `__array_wrap__` by applying `np.clip` and then returning self should do
> this I think.
>
> Cheers,
> Ralf
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: dom.grigo...@gmail.com
>
>
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@googlemail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Which generation of s390x CPU and how to change it?

2024-03-25 Thread Ralf Gommers
On Mon, Mar 25, 2024 at 2:24 PM Matěj Cepl  wrote:

> Hello,
>
> As a maintainer of Python packages for openSUSE/SUSE,
> I would like to ask for help with our bug
> https://bugzilla.suse.com/1221902. It seems to us that the latest
> version of NumPy suddenly requires z15 CPU generation, although
> it used to be OK with z13+ before, and unfortunately that is the
> level of the CPU support we standardize in openSUSE.
>
> Is it true? If so, when this change happened and why? How
> could we push numpy to work with older processors?


I think that is true.
https://numpy.org/neps/nep-0054-simd-cpp-highway.html#supported-features-targets
mentions support for Z14/Z15, and
https://github.com/numpy/numpy/pull/25563#issuecomment-1889798367 touches
on Z14 support having an issue. So I think Z14 is the minimum.


> Is there anything more to do than just switch off particular features in
> `--cpu-dispatch`?
>

I suspect not, assuming that that works. If Z13 support is an important use
case where the performance difference matters a lot, maybe you can bring up
getting partial or even full support back in NumPy. Most NumPy maintainers
won't know much/anything about s390x though, so that may require
contributions and be nontrivial.

Another, more feasible option would be to auto-disable the SIMD features,
emit a warning, and then continue rather than fail the NumPy build on Z13
by default.
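
For reference, turning the SIMD machinery off at build time looks roughly
like this with NumPy's Meson build (the exact values to pass depend on which
features you want to keep; "min"/"none" shown only as an illustration):

pip install . -Csetup-args=-Dcpu-baseline=min -Csetup-args=-Dcpu-dispatch=none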

Cheers,
Ralf




>
> Thank you for any reply,
>
> Matěj Cepl
>
> --
> http://matej.ceplovi.cz/blog/, @mcepl@floss.social
> GPG Finger: 3C76 A027 CA45 AD70 98B5  BC1D 7920 5802 880B C9D8
>
> I was quite depressed two weeks ago when I spent an afternoon at
> Brentano's Bookshop in New York and was looking at the kind of
> books most people read. Once you see that you lose all hope.
>   -- Friedrich August von Hayek
>
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@gmail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: numpy 2.0.x has been branched.

2024-03-28 Thread Ralf Gommers
On Tue, Mar 26, 2024 at 1:20 AM Charles R Harris 
wrote:

>
>
> On Mon, Mar 25, 2024 at 5:54 PM Oscar Benjamin 
> wrote:
>
>> On Sat, 9 Mar 2024 at 10:16, Ralf Gommers  wrote:
>> >
>> > On Sat, Mar 9, 2024 at 2:03 AM Oscar Benjamin <
>> oscar.j.benja...@gmail.com> wrote:
>> >>
>> >> On Sat, 9 Mar 2024 at 00:44, Charles R Harris <
>> charlesr.har...@gmail.com> wrote:
>> >> >
>> >> > About a month from now.
>> >>
>> >> What will happen about a month from now? It might seem obvious to you
>> >> but I can interpret this in different ways.
>> >>
>> >> To be clear numpy 2.0 is expected to be released in full to the public
>> >> in about one month's time from today?
>> >
>> > Let me give the optimistic and pessimistic timelines. Optimistic:
>> >
>> > - 2.0.0b1 later today
>> > - 2.0.0rc1 (ABI stable) in 7-10 days
>> > - 2.0.0 final release in 1 month
>> >
>> > Pessimistic:
>> >
>> > - 2.0.0b1 within a few days
>> > - 2.0.0rc1 (ABI stable) in 2 weeks
>> > - 2.0.0rc2 in 4 weeks
>> > - 2.0.0rc3 in 6 weeks
>> > - 2.0.0 final release in 8 weeks
>>
>> Thanks Ralf and Chuck. Sorry, I meant to reply to this earlier but got
>> distracted with other things.
>>
>> We are now 16 days into the future so NumPy 2.0 would be 2 weeks time
>> for the optimistic timescale.
>>
>> I assume that now the beta release is out the intention is no further
>> breaking changes. Then if SymPy master is currently compatible with
>> NumPy 2.0.0b1 a good time for a SymPy release with accumulated fixes
>> is... ASAP!
>>
>> Presumably now that NumPy 2.0 is branched downstream projects should
>> test in CI against the prereleases (pip install --pre numpy) rather
>> than the nightly wheels. (SymPy does not usually test against the
>> nightly wheels. I added that for NumPy 2.0 but maybe we should keep
>> it...)
>>
>>
> The rc1 release is now waiting on pybind11, so there is an (uncertain)
> delay.
>

Pybind11 2.12.0 was released earlier today, so everything is unblocked. We
should see a numpy 2.0.0rc1 very soon -
https://github.com/numpy/numpy/pull/26149 is the one PR we still need I
believe.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: numpy 2.0.x has been branched.

2024-03-29 Thread Ralf Gommers
On Fri, Mar 29, 2024 at 2:07 PM Peter Hawkins 
wrote:

> It looks like the pybind11 release is now done (
> https://github.com/pybind/pybind11/releases/tag/v2.12.0)? Any more
> blockers?
>

No more blockers - CI is running on the last backport that we need I
believe, so it's very close. Hours to max a few days away I think.


> (We're eagerly awaiting -rc1 so we can release new wheels for our
> project...)
>

Great!

Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Moving the weekly triage/community meetings

2024-04-08 Thread Ralf Gommers
On Mon, Apr 8, 2024 at 5:37 PM Nathan  wrote:

> That time works for me; I have a conflict with the slot an hour earlier
> than the current time, so hopefully the new time works for everyone.
>

One hour later works for me too.

Cheers,
Ralf

On Sun, Apr 7, 2024 at 8:34 PM Matti Picus  wrote:
>
>> Could we move the weekly community/triage meetings one hour later? Some
>> participants have a permanent conflict, and the current time is
>> inconvenient for my current time zone.
>>
>> Matti
>>
>> ___
>> NumPy-Discussion mailing list -- numpy-discussion@python.org
>> To unsubscribe send an email to numpy-discussion-le...@python.org
>> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
>> Member address: nathan12...@gmail.com
>>
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@gmail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Making show_runtime and show_config enable file output

2024-04-09 Thread Ralf Gommers
On Mon, Apr 8, 2024 at 9:42 PM Matan Addam  wrote:

> Hello,
>
> I find the information printed by the above-mentioned functions to be
> useful for understanding performance context on installed machines, as well
> as variability across machines when troubleshooting. How would the
> maintainers view a pull request adding to those functions the option of
> directing their output to a file?
>
> These functions, at least as built on my machine using numpy 1.24.4, use
> Python's print and pprint for their output, both of which accept an
> argument for redirecting output to a file. Adding this option would enable
> recording the information to files without redirecting all of stdout.
>

`show_config` has a `mode` keyword that you can use to get back a
dictionary instead of printing to stdout. So from there it should be quite
straightforward to write the data in the returned dictionary in JSON or
whatever format you prefer.
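
A small sketch of that route, writing the config info to JSON (requires a
NumPy version with the `mode` keyword, i.e. >= 1.25; the file name is
illustrative):

import json
import numpy as np

config = np.show_config(mode="dicts")  # returns a dict instead of printing
with open("numpy_config.json", "w") as f:
    json.dump(config, f, indent=2, default=str)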

`show_runtime` doesn't have the same keyword yet. Adding that should
address your issue though, in a way that's probably better than adding file
writing to these functions. Maybe open a PR for that instead?

Cheers,
Ralf



>
> What would your position be?
>
> Or are they actually a facade built upon installation by dynamically
> generated code, which yields different function implementations on
> different platforms?
>
> Otherwise, it could be nice to provide access to a dict of the data for
> more general purposes, which would cover all the uses of interest for this
> information.
>
> Kind regards,
> Matan
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@googlemail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Making show_runtime and show_config enable file output

2024-04-10 Thread Ralf Gommers
On Wed, Apr 10, 2024 at 4:18 PM Ganesh Kathiresan 
wrote:

> Thanks for the input, I have raised a PR:
> https://github.com/numpy/numpy/pull/26255. I'll address the UT issues
> soon. Let me know if this is what was required.
>

Thanks Ganesh!
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Please consider dropping Python 3.9 support for Numpy 2.0

2024-05-06 Thread Ralf Gommers
On Mon, May 6, 2024 at 10:42 PM Oscar Benjamin 
wrote:

> On Mon, 6 May 2024 at 19:59, Aaron Meurer  wrote:
> >
> > On Mon, May 6, 2024 at 6:34 AM Mark Harfouche 
> wrote:
> > >
> > > I'm asking that you let Python 3.9 support disappear with 1.26, and
> not "drop a final version" before you decide to move on with 3.10+ only.
> >
> > I don't understand why NumPy supporting Python 3.9 means you have to also.
> > A downstream dependent only has to support at most the versions you
> > do. If NumPy dropped Python 3.9 but you wanted to keep it that would
> > be a problem, but the reverse isn't because scikit-image depends on
> > NumPy, not the other way around.
>
> A downstream package needs to provide updates for all the same Python
> versions as its dependencies because otherwise e.g. a NumPy 2.0
> release for CPython 3.9 breaks dependent packages that no longer
> support 3.9. Those projects then need to add back support for older
> Python versions at the same time as putting out an urgent
> compatibility release. Perhaps usually this is not such an issue but
> particularly for an intentionally compatibility breaking release
> sending it out to more than the usual range of Python versions is not
> helpful for downstream.
>

I think this is an important argument indeed. The initial "follow SPEC 0"
argument is not really relevant here; let's focus on the practical issue at
hand.
When a user types `pip install numpy scikit-image` in a Python 3.9
environment, where numpy 2.0 would support 3.9 and scikit-image 0.23 does
not, then the end result will be that numpy 2.0 and scikit-image 0.22
(incompatible!) will be installed. This is undesirable.

This problem could have been avoided by proper use of upper bounds.
Scikit-image 0.22 not including a `numpy<2.0` upper bound is a bug in
scikit-image (definitely for ABI reasons, and arguably also for API
reasons). It would really be useful if downstream packages started to take
adding upper bounds correctly more seriously. E.g., SciPy has always done
it right, so the failure mode that this thread is about doesn't exist for
SciPy. That said, this ship has sailed for 2.0 - most packages don't have
upper bounds in some or all of their recent releases.

So, I think I'm in favor of dropping Python 3.9 support after all to
prevent problems. It is late in the game, but I do see that we're going to
cause problems for packages that have already dropped 3.9, and I don't yet
see an issue with dropping 3.9 for numpy 2.0 now. Does anyone see a
potential failure mode if we go that way?

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Please consider dropping Python 3.9 support for Numpy 2.0

2024-05-07 Thread Ralf Gommers
On Mon, May 6, 2024 at 11:43 PM Aaron Meurer  wrote:

> On Mon, May 6, 2024 at 3:05 PM Ralf Gommers 
> wrote:
> >
>
> > So, I think I'm in favor of dropping Python 3.9 support after all to
> prevent problems. It is late in the game, but I do see that we're going to
> cause problems for packages that have already dropped 3.9, and I don't yet
> see an issue with dropping 3.9 for numpy 2.0 now. Does anyone see a
> potential failure mode if we go that way?
>
> Presumably dropping 3.9 support at this point would not mean removing
> anything that would actually break NumPy in Python 3.9. It would just
> mean adding the python_requires metadata and not building a 3.9 wheel.
> So if someone really needs a 3.9-compatible NumPy 2.0 they could build
> a wheel manually.
>
> I'm assuming the python_requires metadata is required though since
> otherwise pip would try to build a wheel from source.
>

Yes indeed, correct on all points.
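
For reference, a minimal sketch of that metadata with setuptools (package
name and version are hypothetical):

# setup.py - illustrative only
from setuptools import setup

setup(
    name="example-package",       # hypothetical name
    version="2.0.0",
    python_requires=">=3.10",     # pip won't install this on Python 3.9
)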

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Please consider dropping Python 3.9 support for Numpy 2.0

2024-05-07 Thread Ralf Gommers
On Tue, May 7, 2024 at 7:48 AM Juan Nunez-Iglesias  wrote:

> On Tue, 7 May 2024, at 7:04 AM, Ralf Gommers wrote:
>
> This problem could have been avoided by proper use of upper bounds.
> Scikit-image 0.22 not including a `numpy<2.0` upper bound is a bug in
> scikit-image (definitely for ABI reasons, and arguably also for API
> reasons). It would really be useful if downstream packages started to take
> adding upper bounds correctly more seriously. E.g., SciPy has always done
> it right, so the failure mode that this thread is about doesn't exist for
> SciPy. That said, this ship has sailed for 2.0 - most packages don't have
> upper bounds in some or all of their recent releases.
>
>
> I don't think this is a downstream problem,
>

This is definitely a bug in scikit-image (see below); that is really not up
for discussion here. If you use the NumPy C API, you MUST add a constraint
to stay below the next major version.


> I think this is a "PyPI has immutable metadata" problem.
>

Yes, I'm very much aware:
https://pypackaging-native.github.io/key-issues/pypi_metadata_handling/#impact-of-immutable-metadata.
This just is what it is though, we have to live within the constraints of
how PyPI is designed.


> I'm a big fan of Henry Schreiner's "Should You Use Upper Bound Version
> Constraints" <https://iscinumpy.dev/post/bound-version-constraints/>,
> where he argues convincingly that the answer is almost always no.
>

Emphasis on "almost". Unfortunately, too many maintainers took away the
wrong message from that blog post - probably because it was written in
frustration at Poetry *defaulting* to adding upper bounds, and hence is a
bit too strong on the arguments against. The point was "default to adding
no bounds, only add them if you must" (because on average they do more harm
than good). Translating that to "never add any upper bounds" is wrong and
harmful.

To quote Henry himself from another recent discussion where a maintainer
incorrectly used this blog post to argue against upper bounds (
https://github.com/sympy/sympy/issues/26273#issuecomment-2028192560):

*"You are correct, the post is not meant to say you can't upper cap (except
the Python version, which doesn't even work correctly if you upper cap) if
you've thought about it and have a reason to. It does increase friction,
but if there's a high chance the next version(s) will break you and you are
willing to make releases shortly after the dependency does & keep the cap
up to date, go ahead and cap."*

In this case, it's not just a high chance that things will break, it's
guaranteed to break. NumPy does an import check for ABI mismatch, and will
always raise an exception if you build against 1.x and run with 2.x (or 2.x
and 3.x, etc.). So you must add `<2` when building against numpy 1.x, and
`<3` when building against 2.x. Leaving out the upper bound achieves
exactly nothing except for making things break for the users of your
package.
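
As a concrete sketch of such a pin (package name and lower bound are
illustrative, not taken from this thread):

# setup.py of a package with extension modules built against NumPy 1.x
from setuptools import setup

setup(
    name="example-extension",     # hypothetical name
    version="1.0.0",
    install_requires=[
        # built against the NumPy 1.x C API -> must stay below 2.0
        "numpy>=1.22,<2.0",
    ],
)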

Dropping Py 3.9 will fix the issue for a subset of users, but certainly not
> all users. Someone who pip installs scikit-image==0.22 on Py 3.10 will have
> a broken install. But importantly, they will be able to fix it in user
> space.
>

Yes, this cannot be fixed on the NumPy side either way. But that's a
separate issue. There are three cases we have to think about:

1. Everything unconstrained: `pip install scikit-image` or `pip install
scikit-image numpy`
2. Everything constrained: `pip install scikit-image==0.22 numpy==1.26`  (+
everything else, typically used through a lock file)
3. A mix of constrained and unconstrained: `pip install scikit-image==0.22`
or `pip install scikit-image numpy=2.0` or `pip install scikit-image==0.22
numpy`

(2) is typically robust since lock files are usually generated from
already-working envs. Unless the list of packages is incomplete, and then
it may turn into (3).

(3) is up to the authors of each package, there's nothing we can do in
numpy to make `pip install scikit-image==0.22` work if that version of
scikit-image depends on an unconstrained numpy version.

(1) is what this thread, and the rollout plans for NumPy 2.0, are concerned
about. If we drop Python 3.9, we make that work more reliably. If we don't,
then on Python 3.9 a plain `pip install scikit-image` will be broken.


>
> At any rate, it's not like NumPy (or SciPy, or scikit-image) don't change
> APIs over several minor versions.
>

We have extensive docs on this, and a documented backwards compatibility
policy that we try hard to adhere to:
https://numpy.org/neps/nep-0023-backwards-compatibility.html
https://numpy.org/devdocs/dev/depending_on_numpy.html#runtime-dependency-version-ranges

I'm honestly a little tired of maintainers just ignoring that and
misunderstanding it.

[Numpy-discussion] Re: Please consider dropping Python 3.9 support for Numpy 2.0

2024-05-07 Thread Ralf Gommers
On Tue, May 7, 2024 at 11:44 AM Gael Varoquaux <
gael.varoqu...@normalesup.org> wrote:

> On Tue, May 07, 2024 at 11:31:02AM +0200, Ralf Gommers wrote:
> > make `pip install scikit-image==0.22` work if that version of
> scikit-image depends on an unconstrained numpy version.
>
> Would an option be for the scikit-image maintainers to release a version
> of scikit-image 0.22 (like 0.22.1) with a constrained numpy version?
>

Yes indeed, that would fix this (not for 0.21 of course, but there's only
so much that can be done here).

Doing exactly that is also the first point in the guidance in our 2.0
announcements issue: https://github.com/numpy/numpy/issues/24300. And I
asked specifically for scikit-image to do this months ago:
https://github.com/scikit-image/scikit-image/issues/7282#issuecomment-1885659412
.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Please consider dropping Python 3.9 support for Numpy 2.0

2024-05-10 Thread Ralf Gommers
On Thu, May 9, 2024 at 12:28 AM Thomas Caswell  wrote:

> I think the spirit of NEP 29 is to pick your supported Python's when you
> pick a target release date and you should then stick to it (to avoid "we
> delayed so long we are over a cliff" decisions like this one).
>

That's true I believe.


> We did NEP29 around the same time that Python went from 18 to 12 month
> releases (my memory is that the cadence change was being considered but not
> set).  If Python is targeting an 18mo release cycle, then a 36mo window +
> 2mo delayed Python release could land you in only supporting one version of
> Python which seems too few (again, my memory could be a bit off).  I was
> not at the meeting where SPEC 0 was discussed,
>

Same here, I wasn't at that meeting so am unsure. I only got added to SPEC
0 as a co-author at the end because it's mostly a copy of NEP 29. "change
in release cycle" is probably right as the motivation though.


> but I suspect that the narrowing is to better align with an integer number
> of Python releases while always hitting at least 2 versions of Python.
>

At least 3, right? 36 months is always 3 minor versions.



> 
>
> It is worth considering that CPython has both a concept of "bugfix" (which
> is 18mo) and "security" (which is 42mo and source-only) support.  It is
> arguable that by having "new feature" support and binary release
> targeting a given version of Python for 36mo we are supporting a given
> minor version of Python longer than upstream [speaking for myself I would
> sign onto a proposal to do security release against the last version of our
> libraries for all versions of Python that still have security support].
>
> 
>
> > So only 30% of users have Python 3.10 or newer. Most smaller or newer
> projects can more than double their user base by supporting  3.8+. I could
> even argue that 3.7+ is still helpful for a new project. Once a library is
> large and stable, then it can go higher, even 3.13+ and not hurt anyone
> unless there's a major development.
>
> I have the exact opposite view of Henry here, I think for new projects you
> should be very aggressive and only support the newest Python version as you
> don't have any users (so do not have to be responsible yet)!
>

I think what you do as a new project author totally depends on your goals
and priorities. Either way, the argument that only so few users "have a new
Python" seems a bit off-target. That's not how people think - they look for
new packages to use when they need functionality, and once they find
something that fits their needs, they make it work. Nor are the statistics
reflective of usage needs, especially for manylinux - it's more that
production deployments stay on the same version of Python for their own
lifetime I suspect. But such deployments also pin all their dependencies,
so they're irrelevant to new versions/projects.

It gets ever-easier to install new Python versions, with pyenv/conda/etc.
The "my single Python install comes from python.org and I'm using the same
one because I am afraid to upgrade" is much less of an issue than it was 10
years ago. And it's caused mostly by users having knowledge gaps. So yes it
can be a pain for them, but they'll have to learn at some point anyway.
Same for "my old HPC cluster uses ..." - it's often an even older Python
anyway, and you'll have tons of reasons why you don't want your
cluster-installed stack - learn to use Spack or Conda, and you'll be much
happier in the long run.

---

Back to the topic of dropping 3.9 here: there seem to be some minor
concerns. Tom is right about the spirit of NEP29/SPEC0. And Sebastian is
partially right I think: not about `pip install scikit-image==0.22`, because
that should be picking 0.22.1 plus an older numpy; but `==0.22.0` instead
cannot be fixed anyway, and that's perhaps a more common way of spelling an
`==` constraint. So the last-minute dropping will at most have a marginally
useful impact.

Since Mark has started working on doing a single release of scikit-image
supporting Python 3.9 (
https://github.com/scikit-image/scikit-image/pull/7412), perhaps we can
close the book on this request now, and decide to not change anything?
I.e., we do release with 3.9 support as planned.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Please consider dropping Python 3.9 support for Numpy 2.0

2024-05-13 Thread Ralf Gommers
On Fri, May 10, 2024 at 11:28 PM Brigitta Sipőcz <
b.sipocz+numpyl...@gmail.com> wrote:

> Hi,
>
> Is there any way to know if other large libraries haven't set an upper pin
> in their last release but have since dropped Python version support?
>

This should be doable with either the PyPI JSON API or BigQuery:
https://warehouse.pypa.io/api-reference/json.html
https://packaging.python.org/en/latest/guides/analyzing-pypi-package-downloads/

If someone would want to attempt this, it may be a useful exercise to try
and catch potential issues and contact the maintainers of potentially
affected packages. Help is very much welcome.
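
As a starting point, here is a minimal sketch using only the JSON API. The
package and version below are just an example; `requires_dist` and
`requires_python` are the metadata fields documented in the API reference
linked above:

```python
# Sketch: inspect a release's numpy pins and Python support via the
# public PyPI JSON API (no authentication needed).
import json
import urllib.request

def release_metadata(package, version):
    url = f"https://pypi.org/pypi/{package}/{version}/json"
    with urllib.request.urlopen(url) as resp:
        info = json.load(resp)["info"]
    numpy_reqs = [r for r in (info.get("requires_dist") or [])
                  if r.lower().startswith("numpy")]
    return numpy_reqs, info.get("requires_python")

reqs, pyversions = release_metadata("scikit-image", "0.22.0")
print("numpy requirements:", reqs)     # does any entry have an upper bound?
print("requires-python:", pyversions)  # e.g. '>=3.9'
```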

This may also be a good place for a thank you to Sebastian, John Kirkham,
Clément Robert and @h-vetinari, who have been following up with many
packages in addition to the ones they maintain themselves, filing/resolving
issues, and helping with the 2.0 migration.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Please consider dropping Python 3.9 support for Numpy 2.0

2024-05-13 Thread Ralf Gommers
On Sun, May 12, 2024 at 8:39 PM Gael Varoquaux <
gael.varoqu...@normalesup.org> wrote:

> On Fri, May 10, 2024 at 04:42:49PM +0200, Ralf Gommers wrote:
> > It gets ever-easier to install new Python versions, with
> pyenv/conda/etc. The "my single Python install comes from python.org and
> I'm using the same one because I am afraid to upgrade" is much less of an
> issue than it was 10 years ago. And it's caused mostly by users having
> knowledge gaps. So yes it can be a pain for them, but they'll have to learn
> at some point anyway. Same for "my old HPC cluster uses ..." - it's often
> an even older Python anyway, and you'll have tons of reasons why you don't
> want your cluster-installed stack - learn to use Spack or Conda, and you'll
> be much happier in the long run.
>
> IMHO the view that its a tooling/knowledge gap problem is a bit
> disconnected of users. I meet many users who either
>
> 1. cannot control the Python version they run, or even the base
> environment, because of company culture (decisions at company level on
> these constraints). Maybe upper management in completely misguided here,
> but then upper management must be targeted, and it is not clear at all that
> constraints on packages is the right way to do it, as they typically never
> run any code.
>
> 2. have environments with many dependencies that get in gridlocks of
> dependencies that cannot be mutually satisfied. For my own work it becomes
> increasingly frequent for me to spawn multiple environments that then talk
> to each others (eg via files) to work around these problems.
>
>
> Problem 1 is probably something that user organizations should change, and
> we need to understand why they lock Python versions.


I don't think the problem is what you think it is. It's typically general
IT policies, meaning you cannot do things like download and install random
executables in places like C:\. This is quite common, and it's mostly some
Python installers (python.org ones in particular) falling under a general
policy. It's very rare to have a Python-tooling-specific block - I don't
think I have ever seen something like that. If you can run `pip` and
install to locations under your home dir, for example, then you can almost
always also run pyenv/conda/mamba & co to install different Python
versions. And if you cannot run `pip`, there is nothing to discuss w.r.t.
release policy - new package versions don't matter.

> It could be a QA issue, and this might reveal both a lack of good practices
> for QA (i.e. testing) and the instability of the ecosystems that creates
> a fear of upgrading. We should not be too fast in dismissing these
> organizations as rife with bad practices that could easily be changed, as
> even tech-savvy organizations (such as Google, I believe) run into these
> problems.
>

Google (along with many other large tech companies) uses a monorepo. That
means that there is only a single version of any package. This can be a
great strategy once you're large enough; it has costs too, but certainly
shouldn't be qualified as a bad or uninformed strategy. A common support
window like SPEC 0 is actually helpful there.


> Problem 2 is not a problem solvable by users: it comes from the fact that
> dependency version windows are too narrow. Without broad windows on
> dependencies, the more dependencies one has, the more one gets into an
> impossible situation. For this last reason, I strongly advocate that
> packages, in particular core packages, try hard to be as permissible as
> reasonably possible on dependencies.
>

You don't give any details, but the one place I'm aware of where narrow
support windows are common is when deep learning / CUDA are used. Many deep
learning packages use something like a 3-6 month deprecation policy, and a
6-12 month support window. And then they often depend on one or at most two
versions of CUDA and cuDNN. This makes it very challenging to work with
more than one deep learning framework or use them in combination with other
complex packages like Ray. So yes, when you have lots of dependencies like
that, you will have a hard time.

Given you work with deep learning, I'll assume that that's the issue you
are seeing. I don't think we have evidence that a 3 year support window is
too narrow, and we do understand the cost of longer windows. I'll note also
that SPEC 0 says a minimum of 2 years after initial release (so back in
time), but due to backwards compatibility policies the support also goes at
least 1-1.5 years forward in time. So there's a >=3 year window of support
for feature releases of core packages in the scientific Python ecosystem.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Build NumPy with Debugging Symbols with Meson

2024-05-16 Thread Ralf Gommers
On Wed, May 15, 2024 at 10:56 PM Robert McLeod  wrote:

> Hi everyone,
>
> Is there a gist or similar guide anywhere on the steps required to build
> NumPy with debugging symbols on the Windows platform using the new Meson
> build system? It seems a bit difficult because NumPy seems to expect a
> `conda`-like file structure whereas if you are cloning `cpython` directly
> it is different. In particular `cpython` puts all the Windows files under
> the `PC` and `PCBuild` subdirectories. Also meson doesn't seem to have a
> means to call `python_d.exe` directly which may be causing problems. I've
> ended up building both `python.exe` and `python_d.exe` but run into some
> problem finding Python in `meson.build`.
>

I can think of two ways to point Meson at a custom `python_d.exe` Python
executable: using pkg-config, and using a machine file (
https://mesonbuild.com/Machine-files.html#cross-and-native-file-reference).
meson-python uses the latter approach; you create a file `native-file.ini`
with contents:

```
[binaries]
python = 'C:/path/to/python_d.exe'  # (not sure about \ vs. / here)
```

You can install pkg-config with Chocolatey for example. The Windows build
should produce a .pc file that you can point pkg-config to by setting
PKG_CONFIG_PATH. See how we use it in CI here:
https://github.com/numpy/numpy/blob/main/.github/workflows/windows.yml#L44

I'd try the machine file approach first; I'd expect it to be enough. Not
sure though - I don't think I've seen anyone try this with a Windows debug
build.


> This describes my efforts to date:
>
> # Build CPython with Debugging Symbols on Windows
>
> ```shell
> git clone https://github.com/python/cpython.git
> cd cpython
> git switch 3.11
> PCbuild\get_externals.bat
> # Build the debug version `python_d.exe`
> PCbuild\build.bat -e -d
> # Meson calls `python.exe` and doesn't seem to have an argument to change
> this.
> # We could either build it, or create a symlink
> PCbuild\build.bat
> # ln -s PCbuild/amd64/python_d.exe PCbuild/amd64/python.exe
> ```
>
> The built Python will then be located in
> `/cpython/PCBuild/amd64/python_d.exe`.
>
> ```batch
> set PREFIX=C:/Users/Robert/dev/cpython
> set PATH=%PREFIX%;%PREFIX%/PCBuild/amd64;%PREFIX%/Scripts;%PATH%
> ```
>
> Next we have to install pip:
> https://docs.python.org/3/library/ensurepip.html,
> meson, and cython.
>
> ```shell
> python_d -m ensurepip
> python_d -mpip install meson meson-python ninja cython
> ```
>
> NOTE: produces `pip3.exe`, not `pip`.
>
> # Build NumPy with debug Python
>
> ```shell
> git clone https://github.com/numpy/numpy.git
> cd numpy
> git switch maintenance/1.26.x
> git submodule update --init
> ```
>
> https://mesonbuild.com/meson-python/how-to-guides/debug-builds.html
>
> Next try and build with meson/ninja:
>
> ```shell
> mkdir build-dbg
> cd build-dbg
> meson .. setup --buildtype=debug
> --includedir=C:/Users/Robert/dev/cpython/PC
> --libdir=C:/Users/Robert/dev/cpython/PCbuild/amd64
> ninja
> ninja install
> ```
>

Note that once you get past the "find python_d.exe" hurdle, you will need
to change the `meson` invocation from plain `meson xxx` to `python.exe
./vendored-meson/meson/meson.py xxx`, because our custom extensions in
https://github.com/numpy/numpy/tree/main/vendored-meson are needed for BLAS
and SIMD code.

Cheers,
Ralf



> `meson .. setup <...>` fails and complains that,
>
> '''
> Run-time dependency python found: NO (tried sysconfig)
>
>   ..\meson.build:41:12: ERROR: Python dependency not found
> '''
>
> which corresponds to:
>
> ```meson.build
> py = import('python').find_installation(pure: false)
> py_dep = py.dependency()
> ```
>
> I tried also editing `pyproject.toml` to add the section:
>
> ```toml
> [tool.meson-python.args]
> setup = ['-Dbuildtype=debug',
>  '-Dincludedir=C:/Users/Robert/dev/cpython/PC',
>  '-Dlibdir=C:/Users/Robert/dev/cpython/PCbuild/amd64']
> ```
>
> and then build NumPy with pip using the debug python build:
>
> ```shell
> python_d -mpip install . --no-build-isolation -Cbuild-dir=build-dbg
> ```
>
> But it fails in the same fashion. Searching for issues I find this one,
> but in this case it's likely that something related to the debug Python
> build is the problem.
>
> https://github.com/mesonbuild/meson/issues/12547
>
> Meson installed version is 1.4.0. Any advice would be appreciated. I'm
> happy to write a gist if I can get it working.
>
> --
> Robert McLeod
> robbmcl...@gmail.com
> robert.mcl...@hitachi-hightech.com
>
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@googlemail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com

[Numpy-discussion] Re: ERROR: Module "features" does not exist

2024-05-16 Thread Ralf Gommers
On Thu, May 16, 2024 at 2:18 PM Thomas Mansencal 
wrote:

> Hello,
>
> We have a custom and unusual build environment that is making things a bit
> difficult to transition from the old Setuptools based build process to
> Meson. I'm currently blocked at the CPU feature set detection:
>
> ```
> [tasks] Build command: call vcvarsall amd64 && cd /d "[...]\numpy" && pip
> install -e . --no-build-isolation
> **
> ** Visual Studio 2022 Developer Command Prompt v17.9.6
> ** Copyright (c) 2022 Microsoft Corporation
> **
> [vcvarsall.bat] Environment initialized for: 'x64'
> Defaulting to user installation because normal site-packages is not
> writeable
> Obtaining file:///[...]/numpy
>   Checking if build backend supports build_editable: started
>   Checking if build backend supports build_editable: finished with status
> 'done'
>   Preparing editable metadata (pyproject.toml): started
>   Preparing editable metadata (pyproject.toml): finished with status
> 'error'
>   error: subprocess-exited-with-error
>
>   × Preparing editable metadata (pyproject.toml) did not run successfully.
>   │ exit code: 1
>   ╰─> [24 lines of output]
>   + [...]\meson\Scripts\meson.exe setup [...]\numpy
> [...]\numpy\build\cp311 -Dbuildtype=release -Db_ndebug=if-release
> -Db_vscrt=md
> --native-file=[...]\numpy\build\cp311\meson-python-native-file.ini
>   The Meson build system
>   Version: 1.4.0
>   Source dir: [...]\numpy
>   Build dir: [...]\numpy\build\cp311
>   Build type: native build
>   Project name: NumPy
>   Project version: 1.26.5
>   C compiler for the host machine: cl (msvc 19.39.33523 "Microsoft (R)
> C/C++ Optimizing Compiler Version 19.39.33523 for x64")
>   C linker for the host machine: link link 14.39.33523.0
>   C++ compiler for the host machine: cl (msvc 19.39.33523 "Microsoft
> (R) C/C++ Optimizing Compiler Version 19.39.33523 for x64")
>   C++ linker for the host machine: link link 14.39.33523.0
>   Cython compiler for the host machine:
> [...]\python-3.11\cython\bin\cython.bat (cython 3.0.9)
>   Host machine cpu family: x86_64
>   Host machine cpu: x86_64
>   Program python found: YES (C:\Program Files\Python311\python.exe)
>   Run-time dependency python found: YES 3.11
>   Has header "Python.h" with dependency python: YES
>   Compiler for C supports arguments -fno-strict-aliasing: NO
>   Message: [...]\numpy
>
>   ..\..\meson_cpu\x86\meson.build:3:15: ERROR: Module "features" does
> not exist
>
>   A full log can be found at
> [...]\numpy\build\cp311\meson-logs\meson-log.txt
>   [end of output]
>
>   note: This error originates from a subprocess, and is likely not a
> problem with pip.
> error: metadata-generation-failed
>
> × Encountered error while generating package metadata.
> ╰─> See above for output.
>
> note: This is an issue with the package mentioned above, not pip.
> hint: See above for details.
> 14:27:24 ERRORBuildError: The custom build system failed.
> ```
>
> So the build fails whilst trying to import the "features" module here:
> https://github.com/numpy/numpy/blob/main/meson_cpu/x86/meson.build#L2C1-L3C1
>
> Where is that module meant to be located?
>

That's part of our vendored-meson. You are invoking plain `meson` somehow
(as can be seen from the `Version: 1.4.0` logging output), which doesn't
have that. Invoke `python ./vendored-meson/meson/meson.py` instead.

Cheers,
Ralf



>
> Cheers,
>
> Thomas
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@googlemail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] ANN: NumPy 2.0 release date: 16 June 2024

2024-05-23 Thread Ralf Gommers
Hi all,

In yesterday's community meeting we discussed the 2.0 release schedule, and
concluded that we are far enough along to pick a date for the final 2.0
release. That date will be June 16 (barring any major last-minute blockers
appearing of course).

A quick timeline:
- `maintenance/2.0.x` branch creation: March 8
- beta 1: March 12
- release candidate 1: March 30
- release candidate 2: May 12
- as of today, 37 out of 58 of the downstream packages we're currently
tracking at https://github.com/numpy/numpy/issues/26191 have done a
2.0-compatible release (including all of the ones lowest in the dependency
tree that we were tracking early on), and many of the others have a release
planned or 2.0 fixes merged or close to being complete.

We've had 2.5 months of the release process, and very few blocking issues
have been opened over the last weeks. Another 3.5 weeks should be enough
for most package authors who still have work to do for their own releases,
as well as for end users to test their code and either fix potential issues
or temporarily add `numpy<2` constraints in the relevant places.

In addition to this announcement email and the notice on
https://github.com/numpy/numpy/issues/24300, we will update the numpy.org
website and use our Twitter and LinkedIn project accounts to announce the
release date.

Thank you to everyone who helped us get so close to the finish line!

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] updating the NumPy roadmap

2024-05-24 Thread Ralf Gommers
Hi all,

Now that the dust has settled on what we're including in the NumPy 2.0
release, it felt like a good time to update the project roadmap. After a
few discussions with other maintainers I opened
https://github.com/numpy/numpy/pull/26505. Part of it is a regular
maintenance update: remove what we've implemented or decided not to do, and
rewrite existing roadmap entries to reflect their current state.

The other part is adding some new items. In particular:
1. Under the documentation section, add that we plan to make all example
code interactive via jupyterlite-sphinx
2. Under the "platform support" section, add that we aim to start
supporting free-threaded CPython, and plan to better define platform
support tiers
3. A new section "binary size reduction"
4. A new section "NumPy 2.0 stabilization & downstream usage"
5. A new section "Security" (focused on supply chain security)

If you are interested in what will end up on the roadmap, please do review
that PR. For major topics, this thread can be used. New ideas are of course
welcome, in particular from NumPy team members and from contributors who
plan to work on something large enough that it should be represented on the
roadmap.

As a reminder: the roadmap has no dates, and not everything on it is
guaranteed to materialize. The first sentences of the roadmap explain the
purposes: "This is a live snapshot of tasks and features we will be
investing resources in. It may be used to encourage and inspire developers
and to search for funding."

We plan to keep the PR open for at least 10 days from now, or until
discussion settles.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Policy on AI-generated code

2024-07-04 Thread Ralf Gommers
On Thu, Jul 4, 2024 at 12:55 PM Matthew Brett 
wrote:

> Sorry - reposting from my subscribed address:
>
> Hi,
>
> Sorry to top-post!  But - I wanted to bring the discussion back to
> licensing.  I have great sympathy for the ecological and code-quality
> concerns, but licensing is a separate question, and, it seems to me,
> an urgent question.
>
> Imagine I asked some AI to give me code to replicate a particular
> algorithm A.
>
> It is perfectly possible that the AI will largely or completely
> reproduce some existing GPL code for A, from its training data.  There
> is no way that I could know that the AI has done that without some
> substantial research.  Surely, this is a license violation of the GPL
> code?   Let's say we accept that code.  Others pick up the code and
> modify it for other algorithms.  The code-base gets infected with GPL
> code, in a way that will make it very difficult to disentangle.
>

This is a question that's topical for all of open source, and usages of
CoPilot & co. We're not going to come to any insightful answer here that is
specific to NumPy. There's a ton of discussion in a lot of places; someone
needs to research/summarize that to move this forward. Debating it from
scratch here is unlikely to yield new arguments imho.

I agree with Rohit's: "it is probably hopeless to enforce a ban on AI
generated content". There are good ways to use AI code assistant tools and
bad ones; we in general cannot know whether AI tools were used at all by a
contributor (just like we can't know whether something was copied from
Stack Overflow), nor whether when it's done the content is derived enough
to fall under some other license. The best we can do here is add a warning
to the contributing docs and PR template about this, saying that the
contributor needs to be the author, so copied or AI-generated content must
not contain things that are complex enough to be copyrightable (none of the
linked PRs come close to this threshold).


> Have we consulted a copyright lawyer on this?   Specifically, have we
> consulted someone who advocates the GPL?
>

Not that I know of.

Cheers,
Ralf



> Cheers,
>
> Matthew
>
> On Thu, Jul 4, 2024 at 11:27 AM Marten van Kerkwijk
>  wrote:
> >
> > Hi All,
> >
> > I agree with Dan that the actual contributions to the documentation are
> > of little value: it is not easy to write good documentation, with
> > examples that show not just the mechanics but the purpose of the
> > function, i.e., go well beyond just showing some random inputs and
> > outputs.  And poorly constructed examples are detrimental in that they
> > just hide the fact that the documentation is bad.
> >
> > I also second his worries about ecological and social costs.
> >
> > But let me add a third issue: the costs to maintainers.  I had a quick
> > glance at some of those PRs when they were first posted, but basically
> > decided they were not worth my time to review.  For a human contributor,
> > I might well have decided differently, since helping someone to improve
> > their contribution often leads to higher quality further contributions.
> > But here there seems to be no such hope.
> >
> > All the best,
> >
> > Marten
> >
> > Daniele Nicolodi  writes:
> >
> > > On 03/07/24 23:40, Matthew Brett wrote:
> > >> Hi,
> > >>
> > >> We recently got a set of well-labeled PRs containing (reviewed)
> > >> AI-generated code:
> > >>
> > >> https://github.com/numpy/numpy/pull/26827
> > >> https://github.com/numpy/numpy/pull/26828
> > >> https://github.com/numpy/numpy/pull/26829
> > >> https://github.com/numpy/numpy/pull/26830
> > >> https://github.com/numpy/numpy/pull/26831
> > >>
> > >> Do we have a policy on AI-generated code?   It seems to me that
> > >> AI-code in general must be a license risk, as the AI may well generate
> > >> code that was derived from, for example, code with a GPL-license.
> > >
> > > There is definitely the issue of copyright to keep in mind, but I see
> > > two other issues: the quality of the contributions and one moral issue.
> > >
> > > IMHO the PRs linked above are not high quality contributions: for
> > > example, the added examples are often redundant with each other. In my
> > > experience these are representative of automatically generated content:
> > > as there is little to no effort involved in writing it, the content is
> > > often repetitive and with very low information density. In the case of
> > > documentation, I find this very detrimental to the overall quality.
> > >
> > > Contributions generated with AI have huge ecological and social costs.
> > > Encouraging AI generated contributions, especially where there is
> > > absolutely no need to involve AI to get to the solution, as in the
> > > examples above, makes the project co-responsible for these costs.
> > >
> > > Cheers,
> > > Dan
> > >
> > > ___
> > > NumPy-Discussion mailing list -- numpy-discussion@python.org
> > > To unsubscribe send an email to numpy-discussion-le...@python.org

[Numpy-discussion] Re: Policy on AI-generated code

2024-07-04 Thread Ralf Gommers
On Thu, Jul 4, 2024 at 1:34 PM Matthew Brett 
wrote:

> Hi,
>
> On Thu, Jul 4, 2024 at 12:20 PM Ralf Gommers 
> wrote:
> >
> >
> >
> > On Thu, Jul 4, 2024 at 12:55 PM Matthew Brett 
> wrote:
> >>
> >> Sorry - reposting from my subscribed address:
> >>
> >> Hi,
> >>
> >> Sorry to top-post!  But - I wanted to bring the discussion back to
> >> licensing.  I have great sympathy for the ecological and code-quality
> >> concerns, but licensing is a separate question, and, it seems to me,
> >> an urgent question.
> >>
> >> Imagine I asked some AI to give me code to replicate a particular
> algorithm A.
> >>
> >> It is perfectly possible that the AI will largely or completely
> >> reproduce some existing GPL code for A, from its training data.  There
> >> is no way that I could know that the AI has done that without some
> >> substantial research.  Surely, this is a license violation of the GPL
> >> code?   Let's say we accept that code.  Others pick up the code and
> >> modify it for other algorithms.  The code-base gets infected with GPL
> >> code, in a way that will make it very difficult to disentangle.
> >
> >
> > This is a question that's topical for all of open source, and usages of
> CoPilot & co. We're not going to come to any insightful answer here that is
> specific to NumPy. There's a ton of discussion in a lot of places; someone
> needs to research/summarize that to move this forward. Debating it from
> scratch here is unlikely to yield new arguments imho.
>
> Right - I wasn't expecting a detailed discussion on the merits - only
> some thoughts on policy for now.
>
> > I agree with Rohit's: "it is probably hopeless to enforce a ban on AI
> generated content". There are good ways to use AI code assistant tools and
> bad ones; we in general cannot know whether AI tools were used at all by a
> contributor (just like we can't know whether something was copied from
> Stack Overflow), nor whether when it's done the content is derived enough
> to fall under some other license. The best we can do here is add a warning
> to the contributing docs and PR template about this, saying the contributor
> needs to be the author so copied or AI-generated content needs to not
> contain things that are complex enough to be copyrightable (none of the
> linked PRs come close to this threshold).
>
> Yes, these PRs are not the concern - but I believe we do need to plan
> now for the future.
>
> I agree it is hard to enforce, but it seems to me it would be a
> reasonable defensive move to say - for now - that authors will need to
> take full responsibility for copyright, and that, as of now,
> AI-generated code cannot meet that standard, so we require authors to
> turn off AI-generation when writing code for Numpy.
>

I don't think that that is any more reasonable than asking contributors to
not look at Stack Overflow at all, or to not look at any other code base
for any reason. I bet many contributors may not even know whether the
auto-complete functionality in their IDE comes from a regular language
server (see https://langserver.org/) or an AI-enhanced one.

I think the two options are:
(A) do nothing yet, wait until the tools mature to the point where they can
actually do what you're worrying about here (at which point there may be
more insight/experience in the open source community about how to deal with
the problem).
(B) add a note along the lines I suggested as an option above ("... not
contain things that are complex enough to be copyrightable ...")

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Policy on AI-generated code

2024-07-04 Thread Ralf Gommers
On Thu, Jul 4, 2024 at 5:08 PM Matthew Brett 
wrote:

> Hi,
>
> On Thu, Jul 4, 2024 at 3:41 PM Ralf Gommers 
> wrote:
> >
> >
> >
> > On Thu, Jul 4, 2024 at 1:34 PM Matthew Brett 
> wrote:
> >>
> >> Hi,
> >>
> >> On Thu, Jul 4, 2024 at 12:20 PM Ralf Gommers 
> wrote:
> >> >
> >> >
> >> >
> >> > On Thu, Jul 4, 2024 at 12:55 PM Matthew Brett <
> matthew.br...@gmail.com> wrote:
> >> >>
> >> >> Sorry - reposting from my subscribed address:
> >> >>
> >> >> Hi,
> >> >>
> >> >> Sorry to top-post!  But - I wanted to bring the discussion back to
> >> >> licensing.  I have great sympathy for the ecological and code-quality
> >> >> concerns, but licensing is a separate question, and, it seems to me,
> >> >> an urgent question.
> >> >>
> >> >> Imagine I asked some AI to give me code to replicate a particular
> algorithm A.
> >> >>
> >> >> It is perfectly possible that the AI will largely or completely
> >> >> reproduce some existing GPL code for A, from its training data.
> There
> >> >> is no way that I could know that the AI has done that without some
> >> >> substantial research.  Surely, this is a license violation of the GPL
> >> >> code?   Let's say we accept that code.  Others pick up the code and
> >> >> modify it for other algorithms.  The code-base gets infected with GPL
> >> >> code, in a way that will make it very difficult to disentangle.
> >> >
> >> >
> >> > This is a question that's topical for all of open source, and usages
> of CoPilot & co. We're not going to come to any insightful answer here that
> is specific to NumPy. There's a ton of discussion in a lot of places;
> someone needs to research/summarize that to move this forward. Debating it
> from scratch here is unlikely to yield new arguments imho.
> >>
> >> Right - I wasn't expecting a detailed discussion on the merits - only
> >> some thoughts on policy for now.
> >>
> >> > I agree with Rohit's: "it is probably hopeless to enforce a ban on AI
> generated content". There are good ways to use AI code assistant tools and
> bad ones; we in general cannot know whether AI tools were used at all by a
> contributor (just like we can't know whether something was copied from
> Stack Overflow), nor whether when it's done the content is derived enough
> to fall under some other license. The best we can do here is add a warning
> to the contributing docs and PR template about this, saying the contributor
> needs to be the author so copied or AI-generated content needs to not
> contain things that are complex enough to be copyrightable (none of the
> linked PRs come close to this threshold).
> >>
> >> Yes, these PRs are not the concern - but I believe we do need to plan
> >> now for the future.
> >>
> >> I agree it is hard to enforce, but it seems to me it would be a
> >> reasonable defensive move to say - for now - that authors will need to
> >> take full responsibility for copyright, and that, as of now,
> >> AI-generated code cannot meet that standard, so we require authors to
> >> turn off AI-generation when writing code for Numpy.
> >
> >
> > I don't think that that is any more reasonable than asking contributors
> to not look at Stack Overflow at all, or to not look at any other code base
> for any reason. I bet many contributors may not even know whether the
> auto-complete functionality in their IDE comes from a regular language
> server (see https://langserver.org/) or an AI-enhanced one.
> >
> > I think the two options are:
> > (A) do nothing yet, wait until the tools mature to the point where they
> can actually do what you're worrying about here (at which point there may
> be more insight/experience in the open source community about how to deal
> with the problem.
>
> Have we any reason to think that the tools are not doing this now?


Yes, namely that tools aren't capable yet of generating the type of code
that would land in NumPy. And if it's literal code from some other project
for the few things that are standard (e.g., C/C++ code for a sorting
algorithm), we'd judge anyway whether it was authored by the PR submitter
or not (I've caught many issues like that with large PRs from new
contributors, e.g. translating from Matlab code directly).


> I ran one of my exercises throug

[Numpy-discussion] Re: Policy on AI-generated code

2024-07-04 Thread Ralf Gommers
On Thu, Jul 4, 2024 at 8:42 PM Matthew Brett 
wrote:

> Hi,
>
> On Thu, Jul 4, 2024 at 6:44 PM Ralf Gommers 
> wrote:
> >
> >
> >
> > On Thu, Jul 4, 2024 at 5:08 PM Matthew Brett 
> wrote:
> >>
> >> Hi,
> >>
> >> On Thu, Jul 4, 2024 at 3:41 PM Ralf Gommers 
> wrote:
> >> >
> >> >
> >> >
> >> > On Thu, Jul 4, 2024 at 1:34 PM Matthew Brett 
> wrote:
> >> >>
> >> >> Hi,
> >> >>
> >> >> On Thu, Jul 4, 2024 at 12:20 PM Ralf Gommers 
> wrote:
> >> >> >
> >> >> >
> >> >> >
> >> >> > On Thu, Jul 4, 2024 at 12:55 PM Matthew Brett <
> matthew.br...@gmail.com> wrote:
> >> >> >>
> >> >> >> Sorry - reposting from my subscribed address:
> >> >> >>
> >> >> >> Hi,
> >> >> >>
> >> >> >> Sorry to top-post!  But - I wanted to bring the discussion back to
> >> >> >> licensing.  I have great sympathy for the ecological and
> code-quality
> >> >> >> concerns, but licensing is a separate question, and, it seems to
> me,
> >> >> >> an urgent question.
> >> >> >>
> >> >> >> Imagine I asked some AI to give me code to replicate a particular
> algorithm A.
> >> >> >>
> >> >> >> It is perfectly possible that the AI will largely or completely
> >> >> >> reproduce some existing GPL code for A, from its training data.
> There
> >> >> >> is no way that I could know that the AI has done that without some
> >> >> >> substantial research.  Surely, this is a license violation of the
> GPL
> >> >> >> code?   Let's say we accept that code.  Others pick up the code
> and
> >> >> >> modify it for other algorithms.  The code-base gets infected with
> GPL
> >> >> >> code, in a way that will make it very difficult to disentangle.
> >> >> >
> >> >> >
> >> >> > This is a question that's topical for all of open source, and
> usages of CoPilot & co. We're not going to come to any insightful answer
> here that is specific to NumPy. There's a ton of discussion in a lot of
> places; someone needs to research/summarize that to move this forward.
> Debating it from scratch here is unlikely to yield new arguments imho.
> >> >>
> >> >> Right - I wasn't expecting a detailed discussion on the merits - only
> >> >> some thoughts on policy for now.
> >> >>
> >> >> > I agree with Rohit's: "it is probably hopeless to enforce a ban on
> AI generated content". There are good ways to use AI code assistant tools
> and bad ones; we in general cannot know whether AI tools were used at all
> by a contributor (just like we can't know whether something was copied from
> Stack Overflow), nor whether when it's done the content is derived enough
> to fall under some other license. The best we can do here is add a warning
> to the contributing docs and PR template about this, saying the contributor
> needs to be the author so copied or AI-generated content needs to not
> contain things that are complex enough to be copyrightable (none of the
> linked PRs come close to this threshold).
> >> >>
> >> >> Yes, these PRs are not the concern - but I believe we do need to plan
> >> >> now for the future.
> >> >>
> >> >> I agree it is hard to enforce, but it seems to me it would be a
> >> >> reasonable defensive move to say - for now - that authors will need
> to
> >> >> take full responsibility for copyright, and that, as of now,
> >> >> AI-generated code cannot meet that standard, so we require authors to
> >> >> turn off AI-generation when writing code for Numpy.
> >> >
> >> >
> >> > I don't think that that is any more reasonable than asking
> contributors to not look at Stack Overflow at all, or to not look at any
> other code base for any reason. I bet many contributors may not even know
> whether the auto-complete functionality in their IDE comes from a regular
> language server (see https://langserver.org/) or an AI-enhanced one.
> >> >
> >> > I think the two options are:
> >> > (A) do nothing yet, wait until the tools mature to the point where
> they ca

[Numpy-discussion] Re: Windows 11 arm64 wheel

2024-07-11 Thread Ralf Gommers
On Thu, Jul 11, 2024 at 2:20 PM Andrew Nelson  wrote:

>
>
> On Thu, Jul 11, 2024, 22:08 slobodan.miletic--- via NumPy-Discussion <
> numpy-discussion@python.org> wrote:
>
>> Hi,
>>
>> I am writing on behalf of my team at Endava. Our task is to work with
>> the open-source community and set up multiple applications for use on
>> win11 arm64 machines.
>> One of the tasks is to understand the problems and, if possible, help the
>> NumPy team in setting up a win11 arm64 compatible wheel on PyPI.
>> I saw that there were some earlier discussions about this subject, and as
>> I understood it, the problem was that at the time this configuration was
>> not so common and there were no appropriate CI runners to create the wheel.
>> Today there are more devices in this configuration available on the
>> market and more are announced, so I wanted to check if there are
>> plans for creating this wheel, and if we can somehow help with that work?
>>
>
Thanks for offering help Slobodan! Contributions here are welcome - and
it's hopefully not too difficult if you have a dev setup for Windows on Arm
locally.


> This is discussed in detail in https://github.com/numpy/numpy/issues/22530.
> TLDR - we need CI providers to offer win arm before it's sensible to make
> wheels. Firstly because we need to make them (less secure to upload wheels
> not made on CI infra), secondly because we need to test them. If less
> tested wheels give rise to lots of issues reported by the community, then
> this generates large maintainer workload.
>

A good start to get this rolling could be:
1. Address open issues for Windows on Arm. There's only one right now (
https://github.com/numpy/numpy/issues/25626) - we'd be happy to accept a PR
for that, we don't need CI to validate it.
2. Setting up a single CI job that builds an arm64 wheel on x86-64 (build
only, no test - unless emulation works). gh-22530 has comments saying it
should work.

Once those things are done, we can consider uploading wheels to
https://anaconda.org/scientific-python-nightly-wheels/numpy/.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: PR - can I get a new review?

2024-07-23 Thread Ralf Gommers
On Mon, Jul 22, 2024 at 3:36 AM Jake S. 
wrote:

> Hi all, it's been a couple of months, so I'm asking for a review again
> of my typing PR. Original
> reviewer requested a useful additional feature, but hasn't been able to
> provide feedback since then.  The feature turned out to be problematic and
> fragile without a PEP on TypeVarTuple covariance/bounds.  I've needed a
> decision on whether I should (a) do the feature, leaving out difficult
> tests until the feature is supported by python, and note this in the docs,
> (b) remove that feature from the PR (my choice),
>

You and Joren both seem to be in favor of (b), so going with that is the
way to go.


> or (c) commit to a compromise that will be non-ideal when python *does*
> support the feature.
>
> I'd like to know the path forward from someone with approval authority; I
> don't want to keep troubleshooting and maintaining the PR without a way
> forward.
>

Thanks for bringing this up here Jake, and for all your work on this PR. I
hope we've just resolved this issue (
https://github.com/numpy/numpy/pull/26081#issuecomment-2246283338).

Cheers,
Ralf



>
> Thanks,
> Jake
>
>
> Jake Stevens-Haas
> Ph.D. Candidate
> Applied Mathematics
> University of Washington
> +1-(908)-462-4196
> jacob.stevens.h...@gmail.com
> j...@uw.edu
>
>
>
> On Tue, May 14, 2024 at 2:26 PM Jake S. 
> wrote:
>
>> Ah, thanks Nathan!  Most of them resolved, but one still fails (macOS
>> x86-64 conda).
>> The spin build gives Error: 'types.SimpleNamespace' object has no
>> attribute 'dir_info'; aborting.  Not sure if that relates to handling
>> the gcov argument, or whether it will block the PR?
>>
>> - Jake
>>
>> Jake Stevens-Haas
>> Ph.D. Candidate
>> Applied Mathematics
>> University of Washington
>> +1-(908)-462-4196
>> jacob.stevens.h...@gmail.com
>> j...@uw.edu
>>
>>
>>
>> On Tue, May 7, 2024 at 11:53 AM Nathan  wrote:
>>
>>> I think most of the build failures you’re seeing would be fixed by
>>> merging with or rebasing on the latest main branch.
>>>
>>> Note that there is currently an issue with some of the windows CI
>>> runners, so you’ll see failures related to our spin configuration failing
>>> to handle a gcov argument that was added in spin 0.9 released a couple days
>>> ago.
>>>
>>> On Mon, May 6, 2024 at 8:48 PM Jake S. 
>>> wrote:
>>>
 Hi community,

 PR 26081 is about making
 numpy's ShapeType covariant and bound to a tuple of ints. The community
 has requested this occasionally in issue 16544. I'm reaching out via
 the listserv because it's been a few months, and I don't want it to get too
 stale.  I could really use some help pushing it over the finish line.

 Summary:
 Two numpy reviewers and one interested community member reviewed the PR
 and asked for a type alias akin to npt.NDArray that allowed shape.  I
 worked through the issues with TypeVarTuple and made npt.Array, and it was
 fragile, but passing CI.  After a few months passed, I returned to fix the
 fragility in the hopes of getting some more attention, but now it fails CI
 in some odd builds (passes the mypy bit).  I have no idea how to get these
 to pass, as they appear unrelated to anything I've worked on (OpenBLAS on
 windows, freeBSD...?).

 Thanks,
 Jake
 ___
 NumPy-Discussion mailing list -- numpy-discussion@python.org
 To unsubscribe send an email to numpy-discussion-le...@python.org
 https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
 Member address: nathan12...@gmail.com

>>> ___
>>> NumPy-Discussion mailing list -- numpy-discussion@python.org
>>> To unsubscribe send an email to numpy-discussion-le...@python.org
>>> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
>>> Member address: jacob.stevens.h...@gmail.com
>>>
>> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@googlemail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Musllinux image bump to 1_2

2024-08-11 Thread Ralf Gommers
On Sun, Aug 11, 2024 at 1:29 PM Andrew Nelson  wrote:

> Dear list,
> PR https://github.com/numpy/numpy/pull/27088 contains changes to bump the
> musllinux image to 1_2 from 1_1. This is because the 1_1 image is end of
> life (https://github.com/pypa/manylinux/issues/1629)
>
> At the moment the intent is to make this change after the 2.1 release,
> i.e. the 2.2 release.
>

Thanks Andrew. That sounds right to me. Now that the 2.1.x branch has been
created, I think the PR can be merged.

Cheers,
Ralf
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] Re: Windows 11 arm64 wheel

2024-08-12 Thread Ralf Gommers
On Mon, Aug 12, 2024 at 12:23 PM Matti Picus  wrote:

> On 12/08/2024 11:55, slobodan.miletic--- via NumPy-Discussion wrote:
>
> > As the bug fix is merged we are starting the investigation on the CI job.
> > I have a few questions about this:
> > 1) Are there some additional instructions for making and running the
> numpy CI jobs and cross compilation available in the documentation?
> > 2) Do we need to have arm64 scipy-openblas released to build numpy wheel?
> > ___
> > NumPy-Discussion mailing list -- numpy-discussion@python.org
> > To unsubscribe send an email to numpy-discussion-le...@python.org
> > https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> > Member address: matti.pi...@gmail.com
>
>
> 1. You can look at .github/workflows/linux_qemu.yml, which uses qemu/docker
> to run.  AFAIK there is no documentation about cross-compilation.
>

There is: https://numpy.org/devdocs/building/cross_compilation.html

https://cibuildwheel.pypa.io/en/stable/faq/#windows-arm64 says that
cibuildwheel only supports `setuptools` and `setuptools-rust` so far, so
that's not particularly helpful yet. Neither is the `linux_qemu.yml`
example, I think - that's the furthest away from what we need (unless
I missed something and QEMU or Docker actually supports Windows on Arm).

I'm not sure if this should go the `crossenv` way (see
https://github.com/benfogle/crossenv), or a regular cross compile with a
cross file and build/host envs. I think the latter won't quite work until
PEP 739 (https://peps.python.org/pep-0739/) lands.


2. It would be preferable to build a scipy-openblas wheel but we could
> use a "vanilla" build of OpenBLAS as well. We do expect to use 64-bit
> interfaces, but scipy does not do so yet, so we would need two builds. I
> don't know how @gholke compiles OpenBLAS/NumPy and where there is a
> gfortran for windows arm64, maybe worthwhile asking?
>

I recommend getting step (1) to work first, without any BLAS library. That
may be challenging enough - if there's a blocker, then there is no point
looking into cross-compiling OpenBLAS before native arm64 CI runners become
available.

Cheers,
Ralf


> Matti
>
> ___
> NumPy-Discussion mailing list -- numpy-discussion@python.org
> To unsubscribe send an email to numpy-discussion-le...@python.org
> https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
> Member address: ralf.gomm...@gmail.com
>
___
NumPy-Discussion mailing list -- numpy-discussion@python.org
To unsubscribe send an email to numpy-discussion-le...@python.org
https://mail.python.org/mailman3/lists/numpy-discussion.python.org/
Member address: arch...@mail-archive.com


[Numpy-discussion] mailing list moved

2017-03-24 Thread Ralf Gommers
Hi all,

This mailing list moved, it's now numpy-discussion@python.org.

Please update the stored address in the contacts list of your email client
- messages sent to @scipy.org will not arrive anymore.

Cheers,
Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] migration of all scipy.org mailing lists

2017-03-26 Thread Ralf Gommers
On Fri, Mar 24, 2017 at 9:57 PM, Ralf Gommers 
wrote:


> On Fri, Mar 24, 2017 at 12:27 AM, Neal Becker  wrote:
>
>> Ralf Gommers wrote:
>>
>> > On Thu, Mar 23, 2017 at 12:18 AM, Neal Becker 
>> wrote:
>> >
>> >> Has anyone taken care of notifying gmane about this?
>> >>
>> >
>> > We will have to update this info in quite a few places after the move is
>> > done. Including Gmane, although that site hasn't been working for half a
>> > year so is pretty low on the priority list.
>> >
>> > Ralf
>>
>> I'm reading/writing to you via gmane, so I think it is working :)
>>
>
> No it's not, archives are the key feature of Gmane (and what we link to
> from http://scipy.org/scipylib/mailing-lists.html) and those haven't been
> working since last September. See
> https://lars.ingebrigtsen.no/2016/07/28/the-end-of-gmane/ for why.
>
>
>
> Your mail forwarding still happens to work, but that's not nearly as
> interesting a feature. Since Gmane is more or less unmaintained and at the
> moment http://gmane.org/ gives me a blank page, I don't think I'll bother
> to contact them (unless archives come back).
>

The move is complete, and I have updated the mailing list addresses at:
- http://scipy.org/scipylib/mailing-lists.html (done)
- numpy codebase (PR https://github.com/numpy/numpy/pull/8840)
- scipy codebase (PR https://github.com/scipy/scipy/pull/7226)
- Nabble (opened issue
http://support.nabble.com/All-scipy-org-mailing-list-moved-to-python-org-td7597902.html),
also for ipython-dev

If anyone knows of other places, please let me know (or fix it).

Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] Fwd: [numfocus] Grants up to $3k available to NumFOCUS projects (sponsored & affiliated)

2017-03-27 Thread Ralf Gommers
Hi all,

For those who did not see the call on the NumFOCUS list, the below may be
of interest.

I also have a proposal idea: a redesign of numpy.org. Our website is very
poor, both in terms of content and design. If a good designer spends 40
hours on it, that should be enough to change it into something nice (or at
least not embarrassing).

What do you think? Other ideas?

Cheers,
Ralf




-- Forwarded message --
From: Gina Helfrich 
Date: Fri, Mar 17, 2017 at 10:51 AM
Subject: Re: [numfocus] Grants up to $3k available to NumFOCUS projects
(sponsored & affiliated)
To: numfo...@googlegroups.com, d...@numfocus.org


There is no specific template, but proposals should be kept under 2 pages.
Maximum 1 submission per project.

Required elements of the proposal are:

   - title
   - project description
   - benefit to project/community
   - project team
   - and budget

Submit proposals to i...@numfocus.org

Best,
Gina

On Thu, Mar 16, 2017 at 1:13 PM, Bob Carpenter  wrote:

> Is there a format you're expecting for proposals?
>
> Are applications limited to one per project?
>
> Hard to imagine a project won't have something to do with $3K,
> so I imagine you'll get applications from all of the projects if
> the proposals take less than an hour or two to put together.
>
> - Bob
>
> > On Mar 16, 2017, at 12:14 PM, Gina Helfrich  wrote:
> >
> > Call for Proposals - Small Development Grants
> >
> > NumFOCUS is asking for proposals from its sponsored and affiliated
> projects for targeted small development projects with a clear benefit to
> those projects. This call is motivated by the success of our 2016
> end-of-year fundraising drive; we want to direct the donated funds to our
> projects in a way that has impact and visibility to donors and the wider
> community.
> >
> > There are no restrictions on what the funding can be used for. Whether
> it’s code development, documentation work, an educational, sustainability
> or diversity initiative, or yet something else, we trust the projects
> themselves to understand what they need and explain that need in the
> proposal.
> >
> > Available Funding:
> >   • Up to $3,000 per proposal
> >   • Allocated funding is $9,000; depending on the number and quality
> of proposals this may be adjusted up or down.
> >
> > Eligibility:
> >   • Proposals must be approved by the leadership of a NumFOCUS
> sponsored or affiliated project.
> >   • Proposed work must have a clear outcome, achievable within 2017.
> >   • The call is open to applicants from any nationality and can be
> performed at any university, institute or business worldwide (US export
> laws permitting).
> >
> > Timeline:
> >   • Mid-March 2017: Call for Proposals released
> >   • 3 April 2017: deadline for proposal submissions
> >   • 17 April: successful proposals announced
> >
> > --
> > Dr. Gina Helfrich
> > Communications Director, NumFOCUS
> > g...@numfocus.org
> > 512-222-5449
> >
> >
> >
> >
> > --
> > You received this message because you are subscribed to the Google
> Groups "NumFOCUS" group.
> > To unsubscribe from this group and stop receiving emails from it, send
> an email to numfocus+unsubscr...@googlegroups.com.
> > For more options, visit https://groups.google.com/d/optout.
>
> --
> You received this message because you are subscribed to the Google Groups
> "NumFOCUS" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to numfocus+unsubscr...@googlegroups.com.
> For more options, visit https://groups.google.com/d/optout.
>



-- 
Dr. Gina Helfrich
Communications Director, NumFOCUS
g...@numfocus.org
512-222-5449


 

-- 
You received this message because you are subscribed to the Google Groups
"NumFOCUS" group.
To unsubscribe from this group and stop receiving emails from it, send an
email to numfocus+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Fwd: [numfocus] Grants up to $3k available to NumFOCUS projects (sponsored & affiliated)

2017-03-27 Thread Ralf Gommers
On Mon, Mar 27, 2017 at 11:33 PM, Julian Taylor <
jtaylor.deb...@googlemail.com> wrote:

> I have two ideas under one big important topic: make numpy python3
> compatible.
>
> The first fits pretty well with the grant size and nobody wants to do it
> for free:
> - fix our text IO functions under python3 and support multiple
> encodings, not only latin1.
> Reasonably simple to do, slap encoding arguments on the functions,
> generate test cases and somehow keep backward compatibility. Some
> preliminary unfinished work is in https://github.com/numpy/numpy/pull/4208


I like that idea, it's a recurring pain point. Are you interested in working
on it, or are you thinking of advertising the idea here to see if anyone
steps up?
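
For concreteness, a minimal sketch of what the proposed API could look like
(file name made up, keyword spelling illustrative rather than a final
design):

    import numpy as np

    # A UTF-8 file whose comment line is not representable in latin1.
    with open("angles_utf8.csv", "w", encoding="utf-8") as f:
        f.write("# θ (rad), r (m)\n0.1,2.0\n0.2,4.0\n")

    # The proposal: an explicit encoding argument on the text loaders
    # instead of a hardcoded latin1 decode.
    arr = np.genfromtxt("angles_utf8.csv", delimiter=",", encoding="utf-8")
    print(arr)  # the two rows of the file, as a (2, 2) float array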


- add ascii/latin1 dtype to support a compact python3 string array,
> deprecate 's' dtype which has different meaning in python2 and 3
> This one is probably too big for 3k though.
>

Agreed, that won't fit in such a small grant.

Ralf



>
> On 27.03.2017 10:41, Ralf Gommers wrote:
> > Hi all,
> >
> > For those who did not see the call on the NumFOCUS list, the below may
> > be of interest.
> >
> > I also have a proposal idea: a redesign of numpy.org <http://numpy.org>.
> > Our website is very poor, both in terms of content and design. If a good
> > designer spends 40 hours on it, that should be enough to change it into
> > something nice (or at least not embarrassing).
> >
> > What do you think? Other ideas?
> >
> > Cheers,
> > Ralf
> >
> >
> >
> >
> > -- Forwarded message --
> > From: *Gina Helfrich* mailto:g...@numfocus.org>>
> > Date: Fri, Mar 17, 2017 at 10:51 AM
> > Subject: Re: [numfocus] Grants up to $3k available to NumFOCUS projects
> > (sponsored & affiliated)
> > To: numfo...@googlegroups.com <mailto:numfo...@googlegroups.com>,
> > d...@numfocus.org <mailto:d...@numfocus.org>
> >
> >
> > There is no specific template, but proposals should be kept under 2
> > pages. Maximum 1 submission per project.
> >
> > Required elements of the proposal are:
> >
> >   * title
> >   * project description
> >   * benefit to project/community
> >   * project team
> >   * and budget
> >
> > Submit proposals to i...@numfocus.org <mailto:i...@numfocus.org>
> >
> > Best,
> > Gina
> >
> > On Thu, Mar 16, 2017 at 1:13 PM, Bob Carpenter  > <mailto:c...@alias-i.com>> wrote:
> >
> > Is there a format you're expecting for proposals?
> >
> > Are applications limited to one per project?
> >
> > Hard to imagine a project won't have something to do with $3K,
> > so I imagine you'll get applications from all of the projects if
> > the proposals take less than an hour or two to put together.
> >
> > - Bob
> >
> > > On Mar 16, 2017, at 12:14 PM, Gina Helfrich  > <mailto:g...@numfocus.org>> wrote:
> > >
> > > Call for Proposals - Small Development Grants
> > >
> > > NumFOCUS is asking for proposals from its sponsored and affiliated
> > projects for targeted small development projects with a clear
> > benefit to those projects. This call is motivated by the success of
> > our 2016 end-of-year fundraising drive; we want to direct the
> > donated funds to our projects in a way that has impact and
> > visibility to donors and the wider community.
> > >
> > > There are no restrictions on what the funding can be used for.
> > Whether it’s code development, documentation work, an educational,
> > sustainability or diversity initiative, or yet something else, we
> > trust the projects themselves to understand what they need and
> > explain that need in the proposal.
> > >
> > > Available Funding:
> > >   • Up to $3,000 per proposal
> > >   • Allocated funding is $9,000; depending on the number and
> > quality of proposals this may be adjusted up or down.
> > >
> > > Eligibility:
> > >   • Proposals must be approved by the leadership of a NumFOCUS
> > sponsored or affiliated project.
> > >   • Proposed work must have a clear outcome, achievable within
> > 2017.
> > >   • The call is open to applicants from any nationality and
> > can be performed at any university, institute or business worldwide
> > (US export laws permitting).
> > 

Re: [Numpy-discussion] migration of all scipy.org mailing lists

2017-03-27 Thread Ralf Gommers
On Tue, Mar 28, 2017 at 8:22 AM, Pauli Virtanen  wrote:

> Sun, 26 Mar 2017 21:05:47 +1300, Ralf Gommers wrote:
> [clip]
> > The move is complete, and I have updated the mailing list addresses at:
> > - http://scipy.org/scipylib/mailing-lists.html (done)
> > - numpy codebase (PR https://github.com/numpy/numpy/pull/8840)
> > - scipy codebase (PR https://github.com/scipy/scipy/pull/7226)
> > - Nabble (opened issue
> > http://support.nabble.com/All-scipy-org-mailing-list-moved-to-python-org-td7597902.html),
> > also for ipython-dev
> >
> > If anyone knows of other places, please let me know (or fix it).
>
> Should the Read/Search links be removed from this page:
>
> https://scipy.org/scipylib/mailing-lists.html
>
> They point to Gmane, which currently appears to just serve a blank page.
> So maybe Gmane really is going away.
>

Yes, I think we should replace them with Nabble (unless someone has a
better alternative?).

I was planning to rework the mailing-lists page for that and a few other
things like explaining bottom posting.

Ralf



>
> The Gmane NNTP server still works, but apparently you cannot post via it
> because the To-address still points to @scipy.org. Not sure if there is
> someone who can fix this any more.
>
> --
> Pauli Virtanen
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] migration of all scipy.org mailing lists

2017-03-28 Thread Ralf Gommers
On Tue, Mar 28, 2017 at 8:50 AM, Ralf Gommers 
wrote:

>
>
> On Tue, Mar 28, 2017 at 8:22 AM, Pauli Virtanen  wrote:
>
>> Sun, 26 Mar 2017 21:05:47 +1300, Ralf Gommers wrote:
>> [clip]
>> > The move is complete, and I have updated the mailing list addresses at:
>> > - http://scipy.org/scipylib/mailing-lists.html (done)
>> > - numpy codebase (PR https://github.com/numpy/numpy/pull/8840)
>> > - scipy codebase (PR https://github.com/scipy/scipy/pull/7226)
>> > - Nabble (opened issue
>> > http://support.nabble.com/All-scipy-org-mailing-list-moved-to-python-org-td7597902.html),
>> > also for ipython-dev
>> >
>> > If anyone knows of other places, please let me know (or fix it).
>>
>> Should the Read/Search links be removed from this page:
>>
>> https://scipy.org/scipylib/mailing-lists.html
>>
>> They point to Gmane, which currently appears to just serve a blank page.
>> So maybe Gmane really is going away.
>>
>
> Yes, I think we should replace them with Nabble (unless someone has a
> better alternative?).
>
> I was planning to rework the mailing-lists page for that and a few other
> things like explaining bottom posting.
>

Done in https://github.com/scipy/scipy.org/pull/203.

There is no scipy-dev Nabble forum. If anyone is familiar with Nabble (or
wants to become familiar with it) and can create that forum, that would be
much appreciated. Right now there's no good way to search scipy-dev.

Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Fwd: SciPy2017 Sprints FinAid for sprint leaders/core devs

2017-03-30 Thread Ralf Gommers
On Thu, Mar 30, 2017 at 10:55 AM, Stefan van der Walt 
wrote:

> On Wed, Mar 29, 2017, at 13:01, Charles R Harris wrote:
>
> On Tue, Mar 28, 2017 at 5:16 PM, Nathaniel Smith  wrote:
>
> In case anyone is interested in helping run a NumPy sprint at SciPy this
> year:
>
>
> I haven't found numpy sprints to be very productive. However, I think it
> would be useful if we could have a developers meeting sometime this year.
>
>
> Yes, I think it's helpful to think of the sprints as an onboarding
> opportunity, rather than as a focused working session.  Many of the
> scikit-image core contributors joined the team this way, at least.
>
>
Agreed, and I would call that productive. Getting even one new maintainer
involved is worth organizing multiple sprints for.

That said, also +1 to a developer meeting this year. It'd be good if we
could combine it with the NumFOCUS summit or a relevant conference in the
second half of the year.

Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Fwd: SciPy2017 Sprints FinAid for sprint leaders/core devs

2017-03-30 Thread Ralf Gommers
On Fri, Mar 31, 2017 at 11:59 AM, Sebastian Berg  wrote:

> On Thu, 2017-03-30 at 22:46 +1300, Ralf Gommers wrote:
> >
> 
> >
> > Agreed, and I would call that productive. Getting even one new
> > maintainer involved is worth organizing multiple sprints for.
> >
> > That said, also +1 to a developer meeting this year. It'd be good if
> > we could combine it with the NumFOCUS summit or a relevant conference
> > in the second half of the year.
>
> Would be good, even if there is nothing big going on.
>
> Can we gather possible dates and possible (personal) preferences? Here
> is a start:
>
> * SciPy (Austin, TX): July 10-16
> * EuroScipy (Germany): August 23-27
> * NumFocus Summit?
>

Austin, October (exact date TBD). I intend to plan a longer trip to the US
around this summit, and I think at least one other core dev should go
there. So this one has my preference.


> * PyData Events??
>

Sticking to the ones in the second half of the year and in the US or
Western Europe:

PyData Berlin, June 30 - July 2
PyData Seattle July 5 -7

Other options:

JupyterCon, New York, August 22-25
Strata Data Conference, September 25-28


>
> Personally, I probably can't make longer trips until some time in July
> (... time around then).


Same here.

Ralf


> We won't find a perfect time anyway probably, so
> personal preferences or not, whoever is willing to organize a bit can
> decide on the time and place as far as I am concerned :).
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Fwd: [numfocus] Grants up to $3k available to NumFOCUS projects (sponsored & affiliated)

2017-03-31 Thread Ralf Gommers
On Mon, Mar 27, 2017 at 11:42 PM, Ralf Gommers 
wrote:

>
>
> On Mon, Mar 27, 2017 at 11:33 PM, Julian Taylor <
> jtaylor.deb...@googlemail.com> wrote:
>
>> I have two ideas under one big important topic: make numpy python3
>> compatible.
>>
>> The first fits pretty well with the grant size and nobody wants to do it
>> for free:
>> - fix our text IO functions under python3 and support multiple
>> encodings, not only latin1.
>> Reasonably simple to do, slap encoding arguments on the functions,
>> generate test cases and somehow keep backward compatibility. Some
>> preliminary unfinished work is in https://github.com/numpy/numpy/pull/4208
>
>
> I like that idea, it's a recurring pain point. Are you interested to work
> on it, or are you thinking to advertise the idea here to see if anyone
> steps up?
>

More thoughts on this anyone? Or preferences for this idea or the numpy.org
one? Submission deadline is April 3rd and we can only put in one proposal
this time, so we need to (a) make a choice between these ideas, and (b)
write up a proposal.

If there aren't enough replies to make the choice clear-cut, I will send out
a poll to the core devs.

Ralf



>
>
> - add ascii/latin1 dtype to support a compact python3 string array,
>> deprecate 's' dtype which has different meaning in python2 and 3
>> This one is probably too big for 3k though.
>>
>
> Agreed, that won't fit in such a small grant.
>
> Ralf
>
>
>
>>
>> On 27.03.2017 10:41, Ralf Gommers wrote:
>> > Hi all,
>> >
>> > For those who did not see the call on the NumFOCUS list, the below may
>> > be of interest.
>> >
>> > I also have a proposal idea: a redesign of numpy.org <http://numpy.org
>> >.
>> > Our website is very poor, both in terms of content and design. If a good
>> > designer spends 40 hours on it, that should be enough to change it into
>> > something nice (or at least not embarrassing).
>> >
>> > What do you think? Other ideas?
>> >
>> > Cheers,
>> > Ralf
>> >
>> >
>> >
>> >
>> > -- Forwarded message --
>> > From: *Gina Helfrich* mailto:g...@numfocus.org>>
>> > Date: Fri, Mar 17, 2017 at 10:51 AM
>> > Subject: Re: [numfocus] Grants up to $3k available to NumFOCUS projects
>> > (sponsored & affiliated)
>> > To: numfo...@googlegroups.com <mailto:numfo...@googlegroups.com>,
>> > d...@numfocus.org <mailto:d...@numfocus.org>
>> >
>> >
>> > There is no specific template, but proposals should be kept under 2
>> > pages. Maximum 1 submission per project.
>> >
>> > Required elements of the proposal are:
>> >
>> >   * title
>> >   * project description
>> >   * benefit to project/community
>> >   * project team
>> >   * and budget
>> >
>> > Submit proposals to i...@numfocus.org <mailto:i...@numfocus.org>
>> >
>> > Best,
>> > Gina
>> >
>> > On Thu, Mar 16, 2017 at 1:13 PM, Bob Carpenter > > <mailto:c...@alias-i.com>> wrote:
>> >
>> > Is there a format you're expecting for proposals?
>> >
>> > Are applications limited to one per project?
>> >
>> > Hard to imagine a project won't have something to do with $3K,
>> > so I imagine you'll get applications from all of the projects if
>> > the proposals take less than an hour or two to put together.
>> >
>> > - Bob
>> >
>> > > On Mar 16, 2017, at 12:14 PM, Gina Helfrich > > <mailto:g...@numfocus.org>> wrote:
>> > >
>> > > Call for Proposals - Small Development Grants
>> > >
>> > > NumFOCUS is asking for proposals from its sponsored and affiliated
>> > projects for targeted small development projects with a clear
>> > benefit to those projects. This call is motivated by the success of
>> > our 2016 end-of-year fundraising drive; we want to direct the
>> > donated funds to our projects in a way that has impact and
>> > visibility to donors and the wider community.
>> > >
>> > > There are no restrictions on what the funding can be used for.
>> > Whether it’s code development, documentation work, an educational,
>> > sustainability or diversity initiative, or yet something else, we
>> > trust the p

Re: [Numpy-discussion] Fwd: [numfocus] Grants up to $3k available to NumFOCUS projects (sponsored & affiliated)

2017-04-03 Thread Ralf Gommers
On Mon, Apr 3, 2017 at 11:28 PM, Julian Taylor <
jtaylor.deb...@googlemail.com> wrote:

> On 31.03.2017 16:07, Julian Taylor wrote:
> > On 31.03.2017 15:51, Nathaniel Smith wrote:
> >> On Mar 31, 2017 1:15 AM, "Ralf Gommers"  >> <mailto:ralf.gomm...@gmail.com>> wrote:
> >>
> >>
> >>
> >> On Mon, Mar 27, 2017 at 11:42 PM, Ralf Gommers
> >> mailto:ralf.gomm...@gmail.com>> wrote:
> >>
> >>
> >>
> >> On Mon, Mar 27, 2017 at 11:33 PM, Julian Taylor
> >>  >> <mailto:jtaylor.deb...@googlemail.com>> wrote:
> >>
> >> I have two ideas under one big important topic: make numpy
> >> python3
> >> compatible.
> >>
> >> The first fits pretty well with the grant size and nobody
> >> wants to do it
> >> for free:
> >> - fix our text IO functions under python3 and support
> multiple
> >> encodings, not only latin1.
> >> Reasonably simple to do, slap encoding arguments on the
> >> functions,
> >> generate test cases and somehow keep backward compatibility.
> >> Some
> >> preliminary unfinished work is in
> >> https://github.com/numpy/numpy/pull/4208
> >> <https://github.com/numpy/numpy/pull/4208>
> >>
> >>
> >> I like that idea, it's a recurring pain point. Are you
> >> interested to work on it, or are you thinking to advertise the
> >> idea here to see if anyone steps up?
> >>
> >>
> >> More thoughts on this anyone? Or preferences for this idea or the
> >> numpy.org <http://numpy.org> one? Submission deadline is April 3rd
> >> and we can only put in one proposal this time, so we need to (a)
> >> make a choice between these ideas, and (b) write up a proposal.
> >>
> >> If there's not enough replies to this so the choice is clear cut, I
> >> will send out a poll to the core devs.
> >>
> >>
> >> Do we have anyone interested in doing the work in either case? That
> >> seems like the most important consideration to me...
>

Fair enough. I had a plan, but my weekend went differently than planned, so
I couldn't follow up on it.


> >>
> >> -n
> >>
> >
> > I could do the textio thing if no one shows up for numpy.org. I can
> > probably check again what is required in the next few days and write a
> > proposal.
> > The change will need reviewing in the end too, should that be
> > compensated too? It feels weird if not.
> >
>
> I have decided not to do it, as it is more or less just a bugfix and I
> currently do not feel capable of doing it with the added completion pressure.
>

Good call Julian. I struggled with the same thing - had a designer to do
the numpy.org work, but that still needed someone to do the content,
review, etc. Decided not to try to take that on, because I'm already
struggling to keep up.



> But I have collected some of related issues and discussions:
>

Thanks, I'm sure that'll be of use at some point.

Ralf


>
> https://github.com/numpy/numpy/issues/4600
> https://github.com/numpy/numpy/issues/3184
> http://numpy-discussion.10968.n7.nabble.com/using-loadtxt-
> to-load-a-text-file-in-to-a-numpy-array-tt35992.html#a36003
> # loadtxt
> https://github.com/numpy/numpy/pull/4208
> # genfromtxt
> http://numpy-discussion.10968.n7.nabble.com/genfromtxt-
> universal-newline-support-td37816.html
> https://github.com/dhomeier/numpy/commit/995ec93
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Long term plans for dropping Python 2.7

2017-04-14 Thread Ralf Gommers
On Sat, Apr 15, 2017 at 5:19 PM, Nathaniel Smith  wrote:

> On Fri, Apr 14, 2017 at 5:19 PM, Charles R Harris
>  wrote:
> > Hi All,
> >
> > It may be early to discuss dropping support for Python 2.7, but there is
> a
> > disturbance in the force that suggests that it might be worth looking
> > forward to the year 2020 when Python itself will drop support for 2.7.
> There
> > is also a website, http://www.python3statement.org, where several
> projects
> > in the scientific python stack have pledged to be Python 2.7 free by that
> > date.  Given that, a preliminary discussion of the subject might be
> > interesting, if only to gather information of where the community
> currently
> > stands.
>
> One reasonable position would be that numpy releases that happen while
> 2.7 is supported upstream will also support 2.7, and releases after
> that won't.
>
> From numpy's perspective, I feel like the most important reason to
> continue supporting 2.7 is our ability to convince people to keep
> upgrading. (Not the only reason, but the most important.) What I mean
> is: if we dropped 2.7 support tomorrow then it wouldn't actually make
> numpy unavailable on python 2.7; it would just mean that lots of users
> stayed at 1.12 indefinitely. Which is awkward, but it wouldn't be the
> end of the world – numpy is mature software and 1.12 works pretty
> well. The big problem IMO would be if this then meant that lots of
> downstream projects felt that they had to continue supporting 1.12
> going forward, which makes it very difficult for us to effectively
> ship new features or even bug fixes – I mean, we can ship them, but
> no-one will use them. And if a downstream project finds a bug in numpy
> and can't upgrade numpy, then the tendency is to work around it
> instead of reporting it upstream. I think this is the main thing we
> want to avoid.
>

+1


>
> This kind of means that we're at the mercy of downstream projects,
> though – if scipy/pandas/etc. decide they want to support 2.7 until
> 2022, it might be in our best interest to do the same. But there's a
> collective action problem here: we want to keep supporting 2.7 so long
> as they do, but at the same time they may feel they need to keep
> supporting 2.7 as long as we do. And all of us would prefer to drop
> 2.7 support sooner rather than later, but we might all get stuck
> because we're waiting for someone else to move first.
>

I don't quite agree about being stuck. These kinds of upgrades should, and
usually do, go from the top of the stack to the bottom. Something like
Jupyter, which is mostly an end user tool, goes first (they announced 2020
quite a while ago), domain specific packages go at a similar time, then
scipy & co, and only after that numpy. Cython will be even later I'm sure -
it still supports Python 2.6.


>
> So my suggestion would be that numpy make some official announcement
> that our plan is to drop support for python 2 immediately after
> cpython upstream does.


Not quite sure the CPython schedule is relevant - important bug fixes
haven't been making it into 2.7 for a very long time now, so the only
change is the rare security patch.


> If worst comes to worst we can always decide to
> extend it at the time... but if we make the announcement now, then
> it's less likely that we'll need to :-).
>

I'd be in favor of putting out a schedule in coordination with
scipy/pandas/etc, but it probably should look more like
- 2020: what's on http://www.python3statement.org/ now
- 2021: scipy / pandas / scikit-learn / etc.
- 2022: numpy

Ralf


> Another interesting project to look at here is django, since they
> occupy a similar place in the ecosystem (e.g. last I checked numpy and
> django are the two most-imported python packages on github):
> https://www.djangoproject.com/weblog/2015/jun/25/roadmap/
> Their approach isn't directly applicable, because unlike us they have
> a strict time-based release schedule, defined support period for each
> release, and a distinction between regular and long-term support
> releases, where regular releases act sort of like
> pre-releases-on-steroids for the next LTS release. But basically what
> they settled on is philosophically similar to what I'm suggesting:
> they don't want an LTS to be supporting 2.7 beyond when cpython is
> supporting it. Then on top of that they don't want to support 2.7 in
> the regular releases leading up to that LTS either, so the net effect
> is that their last release with 2.7 support came out last week, and it
> will be supported until 2020 :-). And another useful precedent I think
> is that they announced this two years ago, back in 2015; if we make an
> announcement now, we'll be be giving a similar amount of warning.
>
> -n
>
> --
> Nathaniel J. Smith -- https://vorpus.org
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion
>
___
NumPy-Discussion ma

Re: [Numpy-discussion] Long term plans for dropping Python 2.7

2017-04-15 Thread Ralf Gommers
On Sat, Apr 15, 2017 at 7:02 PM, Nathaniel Smith  wrote:

> On Fri, Apr 14, 2017 at 10:47 PM, Ralf Gommers 
> wrote:
> >
> >
> > On Sat, Apr 15, 2017 at 5:19 PM, Nathaniel Smith  wrote:
> [...]
> >> From numpy's perspective, I feel like the most important reason to
> >> continue supporting 2.7 is our ability to convince people to keep
> >> upgrading. (Not the only reason, but the most important.) What I mean
> >> is: if we dropped 2.7 support tomorrow then it wouldn't actually make
> >> numpy unavailable on python 2.7; it would just mean that lots of users
> >> stayed at 1.12 indefinitely. Which is awkward, but it wouldn't be the
> >> end of the world – numpy is mature software and 1.12 works pretty
> >> well. The big problem IMO would be if this then meant that lots of
> >> downstream projects felt that they had to continue supporting 1.12
> >> going forward, which makes it very difficult for us to effectively
> >> ship new features or even bug fixes – I mean, we can ship them, but
> >> no-one will use them. And if a downstream project finds a bug in numpy
> >> and can't upgrade numpy, then the tendency is to work around it
> >> instead of reporting it upstream. I think this is the main thing we
> >> want to avoid.
> >
> >
> > +1
> >
> >>
> >>
> >> This kind of means that we're at the mercy of downstream projects,
> >> though – if scipy/pandas/etc. decide they want to support 2.7 until
> >> 2022, it might be in our best interest to do the same. But there's a
> >> collective action problem here: we want to keep supporting 2.7 so long
> >> as they do, but at the same time they may feel they need to keep
> >> supporting 2.7 as long as we do. And all of us would prefer to drop
> >> 2.7 support sooner rather than later, but we might all get stuck
> >>
> >> because we're waiting for someone else to move first.
> >
> >
> > I don't quite agree about being stuck. These kind of upgrades should and
> > usually do go top of stack to bottom. Something like Jupyter which is
> mostly
> > an end user tool goes first (they announced 2020 quite a while ago),
> domain
> > specific packages go at a similar time, then scipy & co, and only after
> that
> > numpy. Cython will be even later I'm sure - it still supports Python 2.6.
>
> To make sure we're on the same page about what "2020" means here: the
> latest release of IPython is 5.0, which came out in July last year.
> This is the last release that supports py2; they dropped support for
> py2 in master months ago, and 6.0 (whose schedule has been slipping,
> but I think should be out Any Time Now?) won't support py2. Their plan
> is to keep backporting bug fixes to 5.x until the end of 2017; after
> that the core team won't support py2 at all. And they've also
> announced that if volunteers want to step up to maintain 5.x after
> that, then they're willing to keep accepting pull requests until July
> 2019.
>
> Refs:
>   https://blog.jupyter.org/2016/07/08/ipython-5-0-released/
>   https://github.com/jupyter/roadmap/blob/master/accepted/migration-to-python-3-only.md
>
> I suspect that in practice that "end of 2017" date will the
> end-of-support date for most intents and purposes. And for numpy with
> its vaguely defined support periods, I think it makes most sense to
> talk in terms of release dates;


Agreed, release dates make sense. We don't want to be doing some kind of
LTS scheme.


> so if we want to compare
> apples-to-apples, my suggestion is that numpy drops py2 support in
> 2020 and in that sense ipython dropped py2 support in july last year.
>
> >>
> >> So my suggestion would be that numpy make some official announcement
> >> that our plan is to drop support for python 2 immediately after
> >> cpython upstream does.
> >
> >
> > Not quite sure CPython schedule is relevant - important bug fixes haven't
> > been making it into 2.7 for a very long time now, so the only change is
> the
> > rare security patch.
>
> Huh? 2.7 gets tons of changes: https://github.com/python/cpython/commits/2.7


You're right. My experience is ending up on bugs.python.org when debugging,
with the answer to "can this be backported to 2.7" usually being no - but
it looks like my experience is skewed by distutils, which is not exactly
well maintained.


> Officially CPython has 2 modes for releases: "regular support" and
> "security fixes only". 2.7 is s

Re: [Numpy-discussion] Relaxed stride checking fixup

2017-04-20 Thread Ralf Gommers
On Thu, Apr 20, 2017 at 6:28 AM, Charles R Harris  wrote:

> Hi All,
>
> Currently numpy master has a bogus stride that will cause an error when
> downstream projects misuse it. That is done in order to help smoke out
> errors. Previously that bogus stride has been fixed up for releases, but
> that requires a special patch to be applied after each version branch is
> made. At this point I'd like to pick one or the other option and make the
> development and release branches the same in this regard. The question is:
> which option to choose? Keeping the fixup in master will remove some code
> and keep things simple, while not fixing up the release will possibly lead
> to more folks finding errors. At this point in time I am favoring applying
> the fixup in master.
>
> Thoughts?
>

If we have to pick, then keeping the fixup sounds reasonable. Would there be
value in making the behavior configurable at compile time? If there are
more such things and they'd be behind a __NUMPY_DEBUG__ switch, then people
may want to test that in their own CI.
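
For anyone unfamiliar with the issue, a small Python-level illustration of
why the stride of a size-1 axis is meaningless under relaxed stride checking
(the bogus-value poisoning itself happens at the C level):

    import numpy as np

    a = np.ones((3, 1))

    # An array with a size-1 axis is both C- and F-contiguous no matter
    # what that axis' stride says, so code reading .strides directly for
    # such axes relies on garbage; that is what the bogus stride in
    # master is meant to smoke out.
    print(a.flags['C_CONTIGUOUS'], a.flags['F_CONTIGUOUS'])  # True True
    print(a.strides)  # the size-1 axis entry carries no information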

Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] np.diff on boolean arrays now raises

2017-06-15 Thread Ralf Gommers
On Thu, Jun 15, 2017 at 7:08 PM, Jaime Fernández del Río <
jaime.f...@gmail.com> wrote:

> There is an ongoing discussion on github:
>
> https://github.com/numpy/numpy/issues/9251
>
> In 1.13 np.diff has started raising on boolean arrays, since subtraction
> of  boolean arrays is now deprecated.
>
> A decision has to be made whether:
>
>- raising an error is the correct thing to do, and only the docstring
>needs updating, or
>- backwards compatibility is more important and diff should still work
>on boolean arrays.
>
>
The issue is bigger than np.diff. For example, there's a problem with the
scipy.ndimage morphology functions (
https://github.com/scipy/scipy/issues/7493) that looks pretty serious. All
seven ndimage.binary_* functions return boolean arrays, and chaining those
is now broken. Unfortunately it seems that wasn't covered by the ndimage
test suite.
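
To make the np.diff part concrete, a minimal sketch of the breakage and the
usual workarounds (array values made up):

    import numpy as np

    a = np.array([True, False, True, True])

    # On 1.13 np.diff(a) raises, because boolean subtraction is
    # deprecated. Equivalent workarounds:
    signed = np.diff(a.astype(np.int8))  # [-1  1  0], signed differences
    flips = a[1:] ^ a[:-1]               # [ True  True False], XOR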

Possibly reverting the breaking change in 1.13.1 is the best way to fix
this.

Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] ANN: SciPy 0.19.1 release

2017-06-23 Thread Ralf Gommers
On behalf of the Scipy development team I am pleased to announce the
availability of Scipy 0.19.1. This is a bugfix-only release, no new
features are included.

This release requires Python 2.7 or 3.4-3.6 and NumPy 1.8.2 or greater.
Source tarballs and release notes can be found at
https://github.com/scipy/scipy/releases/tag/v0.19.1.
OS X and Linux wheels are available from PyPI:
https://pypi.python.org/pypi/scipy/0.19.1

Thanks to everyone who contributed!

Cheers,
Ralf



-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

==========================
SciPy 0.19.1 Release Notes
==========================

SciPy 0.19.1 is a bug-fix release with no new features compared to 0.19.0.
The most important change is a fix for a severe memory leak in
``integrate.quad``.
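
A minimal sketch of the kind of code affected (illustrative integrand):

    from scipy import integrate

    # On 0.19.0 every call like this leaked memory (gh-7214); 0.19.1
    # fixes the refcounting bug (gh-7216).
    value, abserr = integrate.quad(lambda x: x ** 2, 0.0, 1.0)
    print(value)  # ~0.3333333333333333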


Authors
=======

* Evgeni Burovski
* Patrick Callier +
* Yu Feng
* Ralf Gommers
* Ilhan Polat
* Eric Quintero
* Scott Sievert
* Pauli Virtanen
* Warren Weckesser

A total of 9 people contributed to this release.
People with a "+" by their names contributed a patch for the first time.
This list of names is automatically generated, and may not be fully
complete.


Issues closed for 0.19.1
------------------------

- `#7214 <https://github.com/scipy/scipy/issues/7214>`__: Memory use in
  integrate.quad in scipy-0.19.0
- `#7258 <https://github.com/scipy/scipy/issues/7258>`__:
  ``linalg.matrix_balance`` gives wrong transformation matrix
- `#7262 <https://github.com/scipy/scipy/issues/7262>`__: Segfault in
  daily testing
- `#7273 <https://github.com/scipy/scipy/issues/7273>`__:
  ``scipy.interpolate._bspl.evaluate_spline`` gets wrong type
- `#7335 <https://github.com/scipy/scipy/issues/7335>`__:
  scipy.signal.dlti(A,B,C,D).freqresp() fails


Pull requests for 0.19.1
------------------------

- `#7211 <https://github.com/scipy/scipy/pull/7211>`__: BUG: convolve may
  yield inconsistent dtypes with method changed
- `#7216 <https://github.com/scipy/scipy/pull/7216>`__: BUG: integrate:
  fix refcounting bug in quad()
- `#7229 <https://github.com/scipy/scipy/pull/7229>`__: MAINT: special:
  Rewrite a test of wrightomega
- `#7261 <https://github.com/scipy/scipy/pull/7261>`__: FIX: Corrected
  the transformation matrix permutation
- `#7265 <https://github.com/scipy/scipy/pull/7265>`__: BUG: Fix broken
  axis handling in spectral functions
- `#7266 <https://github.com/scipy/scipy/pull/7266>`__: FIX 7262: ckdtree
  crashes in query_knn.
- `#7279 <https://github.com/scipy/scipy/pull/7279>`__: Upcast half- and
  single-precision floats to doubles in BSpline...
- `#7336 <https://github.com/scipy/scipy/pull/7336>`__: BUG: Fix
  signal.dfreqresp for StateSpace systems
- `#7419 <https://github.com/scipy/scipy/pull/7419>`__: Fix several
  issues in ``sparse.load_npz``, ``save_npz``
- `#7420 <https://github.com/scipy/scipy/pull/7420>`__: BUG: stats: allow
  integers as kappa4 shape parameters


Checksums
=========

MD5
~~~

72415e8da753eea97eb9820602931cb5
scipy-0.19.1-cp27-cp27m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
e0022540df2735eb0475071b266d5d71
scipy-0.19.1-cp27-cp27m-manylinux1_i686.whl
f513eb4ea2086de169a502df7efb91c7
scipy-0.19.1-cp27-cp27m-manylinux1_x86_64.whl
906c3c59209d6249b5d8ce14cfa01382
scipy-0.19.1-cp27-cp27mu-manylinux1_i686.whl
afbf8ffb4a4fe7c18e34cb8a313c18ee
scipy-0.19.1-cp27-cp27mu-manylinux1_x86_64.whl
5ba945b3404644244ab469883a1723f0
scipy-0.19.1-cp34-cp34m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
9c02cdd79e4ffadddcce7b2212039816
scipy-0.19.1-cp34-cp34m-manylinux1_i686.whl
79c0ba3618466614744de9a2f5362bbc
scipy-0.19.1-cp34-cp34m-manylinux1_x86_64.whl
602a741a54190e16698ff8b2fe9fd27c
scipy-0.19.1-cp35-cp35m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
d6c2ecadd4df36eb61870227fae42d3a
scipy-0.19.1-cp35-cp35m-manylinux1_i686.whl
e7167c0a9cf270f89437e2fd09731636
scipy-0.19.1-cp35-cp35m-manylinux1_x86_64.whl
fc2e4679e83590ff19c1a5c5b1aa4786
scipy-0.19.1-cp36-cp36m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
458615e9a56429a72038531dd5dcb3cb
scipy-0.19.1-cp36-cp36m-manylinux1_i686.whl
65b1667ac56861da4cbc609960ed735b
scipy-0.19.1-cp36-cp36m-manylinux1_x86_64.whl
b704ebe9a28b8fe83d9f238d40031266  scipy-0.19.1.tar.gz
cad6bac0638b176f72c00fe81ed54d19  scipy-0.19.1.tar.xz
eb69261e5026ef2f3b9ae827caa7e5b8  scipy-0.19.1.zip

SHA256
~~~~~~

1e8fedf602859b541ebae78667ccfc53158edef58d9ee19ee659309004565952
scipy-0.19.1-cp27-cp27m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
023ee29faa76c184a607e21076f097dc32f3abba7c71ece374588f95920aa993
scipy-0.19.1-cp27-cp27m-manylinux1_i686.whl
2a26d06a642e3c9107ca06df125f5dc5507abe2b87fd7865415d03ab654b0b43
scipy-0.19.1-cp27-cp27m-manylinux1_

Re: [Numpy-discussion] proposed changes to array printing in 1.14

2017-06-30 Thread Ralf Gommers
On Sat, Jul 1, 2017 at 7:04 AM, CJ Carey  wrote:

> Is it feasible/desirable to provide a doctest runner that ignores
> whitespace?
>

Yes, and yes. Due to doctest being in the stdlib, that is going to take
forever to have any effect though; a separate our-sane-doctest module would
be the way to ship this, I think.

And not only whitespace: also provide sane floating point comparison
behavior (AstroPy has something for that that can be reused:
https://github.com/astropy/astropy/issues/6312), as well as things a bit
more specific to the needs of scientific Python projects, like ignoring the
memory addresses in the reprs of returned matplotlib objects.
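
As a rough stdlib-only sketch of what such a checker could look like (my own
illustration, not the AstroPy code; the regex and tolerance are arbitrary):

    import doctest
    import re

    class FloatTolerantChecker(doctest.OutputChecker):
        """Relax whitespace and compare float tokens numerically."""

        _float = re.compile(r'-?\d+\.\d+(?:[eE][+-]?\d+)?')

        def check_output(self, want, got, optionflags):
            flags = optionflags | doctest.NORMALIZE_WHITESPACE
            if doctest.OutputChecker.check_output(self, want, got, flags):
                return True
            # The non-numeric skeleton must still match exactly.
            skel = lambda s: ' '.join(self._float.sub('#', s).split())
            if skel(want) != skel(got):
                return False
            w = [float(x) for x in self._float.findall(want)]
            g = [float(x) for x in self._float.findall(got)]
            return len(w) == len(g) and all(
                abs(a - b) <= 1e-8 * (1.0 + abs(b)) for a, b in zip(w, g))

    # Wire it in with: doctest.DocTestRunner(checker=FloatTolerantChecker())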


> That would allow downstream projects to fix their doctests on 1.14+ with a
> one-line change, without breaking tests on 1.13.
>

It's worth reading https://docs.python.org/2/library/doctest.html#soapbox.
At least the first 2 paragraphs; the rest is mainly an illustration of why
doctest default behavior is evil ("doctest also makes an excellent tool for
regression testing" - eh, no). The only valid reason nowadays to use
doctests is to test that doc examples run and are correct. None of
{whitespace, blank lines, small floating point differences between
platforms/libs, hashes} are valid reasons to get a test failure.

At the moment there's no polished alternative to using stdlib doctest, so
I'm sympathetic to the argument of "this causes a lot of work". On the
other hand, exact repr's are not part of the NumPy (or Python for that
matter) backwards compatibility guarantees. So imho we should provide that
alternative to doctest, and then no longer worry about these kinds of
changes and just make them.

Until we have that alternative, I think
https://github.com/scipy/scipy/blob/master/tools/refguide_check.py may be
useful to other projects - it checks that your examples are not broken,
without doing the detailed string comparisons that are so fragile.

Ralf



>
> On Fri, Jun 30, 2017 at 11:11 AM, Allan Haldane 
> wrote:
>
>> On 06/30/2017 03:55 AM, Juan Nunez-Iglesias wrote:
>>
>>> To reiterate my point on a previous thread, I don't think this should
>>> happen until NumPy 2.0. This *will* break a massive number of doctests, and
>>> what's worse, it will do so in a way that makes it difficult to support
>>> doctesting for both 1.13 and 1.14. I don't see a big enough benefit to
>>> these changes to justify breaking everyone's tests before an API-breaking
>>> version bump.
>>>
>>
>> I am still on the fence about exactly how annoying this change would be,
>> and it is good to hear whether this affects you and how badly.
>>
>> Yes, someone would have to spend an hour removing a hundred spaces in
>> doctests, and the 1.13 to 1.14 period is trickier (but virtualenv helps).
>> But none of your end users are going to have their scripts break, there are
>> no new warnings or exceptions.
>>
>> A followup questions is, to what degree can we compromise? Would it be
>> acceptable to skip the big change #1, but keep the other 3 changes? I
>> expect they affect far fewer doctests. Or, for instance, I could scale back
>> #1 so it only affects size-1 (or perhaps, only size-0) arrays. What amount
>> of change would be OK, and how is changing a small number of doctests
>> different from changing more?
>>
>> Also, let me clarify the motivations for the changes. As Marten noted,
>> change #2 is what motivated all the other changes. Currently 0d arrays
>> print in their own special way which was making it very hard to implement
>> fixes to voidtype str/repr, and the datetime and other 0d reprs are weird.
>> The fix is to make 0d arrays print using the same code-path as higher-d
>> ndarrays, but then we ended up with reprs like "array( 1.)" because of the
>> space for the sign position. So I removed the space from the sign position
>> for all float arrays. But as I noted I probably could remove it for only
>> size-1 or 0d arrays and still fix my problem, even though I think it might
>> be pretty hacky to implement in the numpy code.
>>
>> Allan
>>
>>
>>
>>
>>
>>> On 30 Jun 2017, 6:42 AM +1000, Marten van Kerkwijk <
>>> m.h.vankerkw...@gmail.com>, wrote:
>>>
 To add to Allan's message: point (2), the printing of 0-d arrays, is
 the one that is the most important in the sense that it rectifies a
 really strange situation, where the printing cannot be logically
 controlled by the same mechanism that controls >=1-d arrays (see PR).

 While point 3 can also be considered a bug fix, 1 & 4 are at some
 level matters of taste; my own reason for supporting their
 implementation now is that the 0-d arrays already forces me (or,
 specifically, astropy) to rewrite quite a few doctests, and I'd rather
 have everything in one go -- in this respect, it is a pity that this
 is separate from the earlier change in printing for structured arrays
 (which was also much for the better, but broke a lot of doctests).

 -- Marten



 On Thu, Jun 29, 2017 at 3:

Re: [Numpy-discussion] Scipy 2017 NumPy sprint

2017-07-01 Thread Ralf Gommers
On Fri, Jun 30, 2017 at 6:50 AM, Pauli Virtanen  wrote:

> Charles R Harris wrote on 29.06.2017 at 20:45:
> > Here's a random idea: how about building a NumPy gallery?
> > scikit-{image,learn} has it, and while those projects may have more
> > visual datasets, I can imagine something along the lines of Nicolas
> > Rougier's beautiful book:
> >
> > http://www.labri.fr/perso/nrougier/from-python-to-numpy/
> >
> > So that would be added in the numpy/numpy.org repo?
>
> Or https://scipy-cookbook.readthedocs.io/  ?
> (maybe minus bitrot and images added :)
> _


I'd like the numpy.org one. numpy.org is now incredibly sparse and ugly, a
gallery would make it look a lot better.

Another idea, from the "deprecate np.matrix" discussion: add numpy
documentation describing the preferred way to handle matrices, extolling
the virtues of @, and move np.matrix documentation to a deprecated section.

Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Scipy 2017 NumPy sprint

2017-07-05 Thread Ralf Gommers
On Mon, Jul 3, 2017 at 7:01 AM, Charles R Harris 
wrote:

>
>
> On Sun, Jul 2, 2017 at 9:33 AM, Sebastian Berg  > wrote:
>
>> On Sun, 2017-07-02 at 10:49 -0400, Allan Haldane wrote:
>> > On 07/02/2017 10:03 AM, Charles R Harris wrote:
>> > > Updated list below.
>> > >
>> > > On Sat, Jul 1, 2017 at 7:08 PM, Benjamin Root
>> > > <ben.v.r...@gmail.com> wrote:
>> > >
>> > > Just a heads-up. There is now a sphinx-gallery plugin.
>> > > Matplotlib
>> > > and a few other projects have migrated their docs over to use
>> > > it.
>> > >
>> > > https://sphinx-gallery.readthedocs.io/en/latest/
>> > > <https://sphinx-gallery.readthedocs.io/en/latest/>
>> > >
>> > > Cheers!
>> > > Ben Root
>> > >
>> > >
>> > > On Sat, Jul 1, 2017 at 7:12 AM, Ralf Gommers
>> > > <ralf.gomm...@gmail.com> wrote:
>> > >
>> > >
>> > >
>> > > On Fri, Jun 30, 2017 at 6:50 AM, Pauli Virtanen <p...@iki.fi> wrote:
>> > >
>> > > Charles R Harris wrote on 29.06.2017 at 20:45:
>> > > > Here's a random idea: how about building a NumPy
>> > > gallery?
>> > > > scikit-{image,learn} has it, and while those
>> > > projects may have more
>> > > > visual datasets, I can imagine something along
>> > > the lines of Nicolas
>> > > > Rougier's beautiful book:
>> > > >
>> > > > http://www.labri.fr/perso/nrougier/from-python-to-numpy/
>> > > >
>> > > >
>> > > > So that would be added in the numpy/numpy.org
>> > > > (https://github.com/numpy/numpy.org) repo?
>> > >
>> > > Or https://scipy-cookbook.readthedocs.io/ ?
>> > > (maybe minus bitrot and images added :)
>> > > _
>> > >
>> > >
>> > > I'd like the numpy.org one. numpy.org is now incredibly sparse
>> > > and ugly, a gallery
>> > > would make it look a lot better.
>> > >
>> > > Another idea, from the "deprecate np.matrix" discussion:
>> > > add
>> > > numpy documentation describing the preferred way to handle
>> > > matrices, extolling the virtues of @, and move np.matrix
>> > > documentation to a deprecated section.
>> > >
>> > >
>> > >   Putting things together with a few new ideas,
>> > >
>> > >  1. add gallery to numpy.org,
>> > >  2. add extended documentation of '@' operator,
>> > >  3. make Numpy tests Pytest compatible,
>> > >  4. add matrix multiplication ufunc.
>> > >
>> > >   Any more ideas?
>> >
>> > The new doctest runner suggested in the printing thread? This is to
>> > ignore whitespace and precision in ndarray output.
>> >
>> > I can see an argument for distributing it in numpy if it is designed
>> > to
>> > be specially aware of ndarrays or numpy scalars (eg to test equality
>> > between 'wants' and 'got')
>> >
>>
>> I don't really feel it is very numpy specific or should be under the
>> numpy umbrella (I mean if there is no other spot, I guess it could live
>> on the numpy github page). It's about as numpy specific as the gallery
>> sphinx extension is probably

Re: [Numpy-discussion] Scipy 2017 NumPy sprint

2017-07-05 Thread Ralf Gommers
On Wed, Jul 5, 2017 at 10:14 PM, Peter Cock 
wrote:

> Note that TravisCI does not yet have official Python support on Mac OS X,
>
> https://github.com/travis-ci/travis-ci/issues/2312
>
> I believe it is possible to do anyway by faking it under another setting
> (e.g. pretend to be a generic language build, and use the system Python
> or install your own specific version of Python as needed), so that may be
> worth trying during a sprint.
>

That approach has worked reliably for
https://github.com/MacPython/numpy-wheels for a while now, so should be
straightforward.

Ralf



> Peter
>
> On Wed, Jul 5, 2017 at 10:43 AM, Ralf Gommers 
> wrote:
> >
> > Better platform test coverage would be a useful topic if someone is
> willing
> > to work on that. NumPy needs OS X testing enabled on TravisCI, SciPy
> needs
> > OS X and a 32-bit test (steal from NumPy). And if someone really feels
> > ambitious: replace ATLAS by OpenBLAS in one of the test matrix entries.
> >
> > Ralf
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] pytest and degrees of separation.

2017-07-12 Thread Ralf Gommers
On Wed, Jul 12, 2017 at 11:06 AM, Chris Barker 
wrote:

>
>
> On Tue, Jul 11, 2017 at 5:04 PM, Thomas Caswell 
> wrote:
>
>> Going with option 2 is probably the best option so that you can use
>> pytest fixtures and parameterization.
>>
>
> I agree -- those are worth a lot!
>

Maybe I'm dense, but I don't quite see the difference between 1 and 2. Test
files should never be imported unless tests are run; they're not part of
any public API, nor do they currently have __init__.py files.
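
For reference, the kind of thing fixtures and parametrization buy us, as a
minimal sketch of a hypothetical test file:

    import numpy as np
    import pytest

    # Decorators like this execute when the test file is imported, which
    # is why pytest needs to be importable at that point (option 2).
    @pytest.mark.parametrize("dtype", [np.float32, np.float64, np.complex128])
    def test_ones_sum(dtype):
        a = np.ones(10, dtype=dtype)
        assert a.sum() == 10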

Ralf



>
> -CHB
>
>
>
>> Might be worth looking at how Matplotlib re-arranged things on our master
>> branch to maintain back-compatibility with nose-specific tools that were
>> used by down-stream projects.
>>
>> Tom
>>
>> On Tue, Jul 11, 2017 at 4:22 PM Sebastian Berg <
>> sebast...@sipsolutions.net> wrote:
>>
>>> On Tue, 2017-07-11 at 14:49 -0600, Charles R Harris wrote:
>>> > Hi All,
>>> >
>>> > Just looking for opinions and feedback on the need to keep NumPy from
>>> > having a hard nose/pytest dependency. The options as I see them are:
>>> >
>>> > 1. pytest is never imported until the tests are run -- current practice
>>> > with nose
>>> > 2. pytest is never imported unless the testfiles are imported -- what I
>>> > would like
>>> > 3. pytest is imported together with numpy -- what we need to avoid.
>>> > Currently the approach has been 1), but I think 2) makes more sense
>>> > and allows more flexibility.
>>>
>>>
>>> I am not quite sure about everything here. My guess is we can do
>>> whatever we want when it comes to our own tests, and I don't mind just
>>> switching everything to pytest (I for one am happy as long as I can run
>>> `runtests.py` ;)).
>>> When it comes to the utils we provide, those should keep working
>>> without nose/pytest if they worked before without it I think.
>>>
>>> My guess is that all your options do that, so I think we should take
>>> the one that gives the nicest maintainable code :). Though can't say I
>>> looked enough into it to really make a well educated decision, that
>>> probably means your option 2.
>>>
>>> - Sebastian
>>>
>>>
>>>
>>> > Thoughts?
>>> > Chuck
>>> > ___
>>> > NumPy-Discussion mailing list
>>> > NumPy-Discussion@python.org
>>> > https://mail.python.org/mailman/listinfo/numpy-discussion
>>> 
>>> NumPy-Discussion mailing list
>>> NumPy-Discussion@python.org
>>> https://mail.python.org/mailman/listinfo/numpy-discussion
>>>
>>
>> ___
>> NumPy-Discussion mailing list
>> NumPy-Discussion@python.org
>> https://mail.python.org/mailman/listinfo/numpy-discussion
>>
>>
>
>
> --
>
> Christopher Barker, Ph.D.
> Oceanographer
>
> Emergency Response Division
> NOAA/NOS/OR&R(206) 526-6959   voice
> 7600 Sand Point Way NE   (206) 526-6329   fax
> Seattle, WA  98115   (206) 526-6317   main reception
>
> chris.bar...@noaa.gov
>
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion
>
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] pytest and degrees of separation.

2017-07-13 Thread Ralf Gommers
On Thu, Jul 13, 2017 at 8:14 AM, Pauli Virtanen  wrote:

> Charles R Harris wrote on 12.07.2017 at 13:53:
> > In practice, that would generally be true, but the nose testing tools
> > were option 1: all nose imports were buried in functions that ran during
> > testing. Whether or not that was by intent I don't know. But having an
> > explicit consensus on 2, which seems to be the case here, is helpful
> > because it allows better use of pytest fixtures.
>
> I guess the question is about shipping new pytest fixtures as a part of
> the public API of numpy.testing, for use by 3rd party projects.
>

Agreed. That's a different question, and I'd prefer to keep things as they
are in that respect. Otherwise it's basically a hard dependency of numpy
itself on pytest.


> If the issue is only with Numpy's own tests, they can import stuff from
> a private submodule that's not imported by "import numpy.testing", so it
> does not introduce a dependency.
>
> (Similar thing for the public API might also be possible e.g. "import
> numpy.testing.pytest_fixtures" but it comes at the cost of a new
> submodule.)
>
> So I guess a main question actually is: how much of the public API in
> numpy.testing should be ported to pytest for use by 3rd party projects?
>
> The numerical assert functions are obviously useful.
>
> The warnings suppression (pytest warning stuff IIRC doesn't deal with
> warning registries nor work around the bugs in warnings.catch_warnings)
> similarly --- it could make sense to actually upstream it...
>

> But I'm not so clear about the rest.
>

Agreed, nothing in the decorators that obviously needs a pytest-based
implementation. The Tester class may be the one thing.
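
Pauli's private-submodule idea could look something like this sketch (module
name invented):

    # numpy/testing/_pytest_fixtures.py -- hypothetical private module,
    # imported only by numpy's own test files, so "import numpy.testing"
    # itself never pulls in pytest.
    import pytest
    import numpy as np

    @pytest.fixture
    def rng():
        # A seeded RandomState shared by tests that request the fixture.
        return np.random.RandomState(1234)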

Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] Fwd: [NumFOCUS Projects] Important: 2017 NumFOCUS Summit & Sustainability Workshop

2017-07-20 Thread Ralf Gommers
Hi all,

This is a public service announcement: for the first time NumFOCUS is
organising a sustainability workshop, details below. NumPy is able to send
two representatives; the NumPy steering committee chose Chuck and
Nathaniel. I'll be present as well as NumFOCUS board member. Last year
there was a NumFOCUS summit as well, but no sustainability workshop. It
looks like going forward there will be one of these a year. If there's more
interest than tickets next year we'll look at rotating the NumPy
representatives.

Cheers,
Ralf



-- Forwarded message --
From: Christie Koehler 
Date: Thu, Jun 29, 2017 at 10:52 AM
Subject: [NumFOCUS Projects] Important: 2017 NumFOCUS Summit &
Sustainability Workshop
To: "proje...@numfocus.org" 


Dear Project Leads,

Following up from our “save the date” note a few weeks ago, here is our
formal invitation to the 2017 NumFOCUS Summit & Sustainability Workshop
taking place in *Austin, Texas on 10-11 October, 2017*.

All participants planning to attend should register no later than July
19th, 2017.

New at this year’s Summit is our first ever Sustainability Workshop.

The goal of the Workshop is to guide project leads and core contributors in
identifying an appropriate sustainability plan for their project as well as
the initial steps required to start implementing that plan.

We would like to have at least 1-2 representatives from each NumFOCUS
sponsored project attend the workshop. Members of the NumFOCUS Board of
Directors, Advisory Council, and Sustainability Advisory Board who would
like to attend are welcome to do so.

Project Leads are tasked with selecting which project representative(s)
should attend the Summit/Workshop, according to their governance practices.
Once you’ve selected your reps, you may simply forward this email to them
and ask them to register. You do not need to otherwise inform NumFOCUS of
your choice prior.

--
N.B. — FYI, NumFOCUS is hoping to arrange an event for the Austin data
science community to benefit from the presence of so many core maintainers
of the scientific computing stack. This would take place immediately prior
to the NumFOCUS Summit.

No details are available yet — if we succeed in securing a venue, NumFOCUS
staff will be in touch to invite your participation. We just wanted to give
you a friendly heads-up as you consider your travel plans. Direct questions
about this event to i...@numfocus.org.

--

FAQ (Please review before registering):

What am I committing to if I register for the Summit/Workshop?

Those attending the Summit/Workshop should plan to attend all day on the
10th and 11th. You can view the tentative schedule for the Summit/Workshop
here.


Additionally, each project sending a representative to the Summit should
commit to preparing and delivering a 5-minute lightning talk about the
state of their project. NumFOCUS staff will be available to help with these
presentations.

Who do you recommend we send to represent our project?

While we recognize that people have multiple roles, we recommend sending 1
person who can represent the technical aspects of your project (such as a
person who develops or maintains code) and 1 person who can represent the
community/business aspects (such as a person who works with the
user/developer community).

Representatives can be paid or unpaid contributors, but should be
sufficiently invested and involved in your project such that they will both
want to and are able to continue the sustainability work started at the
Workshop.

Can we send more than 2 reps from our project?

Maybe. Our travel budget is based on two people from each project. That
number is also what we are basing our venue, catering, and other logistical
arrangements upon.

Once we start collecting registrations, however, we may find that not all
projects can send two representatives and/or that some projects can cover
the cost of travel for one or more of their reps.

So, a max of two registrations for each project will be available and
thereafter anyone from your project wanting to attend will be given the
option of being added to the waitlist. Because registrations are
“first-come, first-served” you should make sure your first two choices of
representative register before subsequent ones.

Are you covering travel expenses for project reps? For those traveling from
outside the United States, too?

Yes. We have a budget for reimbursing participants for travel-related
costs, even those needing to travel from outside the United States. To
ensure the best use of these funds, we’ll ask you when you register if you
need NumFOCUS to reimburse you for travel.

For all participants, we will handle hotel reservations and payments. We
will book hotel accommodations for all participants for three ni

[Numpy-discussion] NumPy steering councils members

2017-07-20 Thread Ralf Gommers
Hi all,

It has been well over a year since we put together the governance structure
and steering council (
https://docs.scipy.org/doc/numpy-dev/dev/governance/people.html#governance-people).
We haven't reviewed the people on the steering council in that time. Based
on the criteria for membership I would like to make the following
suggestion (note, not discussed with everyone in private beforehand):

Adding the following people to the steering council:
- Eric Wieser
- Marten van Kerkwijk
- Stephan Hoyer
- Allan Haldane

Removing the following people from the steering council due to inactivity:
- Alex Griffing

Note that I've tried to contact Alex directly before, but he has not
replied and has 0 activity on GitHub for the last year. I will try once
more though.

Thoughts?

Cheers,
Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] F2PY problems with PGI compilers

2017-08-11 Thread Ralf Gommers
On Sat, Aug 5, 2017 at 7:24 AM, Jeff Layton  wrote:

> Good afternoon!
>
> I'm trying to build a Python module using F2PY on a simple Fortran code
> using the PGI 17.4 community compilers.
>
> I'm using Conda 4.3.21 with Python 2.7.13 and F2PY 2. The command line I'm
> using is,
>
>
> f2py --compiler=pg --fcompiler=pg -c -m mdevice mdevice.f90
>
>
> The output from f2py is at the end of the email. Any suggestions are
> greatly appreciated.
>

--compiler=pg seems wrong, that specifies the C/C++ compiler to use not the
Fortran compiler. Hence you get the error "don't know how to compile C/C++
code on platform 'posix' with 'pg' compiler". Try just leaving that off
(thereby using the default C compiler you have installed, probably gcc).

Ralf




> Thanks!
>
> Jeff
>
>
> Output from f2py:
>
>
>
> running build
> running config_cc
> unifing config_cc, config, build_clib, build_ext, build commands
> --compiler options
> running config_fc
> unifing config_fc, config, build_clib, build_ext, build commands
> --fcompiler options
> running build_src
> build_src
> building extension "mdevice" sources
> f2py options: []
> f2py:> /tmp/tmptN1fdp/src.linux-x86_64-2.7/mdevicemodule.c
> creating /tmp/tmptN1fdp/src.linux-x86_64-2.7
> Reading fortran codes...
> Reading file 'mdevice.f90' (format:free)
> Post-processing...
> Block: mdevice
> Block: devicequery
> In: :mdevice:mdevice.f90:devicequery
> get_useparameters: no module cudafor info used by devicequery
> Post-processing (stage 2)...
> Building modules...
> Building module "mdevice"...
> Constructing wrapper function "devicequery"...
>   devicequery()
> Wrote C/API module "mdevice" to file "/tmp/tmptN1fdp/src.linux-x86_
> 64-2.7/mdevicemodule.c"
>   adding '/tmp/tmptN1fdp/src.linux-x86_64-2.7/fortranobject.c' to sources.
>   adding '/tmp/tmptN1fdp/src.linux-x86_64-2.7' to include_dirs.
> copying 
> /home/laytonjb/anaconda2/lib/python2.7/site-packages/numpy/f2py/src/fortranobject.c
> -> /tmp/tmptN1fdp/src.linux-x86_64-2.7
> copying 
> /home/laytonjb/anaconda2/lib/python2.7/site-packages/numpy/f2py/src/fortranobject.h
> -> /tmp/tmptN1fdp/src.linux-x86_64-2.7
> build_src: building npy-pkg config files
> running build_ext
> error: don't know how to compile C/C++ code on platform 'posix' with 'pg'
> compiler
>
>
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] F2PY problems with PGI compilers

2017-08-14 Thread Ralf Gommers
On Mon, Aug 14, 2017 at 7:24 AM, Jeff Layton  wrote:

> +SciPy list
>
>
>
> On Sat, Aug 5, 2017 at 7:24 AM, Jeff Layton  wrote:
>
>> Good afternoon!
>>
>> I'm trying to build a Python module using F2PY on a simple Fortran code
>> using the PGI 17.4 community compilers.
>>
>> I'm using Conda 4.3.21 with Python 2.7.13 and F2PY 2. The command line
>> I'm using is,
>>
>>
>> f2py --compiler=pg --fcompiler=pg -c -m mdevice mdevice.f90
>>
>>
>> The output from f2py is at the end of the email. Any suggestions are
>> greatly appreciated.
>>
>
> --compiler=pg seems wrong, that specifies the C/C++ compiler to use not
> the Fortran compiler. Hence you get the error "don't know how to compile
> C/C++ code on platform 'posix' with 'pg' compiler". Try just leaving that
> off (thereby using the default C compiler you have installed, probably gcc).
>
>
> Ralf - thanks for the response! I had tried that before and F2PY still
> thinks it's using the PGI C compiler:
>
>
> running build
> running config_cc
> unifing config_cc, config, build_clib, build_ext, build commands
> --compiler options
> running config_fc
> unifing config_fc, config, build_clib, build_ext, build commands
> --fcompiler options
> running build_src
> build_src
> building extension "mdevice" sources
> f2py options: []
> f2py:> /tmp/tmpkxCUbk/src.linux-x86_64-2.7/mdevicemodule.c
> creating /tmp/tmpkxCUbk/src.linux-x86_64-2.7
> Reading fortran codes...
> Reading file 'mdevice.f90' (format:free)
> Post-processing...
> Block: mdevice
> Block: devicequery
> In: :mdevice:mdevice.f90:devicequery
> get_useparameters: no module cudafor info used by devicequery
> Post-processing (stage 2)...
> Building modules...
> Building module "mdevice"...
> Constructing wrapper function "devicequery"...
>   devicequery()
> Wrote C/API module "mdevice" to file "/tmp/tmpkxCUbk/src.linux-x86_
> 64-2.7/mdevicemodule.c"
>   adding '/tmp/tmpkxCUbk/src.linux-x86_64-2.7/fortranobject.c' to sources.
>   adding '/tmp/tmpkxCUbk/src.linux-x86_64-2.7' to include_dirs.
> copying 
> /home/laytonjb/anaconda2/lib/python2.7/site-packages/numpy/f2py/src/fortranobject.c
> -> /tmp/tmpkxCUbk/src.linux-x86_64-2.7
> copying 
> /home/laytonjb/anaconda2/lib/python2.7/site-packages/numpy/f2py/src/fortranobject.h
> -> /tmp/tmpkxCUbk/src.linux-x86_64-2.7
> build_src: building npy-pkg config files
> running build_ext
> customize UnixCCompiler
> customize UnixCCompiler using build_ext
> customize PGroupFCompiler
> Found executable /opt/pgi/linux86-64/pgidir/pgf90
> Found executable /opt/pgi/linux86-64/pgidir/pgf77
> Found executable /opt/pgi/linux86-64/17.4/bin/pgfortran
> customize PGroupFCompiler using build_ext
> building 'mdevice' extension
> compiling C sources
> C compiler: /opt/pgi/linux86-64/pgidir/pgcc -fno-strict-aliasing -g -O2
> -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC
>
> creating /tmp/tmpkxCUbk/tmp
> creating /tmp/tmpkxCUbk/tmp/tmpkxCUbk
> creating /tmp/tmpkxCUbk/tmp/tmpkxCUbk/src.linux-x86_64-2.7
> compile options: '-I/tmp/tmpkxCUbk/src.linux-x86_64-2.7
> -I/home/laytonjb/anaconda2/lib/python2.7/site-packages/numpy/core/include
> -I/home/laytonjb/anaconda2/include/python2.7 -c'
> pgcc: /tmp/tmpkxCUbk/src.linux-x86_64-2.7/mdevicemodule.c
> pgcc-Error-Unknown switch: -fno-strict-aliasing
> pgcc-Error-Unknown switch: -fwrapv
> pgcc-Error-Unknown switch: -Wall
> pgcc-Error-Unknown switch: -Wstrict-prototypes
> pgcc-Error-Unknown switch: -fno-strict-aliasing
> pgcc-Error-Unknown switch: -fwrapv
> pgcc-Error-Unknown switch: -Wall
> pgcc-Error-Unknown switch: -Wstrict-prototypes
> error: Command "/opt/pgi/linux86-64/pgidir/pgcc -fno-strict-aliasing -g
> -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC
> -I/tmp/tmpkxCUbk/src.linux-x86_64-2.7 -I/home/laytonjb/anaconda2/
> lib/python2.7/site-packages/numpy/core/include 
> -I/home/laytonjb/anaconda2/include/python2.7
> -c /tmp/tmpkxCUbk/src.linux-x86_64-2.7/mdevicemodule.c -o
> /tmp/tmpkxCUbk/tmp/tmpkxCUbk/src.linux-x86_64-2.7/mdevicemodule.o" failed
> with exit status 1
>
>
>
> I'm definitely at a loss here. I have no idea how to make F2PY work with
> the PGI compilers. I'm beginning to think F2PY is completely borked unless
> you use the defaults (gcc).
>

That's not the case. Here is an example when using the Intel Fortran
compiler together with either MSVC or Intel C compilers:
https://software.intel.com/en-us/articles/building-numpyscipy-with-intel-mkl-and-intel-fortran-on-windows

I notice there that in all cases the C compiler is explicitly specified.
Did you also try ``--compiler=gcc --fcompiler=pg``?

Also, I'm not sure how often this is done with f2py directly; I've only
ever used the --fcompiler flag via ``python setup.py config
--fcompiler=..``, invoking f2py under the hood. It could be that doing
this directly is indeed broken (or was never supported in the first place).
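
For reference, that would look roughly like the sketch below (untested with
PGI; the module and file names are simply the ones from this thread):

# setup.py -- a minimal numpy.distutils sketch; numpy.distutils invokes
# f2py under the hood for Fortran sources
from numpy.distutils.core import setup, Extension

ext = Extension(name='mdevice', sources=['mdevice.f90'])
setup(name='mdevice', ext_modules=[ext])

run as ``python setup.py config --fcompiler=pg build_ext --inplace``.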

Ralf



>
> Thanks!
>
> Jeff
>
>
>
>
>
>
>> Thanks!
>>
>> J

Re: [Numpy-discussion] F2PY problems with PGI compilers

2017-08-14 Thread Ralf Gommers
On Tue, Aug 15, 2017 at 2:19 AM, Jeff Layton  wrote:

> On 08/14/2017 03:51 AM, Ralf Gommers wrote:
>
>
>
>
>
>>
>>
>> I'm definitely at a loss here. I have no idea how to make F2PY work with
>> the PGI compilers. I'm beginning to think F2PY is completely borked unless
>> you use the defaults (gcc).
>>
>
> That's not the case. Here is an example when using the Intel Fortran
> compiler together with either MSVC or Intel C compilers:
> https://software.intel.com/en-us/articles/building-numpyscipy-with-intel-mkl-and-intel-fortran-on-windows
>
> I notice there that in all cases the C compiler is explicitly specified.
> Did you also try ``--compiler=gcc --fcompiler=pg``?
>
> Also, I'm not sure how often this is done with f2py directly; I've only
> ever used the --fcompiler flag via ``python setup.py config
> --fcompiler=..``, invoking f2py under the hood. It could be that doing
> this directly is indeed broken (or was never supported in the first place).
>
> Ralf
>
>
>
> Point taken. I don't use Windows too much and I don't use the Intel
> compiler any more (it's not free for non-commercial use :)  ).
>
> I tried using "--compiler=gcc --fcompiler=pg" and I get the same answer at
> the very end.
>
>
> running build_ext
> error: don't know how to compile C/C++ code on platform 'posix' with 'gcc'
> compiler
>
>
> Good point about f2py. I'm using the Anaconda distribution of f2py and
> that may have limitations with respect to the PGI compiler. I may download
> the f2py source and build it to include PGI support. Maybe that will fix
> the problem.
>

That won't make a difference, all the build config code is pure Python.
Anaconda will give you the same results as building from source.

Ralf




> Thanks!
>
> Jeff
>
>
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] NumPy default citation

2017-09-08 Thread Ralf Gommers
On Wed, Sep 6, 2017 at 8:29 AM, Stefan van der Walt 
wrote:

> On Tue, Sep 5, 2017, at 13:25, Charles R Harris wrote:
>
>
> On Tue, Sep 5, 2017 at 12:36 PM, Stefan van der Walt  wrote:
>
> Shall we add a citation to Travis's "Guide to NumPy (2nd ed.)" on both
>
>
> What is the citation for?
>
>
> It's the suggested reference to add to your paper, if you use the NumPy
> package in your work.
>

+1 for changing the recommended citation to Guide to NumPy now.

I do think that we're kind of wasting those citations though. I'm not an
academic, but for those contributors who are, citations of a paper that is
indexed (counts towards h-index etc.) can be very important. So probably we
should find the time to write a paper, with Travis still as first author
but with all core devs & major contributors on it.

Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Dropping support for Accelerate

2017-09-15 Thread Ralf Gommers
On Sat, Jul 22, 2017 at 10:50 PM, Ilhan Polat  wrote:

> A few months ago, I had the innocent intention to wrap LDLt decomposition
> routines of LAPACK into SciPy but then I am made aware that the minimum
> required version of LAPACK/BLAS was due to Accelerate framework. Since then
> I've been following the core SciPy team and others' discussion on this
> issue.
>
> We have been exchanging opinions for quite a while now within various
> SciPy issues and PRs about the ever-increasing Accelerate-related issues
> and I've compiled a brief summary about the ongoing discussions to reduce
> the clutter.
>
> First, I would like to kindly invite everyone to contribute and sharpen
> the cases presented here
>
> https://github.com/scipy/scipy/wiki/Dropping-support-for-Accelerate
>
> The reason I specifically wanted to post this also in NumPy mailing list
> is to probe for the situation from the NumPy-Accelerate perspective. Is
> there any NumPy-specific problem that would indirectly affect SciPy should
> support for Accelerate be dropped?
>

An update on this: discussion on https://github.com/scipy/scipy/pull/6051
has mostly converged, and we're about to decide to start requiring a higher
LAPACK version (after 1.0, no changes for the next release). Looks like
that'll be LAPACK 3.4.0 for now.

Cheers,
Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] ANN: SciPy 1.0 beta release

2017-09-17 Thread Ralf Gommers
Hi all,

I'm excited to be able to announce the availability of the first beta
release of Scipy 1.0. This is a big release, and a version number that has
been 16 years in the making. It contains a few more deprecations and
backwards incompatible changes than an average release. Therefore please do
test it on your own code, and report any issues on the Github issue tracker
or on the scipy-dev mailing list.

Sources: https://github.com/scipy/scipy/releases/tag/v1.0.0b1
Binary wheels: will follow tomorrow, I'll announce those when ready
(TravisCI is under maintenance right now)

Thanks to everyone who contributed to this release!

Ralf




Release notes (full notes including authors, closed issued and merged PRs
at the Github Releases link above):

==
SciPy 1.0.0 Release Notes
==

.. note:: Scipy 1.0.0 is not released yet!

.. contents::

SciPy 1.0.0 is the culmination of 8 months of hard work. It contains
many new features, numerous bug-fixes, improved test coverage and
better documentation.  There have been a number of deprecations and
API changes in this release, which are documented below.  All users
are encouraged to upgrade to this release, as there are a large number
of bug-fixes and optimizations.  Moreover, our development attention
will now shift to bug-fix releases on the 1.0.x branch, and on adding
new features on the master branch.

Some of the highlights of this release are:

- Major build improvements.  Windows wheels are available on PyPI for the
  first time, and continuous integration has been set up on Windows and OS X
  in addition to Linux.
- A set of new ODE solvers and a unified interface to them
  (`scipy.integrate.solve_ivp`).
- Two new trust region optimizers and a new linear programming method, with
  improved performance compared to what `scipy.optimize` offered previously.
- Many new BLAS and LAPACK functions were wrapped.  The BLAS wrappers are
  now complete.

This release requires Python 2.7 or 3.4+ and NumPy 1.8.2 or greater.

This is also the last release to support LAPACK 3.1.x - 3.3.x.  Moving the
lowest supported LAPACK version to >3.2.x was long blocked by Apple Accelerate
providing the LAPACK 3.2.1 API.  We have decided that it's time to either drop
Accelerate or, if there is enough interest, provide shims for functions added
in more recent LAPACK versions so it can still be used.


New features


`scipy.cluster` improvements


`scipy.cluster.hierarchy.optimal_leaf_ordering`, a function to reorder a
linkage matrix to minimize distances between adjacent leaves, was added.


`scipy.fftpack` improvements


N-dimensional versions of the discrete sine and cosine transforms and their
inverses were added as ``dctn``, ``idctn``, ``dstn`` and ``idstn``.
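
For example, a minimal sketch (the input array is made up; with
``norm='ortho'`` the round trip is exact):

import numpy as np
from scipy.fftpack import dctn, idctn

x = np.random.rand(4, 4)
y = dctn(x, norm='ortho')                      # N-D DCT, type 2 by default
assert np.allclose(idctn(y, norm='ortho'), x)  # inverse recovers x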


`scipy.integrate` improvements
--

A set of new ODE solvers have been added to `scipy.integrate`.  The
convenience
function `scipy.integrate.solve_ivp` allows uniform access to all solvers.
The individual solvers (``RK23``, ``RK45``, ``Radau``, ``BDF`` and
``LSODA``)
can also be used directly.
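
For illustration, a minimal sketch of the unified interface (the ODE and the
values are made up):

from scipy.integrate import solve_ivp

# Integrate dy/dt = -0.5*y on t in [0, 10] with y(0) = 2, using RK45.
sol = solve_ivp(lambda t, y: -0.5 * y, (0, 10), [2.0], method='RK45')
print(sol.t[-1], sol.y[0, -1])  # final time and solution value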


`scipy.linalg` improvements


The BLAS wrappers in `scipy.linalg.blas` have been completed.  Added functions
are ``*gbmv``, ``*hbmv``, ``*hpmv``, ``*hpr``, ``*hpr2``, ``*spmv``, ``*spr``,
``*tbmv``, ``*tbsv``, ``*tpmv``, ``*tpsv``, ``*trsm``, ``*trsv``, ``*sbmv``,
and ``*spr2``.

Wrappers for the LAPACK functions ``*gels``, ``*stev``, ``*sytrd``,
``*hetrd``,
``*sytf2``, ``*hetrf``, ``*sytrf``, ``*sycon``, ``*hecon``, ``*gglse``,
``*stebz``, ``*stemr``, ``*sterf``, and ``*stein`` have been added.

The function `scipy.linalg.subspace_angles` has been added to compute the
subspace angles between two matrices.
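
A small usage sketch (the subspaces are made up):

import numpy as np
from scipy.linalg import subspace_angles

A = np.eye(4)[:, :2]           # subspace spanned by e1, e2
B = np.eye(4)[:, 1:3]          # subspace spanned by e2, e3
theta = subspace_angles(A, B)  # angles in radians; 0 for the shared e2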

The function `scipy.linalg.clarkson_woodruff_transform` has been added.
It finds low-rank matrix approximation via the Clarkson-Woodruff Transform.

The functions `scipy.linalg.eigh_tridiagonal` and
`scipy.linalg.eigvalsh_tridiagonal`, which find the eigenvalues and
eigenvectors of tridiagonal hermitian/symmetric matrices, were added.
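
For example (a sketch with an arbitrary 3x3 tridiagonal matrix):

import numpy as np
from scipy.linalg import eigh_tridiagonal

d = np.array([2.0, 2.0, 2.0])  # main diagonal
e = np.array([-1.0, -1.0])     # off-diagonal
w, v = eigh_tridiagonal(d, e)  # eigenvalues w (ascending), eigenvectors v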


`scipy.ndimage` improvements


Support for homogeneous coordinate transforms has been added to
`scipy.ndimage.affine_transform`.
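
For instance, a 2-D transform can now be passed as a single (ndim+1, ndim+1)
homogeneous matrix (a sketch; the shift values are arbitrary):

import numpy as np
from scipy.ndimage import affine_transform

img = np.arange(25.0).reshape(5, 5)
# 3x3 homogeneous matrix: identity linear part plus a shift of (1, 2)
matrix = np.array([[1.0, 0.0, 1.0],
                   [0.0, 1.0, 2.0],
                   [0.0, 0.0, 1.0]])
out = affine_transform(img, matrix)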

The ``ndimage`` C code underwent a significant refactoring, and is now
a lot easier to understand and maintain.


`scipy.optimize` improvements
-

The methods ``trust-region-exact`` and ``trust-krylov`` have been added to
the
function `scipy.optimize.minimize`. These new trust-region methods solve the
subproblem with higher accuracy at the cost of more Hessian factorizations
(compared to dogleg) or more matrix vector products (compared to ncg) but
usually require less nonlinear iterations and are able to deal with
indefinite
Hessians. They seem very competitive against the other Newton methods
implemented in scipy.

`scipy.optimize.linprog` gained an inte

Re: [Numpy-discussion] ANN: SciPy 1.0 beta release

2017-09-18 Thread Ralf Gommers
On Mon, Sep 18, 2017 at 3:12 AM, Thomas Caswell  wrote:

> It seems major versions are in the air!
>
> For matplotlib 2.0 we put together
> http://matplotlib.org/users/dflt_style_changes.html for the style changes
> which shows the new behavior, the old behavior, and how to get the old
> behavior back.
>

We certainly didn't make that many backwards incompatible changes (very few
in fact, mostly removing long deprecated code), but yes - we'll do
something more than the regular announcement email for the final 1.0
release.

Ralf


>
> Tom
>
> On Sun, Sep 17, 2017 at 10:48 AM Ilhan Polat  wrote:
>
>> Well also thank you Ralf, for going through all those issues one by one
>> from all kinds of topics. Must be really painstakingly tedious.
>>
>>
>> On Sun, Sep 17, 2017 at 12:48 PM, Ralf Gommers 
>> wrote:
>>
>>> Hi all,
>>>
>>> I'm excited to be able to announce the availability of the first beta
>>> release of Scipy 1.0. This is a big release, and a version number that
>>> has been 16 years in the making. It contains a few more deprecations and
>>> backwards incompatible changes than an average release. Therefore please do
>>> test it on your own code, and report any issues on the Github issue tracker
>>> or on the scipy-dev mailing list.
>>>
>>> Sources: https://github.com/scipy/scipy/releases/tag/v1.0.0b1
>>> Binary wheels: will follow tomorrow, I'll announce those when ready
>>> (TravisCI is under maintenance right now)
>>>
>>> Thanks to everyone who contributed to this release!
>>>
>>> Ralf
>>>
>>>
>>>
>>>
>>> Release notes (full notes including authors, closed issued and merged
>>> PRs at the Github Releases link above):
>>>
>>> [snip]
>>>
>>
>>>
>>> ___
>>> NumPy-Discussion mailing list
>>> NumPy-Discussion@python.org
>>> https://mail.python.org/mailman/listinfo/numpy-discussion
>>>
>>> ___
>> NumPy-Discussion mailing list
>> NumPy-Discussion@python.org
>> https://mail.python.org/mailman/listinfo/numpy-discussion
>>
>
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion
>
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] ANN: SciPy 1.0 beta release

2017-09-18 Thread Ralf Gommers
On Sun, Sep 17, 2017 at 10:48 PM, Ralf Gommers 
wrote:

> Hi all,
>
> I'm excited to be able to announce the availability of the first beta
> release of Scipy 1.0. This is a big release, and a version number that
> has been 16 years in the making. It contains a few more deprecations and
> backwards incompatible changes than an average release. Therefore please do
> test it on your own code, and report any issues on the Github issue tracker
> or on the scipy-dev mailing list.
>
> Sources: https://github.com/scipy/scipy/releases/tag/v1.0.0b1
> Binary wheels: will follow tomorrow, I'll announce those when ready
> (TravisCI is under maintenance right now)
>

Binary wheels for Windows, Linux and OS X (for all supported Python
versions, 32-bit and 64-bit) can be found at http://wheels.scipy.org. To
install directly with pip:

pip install scipy=='1.0.0b1' -f http://wheels.scipy.org --trusted-host wheels.scipy.org

(add --user and/or --upgrade as required to that command). Alternatively,
just download the wheel you need and do `pip install
scipy-1.0.0b1-.whl`.

Cheers,
Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] [SciPy-User] ANN: SciPy 1.0 beta release

2017-09-19 Thread Ralf Gommers
On Mon, Sep 18, 2017 at 10:36 PM, Matthew Brett 
wrote:

> Hi,
>
> On Mon, Sep 18, 2017 at 11:14 AM, Ralf Gommers 
> wrote:
> >
> >
> > On Mon, Sep 18, 2017 at 10:11 PM, Matthew Brett  >
> > wrote:
> >>
> >> Hi,
> >>
> >> On Mon, Sep 18, 2017 at 11:07 AM, Thomas Kluyver 
> wrote:
> >> > On 18 September 2017 at 10:59, Ralf Gommers 
> >> > wrote:
> >> >>
> >> >> Binary wheels for Windows, Linux and OS X (for all supported Python
> >> >> versions, 32-bit and 64-bit) can be found at http://wheels.scipy.org
> .
> >> >> To
> >> >> install directly with pip:
> >> >>
> >> >> pip install scipy=='1.0.0b1' -f http://wheels.scipy.org
> >> >> --trusted-host
> >> >> wheels.scipy.org
> >> >
> >> >
> >> > I don't want to criticise the hard work that has gone into making this
> >> > available, but I'm disappointed that we're telling people to install
> >> > software over an insecure HTTP connection.
> >>
> >> I personally prefer the following recipe:
> >>
> >> pip install -f
> >> https://3f23b170c54c2533c070-1c8a9b3114517dc5fe17b7c3f8c63a43.ssl.cf2.rackcdn.com
> >> scipy=='1.0.0b1'
> >>
> >> > Can the wheels not be uploaded to PyPI?
> >>
> >> Sounds like a good idea.  I can do that - any objections?
> >
> >
> > That would be helpful Matthew, I'm about to sign off for today.
>
> Done - new instructions for testing:
>
> pip install --pre --upgrade scipy
>

Thanks Matthew! Replying to all lists with the better install instructions.

Cheers,
Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Proposal - change to OpenBLAS for Windows wheels

2017-09-25 Thread Ralf Gommers
On Tue, Sep 26, 2017 at 6:48 AM, Nathaniel Smith  wrote:

> Makes sense to me.
>
> On Sep 25, 2017 05:54, "Matthew Brett"  wrote:
>
>> Hi,
>>
>> I suggest we switch from ATLAS to OpenBLAS for our Windows wheels:
>>
>> * OpenBLAS is much faster, at least when Tony Kelman tested it last year
>> [1];
>> * We now have an automated Appveyor build for OpenBLAS [2, 3];
>> * Tests are passing with 32-bit and 64-bit wheels [4];
>> * The next Scipy release will have OpenBLAS wheels;
>>
>> Any objections / questions / alternatives?
>>
>
+1

Ralf


>> Cheers,
>>
>> Matthew
>>
>> [1] https://github.com/numpy/numpy/issues/5479#issuecomment-185033668
>> [2] https://github.com/matthew-brett/build-openblas
>> [3] https://ci.appveyor.com/project/matthew-brett/build-openblas
>> [4] https://ci.appveyor.com/project/matthew-brett/numpy-wheels/build/1.0.50
>> ___
>> NumPy-Discussion mailing list
>> NumPy-Discussion@python.org
>> https://mail.python.org/mailman/listinfo/numpy-discussion
>>
>
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion
>
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] ANN: first SciPy 1.0.0 release candidate

2017-09-27 Thread Ralf Gommers
Hi all,

I'm excited to be able to announce the availability of the first release
candidate of Scipy 1.0. This is a big release, and a version number that
has been 16 years in the making. It contains a few more deprecations and
backwards incompatible changes than an average release. Therefore please do
test it on your own code, and report any issues on the Github issue tracker
or on the scipy-dev mailing list.

Sources and binary wheels can be found at https://pypi.python.org/pypi/scipy
and https://github.com/scipy/scipy/releases/tag/v1.0.0rc1. To install with
pip:

pip install --pre --upgrade scipy

Thanks to everyone who contributed to this release!

Ralf



Pull requests merged after v1.0.0b1:

- `#7876 `__: GEN: Add comments
to the tests for clarification
- `#7891 `__: ENH: backport #7879
to 1.0.x
- `#7902 `__: MAINT: signal: Make
freqz handling of multidim. arrays match...
- `#7905 `__: REV: restore
wminkowski
- `#7908 `__: FIX: Avoid bad
``__del__`` (close) behavior
- `#7918 `__: TST: mark two
optimize.linprog tests as xfail. See gh-7877.
- `#7929 `__: MAINT: changed
defaults to lower in sytf2, sytrf and hetrf
- `#7938 `__: MAINT: backports
from 1.0.x
- `#7939 `__: Fix umfpack solver
construction for win-amd64




==
SciPy 1.0.0 Release Notes
==

.. note:: Scipy 1.0.0 is not released yet!

.. contents::

SciPy 1.0.0 is the culmination of 8 months of hard work. It contains
many new features, numerous bug-fixes, improved test coverage and
better documentation.  There have been a number of deprecations and
API changes in this release, which are documented below.  All users
are encouraged to upgrade to this release, as there are a large number
of bug-fixes and optimizations.  Moreover, our development attention
will now shift to bug-fix releases on the 1.0.x branch, and on adding
new features on the master branch.

Some of the highlights of this release are:

- Major build improvements.  Windows wheels are available on PyPI for the
  first time, and continuous integration has been set up on Windows and OS X
  in addition to Linux.
- A set of new ODE solvers and a unified interface to them
  (`scipy.integrate.solve_ivp`).
- Two new trust region optimizers and a new linear programming method, with
  improved performance compared to what `scipy.optimize` offered previously.
- Many new BLAS and LAPACK functions were wrapped.  The BLAS wrappers are
  now complete.

This release requires Python 2.7 or 3.4+ and NumPy 1.8.2 or greater.

This is also the last release to support LAPACK 3.1.x - 3.3.x.  Moving the
lowest supported LAPACK version to >3.2.x was long blocked by Apple Accelerate
providing the LAPACK 3.2.1 API.  We have decided that it's time to either drop
Accelerate or, if there is enough interest, provide shims for functions added
in more recent LAPACK versions so it can still be used.


New features


`scipy.cluster` improvements


`scipy.cluster.hierarchy.optimal_leaf_ordering`, a function to reorder a
linkage matrix to minimize distances between adjacent leaves, was added.


`scipy.fftpack` improvements


N-dimensional versions of the discrete sine and cosine transforms and their
inverses were added as ``dctn``, ``idctn``, ``dstn`` and ``idstn``.


`scipy.integrate` improvements
--

A set of new ODE solvers have been added to `scipy.integrate`.  The
convenience
function `scipy.integrate.solve_ivp` allows uniform access to all solvers.
The individual solvers (``RK23``, ``RK45``, ``Radau``, ``BDF`` and
``LSODA``)
can also be used directly.


`scipy.linalg` improvements


The BLAS wrappers in `scipy.linalg.blas` have been completed.  Added functions
are ``*gbmv``, ``*hbmv``, ``*hpmv``, ``*hpr``, ``*hpr2``, ``*spmv``, ``*spr``,
``*tbmv``, ``*tbsv``, ``*tpmv``, ``*tpsv``, ``*trsm``, ``*trsv``, ``*sbmv``,
and ``*spr2``.

Wrappers for the LAPACK functions ``*gels``, ``*stev``, ``*sytrd``,
``*hetrd``,
``*sytf2``, ``*hetrf``, ``*sytrf``, ``*sycon``, ``*hecon``, ``*gglse``,
``*stebz``, ``*stemr``, ``*sterf``, and ``*stein`` have been added.

The function `scipy.linalg.subspace_angles` has been added to compute the
subspace angles between two matrices.

The function `scipy.linalg.clarkson_woodruff_transform` has been added.
It finds low-rank matrix approximation via the Clarkson-Woodruff Transform.

The functions `scipy.linalg.eigh_tridiagonal` and
`scipy.linalg.eigvalsh_tridiagonal`, which find the eigenvalues and
eigenvectors of tridiagonal hermitian/symmetric matrice

Re: [Numpy-discussion] Sustainability

2017-10-04 Thread Ralf Gommers
On Thu, Oct 5, 2017 at 4:03 AM, Benjamin Root  wrote:

> One thing that concerns me is trying to keep up with demand. Our tools
> have become extremely popular, but it is very difficult for maintainers to
> keep up with this demand. So, we seem to have a tendency to "tribalize", in
> a sense, focusing on the demand for our respective pet projects. Various
> projects have created excellent tools for better managing the high demand
> such as circleci doc build views, back-porting bots, lint checking
> settings, vim/emacs settings, etc. These are important tools to help
> maintainers, and other projects need to know about them.
>

Yes, this is a great topic. We informally share these kinds of tools and
techniques between projects, but there's no central place for any of them
nor docs other than "read my code and yml config files".


> Perhaps these dev tools should get centrally managed? Maybe we should have
> a "developer's conference"? It would be good to learn from others their
> techniques and workflows that make them so efficient. I swear, some of you
> have time-turners or cloning machines!
>
> Cheers!
> Ben Root
>
>
> On Wed, Oct 4, 2017 at 10:42 AM, Joseph Fox-Rabinovitz <
> jfoxrabinov...@gmail.com> wrote:
>
>> Could you elaborate on the purpose of the meeting, or perhaps point to
>> a link with a description if there is one? Sustainability is a very
>> broad topic. What do you plan on discussing?
>>
>
It's going to be a broad workshop, anything from dev tools to finding new
maintainers, the role of community managers, and obtaining funding is in
scope. Part of the preparation for organizing the workshop was interviews
with a core developer from every project.

I'd be interested in the replies to Chuck's question to get a sense of what
the community thinks are NumPy's key challenges to remain (or become ...) a
sustainable project in the years to come.

Ralf



>
>> -Joe
>>
>> On Tue, Oct 3, 2017 at 7:04 PM, Charles R Harris
>>  wrote:
>> > Hi All,
>> >
>> > I and a number of others representing various open source projects
>> under the
>> > NumFocus umbrella will be attending as meeting next Tuesday do discuss
>> the
>> > problem of sustainability. In preparation for that meeting I would be
>> > interested in any ideas that the folks who follow this list may have on
>> the
>> > subject.
>> >
>> > Chuck
>> >
>> > ___
>> > NumPy-Discussion mailing list
>> > NumPy-Discussion@python.org
>> > https://mail.python.org/mailman/listinfo/numpy-discussion
>> >
>> ___
>> NumPy-Discussion mailing list
>> NumPy-Discussion@python.org
>> https://mail.python.org/mailman/listinfo/numpy-discussion
>>
>
>
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion
>
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] [SciPy-Dev] Sustainability

2017-10-05 Thread Ralf Gommers
On Fri, Oct 6, 2017 at 8:24 AM, Pauli Virtanen  wrote:

> On Thu, 2017-10-05 at 00:08 +0200, Ilhan Polat wrote:
> [clip]
> > 2. Feature completeness of basic modules.
>
> I think those judgments can be subjective, but it is true many things
> have been organically grown, and this becomes even more so as the
> number of contributors grows (fielding other people's PRs already takes
> a lot of time). I think most people send something that they need there
> and then, and it is not possible to tell them to go do something else
> instead. So even if there are many contributors, it's less clear how
> available they are for implementing "the plan".
>
> Regardless, I would recommend anyone who knows something obvious is
> missing that obviously should be included, to send a PR adding it to
> the roadmap:
>
> https://github.com/scipy/scipy/blob/master/doc/ROADMAP.rst.txt
>
> This document has not really been updated significantly since the
> single brainstorm in a Scipy conference many years ago, and we did not
> go through the contents then in great detail.
>

I did update it several times, removing things that were implemented or no
longer relevant. However you're right that especially on the new features
front it could use more inputs.

Ralf



> Pauli
>
> ___
> SciPy-Dev mailing list
> scipy-...@python.org
> https://mail.python.org/mailman/listinfo/scipy-dev
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] ANN: second SciPy 1.0.0 release candidate

2017-10-18 Thread Ralf Gommers
Hi all,

I'm excited to be able to announce the availability of the second (and
hopefully last) release candidate of Scipy 1.0. This is a big release, and
a version number that has been 16 years in the making. It contains a few
more deprecations and backwards incompatible changes than an average
release. Therefore please do test it on your own code, and report any
issues on the Github issue tracker or on the scipy-dev mailing list.

Sources and binary wheels can be found at https://pypi.python.org/pypi/scipy
and https://github.com/scipy/scipy/releases/tag/v1.0.0rc2. To install with
pip:

pip install --pre --upgrade scipy

The most important issue fixed after v1.0.0rc1 is
https://github.com/scipy/scipy/issues/7969 (a missing DLL in the Windows wheels).

Pull requests merged after v1.0.0rc1:

- `#7948 `__: DOC: add note on
checking for deprecations before upgrade to...
- `#7952 `__: DOC: update SciPy
Roadmap for 1.0 release and recent discussions.
- `#7960 `__: BUG: optimize:
revert changes to bfgs in gh-7165
- `#7962 `__: TST: special: mark
a failing hyp2f1 test as xfail
- `#7973 `__: BUG: fixed keyword
in 'info' in ``_get_mem_available`` utility
- `#7986 `__: TST: Relax
test_trsm precision to 5 decimals
- `#8001 `__: TST: fix test
failures from Matplotlib 2.1 update
- `#8010 `__: BUG: signal: fix
crash in lfilter
- `#8019 `__: MAINT: fix test
failures with NumPy master

Thanks to everyone who contributed to this release!

Ralf



==
SciPy 1.0.0 Release Notes
==

.. note:: Scipy 1.0.0 is not released yet!

.. contents::

SciPy 1.0.0 is the culmination of 8 months of hard work. It contains
many new features, numerous bug-fixes, improved test coverage and
better documentation.  There have been a number of deprecations and
API changes in this release, which are documented below.  All users
are encouraged to upgrade to this release, as there are a large number
of bug-fixes and optimizations.  Moreover, our development attention
will now shift to bug-fix releases on the 1.0.x branch, and on adding
new features on the master branch.

Some of the highlights of this release are:

- Major build improvements.  Windows wheels are available on PyPI for the
  first time, and continuous integration has been set up on Windows and OS X
  in addition to Linux.
- A set of new ODE solvers and a unified interface to them
  (`scipy.integrate.solve_ivp`).
- Two new trust region optimizers and a new linear programming method, with
  improved performance compared to what `scipy.optimize` offered previously.
- Many new BLAS and LAPACK functions were wrapped.  The BLAS wrappers are
  now complete.

This release requires Python 2.7 or 3.4+ and NumPy 1.8.2 or greater.

This is also the last release to support LAPACK 3.1.x - 3.3.x.  Moving the
lowest supported LAPACK version to >3.2.x was long blocked by Apple Accelerate
providing the LAPACK 3.2.1 API.  We have decided that it's time to either drop
Accelerate or, if there is enough interest, provide shims for functions added
in more recent LAPACK versions so it can still be used.


New features


`scipy.cluster` improvements


`scipy.cluster.hierarchy.optimal_leaf_ordering`, a function to reorder a
linkage matrix to minimize distances between adjacent leaves, was added.


`scipy.fftpack` improvements


N-dimensional versions of the discrete sine and cosine transforms and their
inverses were added as ``dctn``, ``idctn``, ``dstn`` and ``idstn``.


`scipy.integrate` improvements
--

A set of new ODE solvers have been added to `scipy.integrate`.  The
convenience
function `scipy.integrate.solve_ivp` allows uniform access to all solvers.
The individual solvers (``RK23``, ``RK45``, ``Radau``, ``BDF`` and
``LSODA``)
can also be used directly.


`scipy.linalg` improvements


The BLAS wrappers in `scipy.linalg.blas` have been completed.  Added functions
are ``*gbmv``, ``*hbmv``, ``*hpmv``, ``*hpr``, ``*hpr2``, ``*spmv``, ``*spr``,
``*tbmv``, ``*tbsv``, ``*tpmv``, ``*tpsv``, ``*trsm``, ``*trsv``, ``*sbmv``,
and ``*spr2``.

Wrappers for the LAPACK functions ``*gels``, ``*stev``, ``*sytrd``,
``*hetrd``,
``*sytf2``, ``*hetrf``, ``*sytrf``, ``*sycon``, ``*hecon``, ``*gglse``,
``*stebz``, ``*stemr``, ``*sterf``, and ``*stein`` have been added.

The function `scipy.linalg.subspace_angles` has been added to compute the
subspace angles between two matrices.

The function `scipy.linalg.clarkson_woodruff_transform` has been added.
It finds low-rank matrix approximation via

Re: [Numpy-discussion] numpy grant update

2017-10-19 Thread Ralf Gommers
On Thu, Oct 19, 2017 at 10:02 AM, Charles R Harris <
charlesr.har...@gmail.com> wrote:

>
>
> On Wed, Oct 18, 2017 at 11:24 PM, Nathaniel Smith  wrote:
>
>> Hi all,
>>
>> I wanted to give everyone an update on what's going on with the NumPy
>> grant [1]. As you may have noticed, things have been moving a bit
>> slower than originally hoped -- unfortunately my health is improving
>> but has continued to be rocky [2].
>>
>> Fortunately, I have awesome co-workers, and BIDS has an institutional
>> interest/mandate for figuring out how to make these things happen, so
>> after thinking it over we've decided to reorganize how we're doing
>> things internally and split up the work to let me focus on the core
>> technical/community aspects without getting overloaded. Specifically,
>> Fernando Pérez and Jonathan Dugan [3] are taking on PI/administration
>> duties, Stéfan van der Walt will focus on handling day-to-day
>> management of the incoming hires, and Nelle Varoquaux & Jarrod Millman
>> will also be joining the team (exact details TBD).
>>
>> This shouldn't really affect any of you, except that you might see
>> some familiar faces with @berkeley.edu emails becoming more engaged.
>> I'm still leading the Berkeley effort, and in any case it's still
>> ultimately the community and NumPy steering council who will be making
>> decisions about the project – this is just some internal details about
>> how we're planning to manage our contributions. But in the interest of
>> full transparency I figured I'd let you know what's happening.
>>
>> In other news, the job ad to start the official hiring process has now
>> been submitted for HR review, so it should hopefully be up soon --
>> depending on how efficient the bureaucracy is. I'll definitely let
>> everyone know as soon as its posted.
>>
>> I'll also be giving a lunch talk at BIDS tomorrow to let folks locally
>> know about what's going on, which I think will be recorded – I'll send
>> around a link after in case others are interested.
>>
>> -n
>>
>> [1] https://mail.python.org/pipermail/numpy-discussion/2017-May/076818.html
>> [2] https://vorpus.org/blog/emerging-from-the-underworld/
>> [3] https://bids.berkeley.edu/people/jonathan-dugan
>>
>
> Thanks for the update.
>

Thanks Nathaniel. I'm looking forward to all of those people getting
involved. Hiring always takes longer than you want, but next year the pace
of development promises to pick up significantly:)

Ralf



> Chuck
>
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion
>
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] SciPy 1.0 released!

2017-10-25 Thread Ralf Gommers
Hi all,

We are extremely pleased to announce the release of SciPy 1.0, 16 years after
version 0.1 saw the light of day.  It has been a long, productive journey to
get here, and we anticipate many more exciting new features and releases in
the future.


Why 1.0 now?


A version number should reflect the maturity of a project - and SciPy was a
mature and stable library that is heavily used in production settings for a
long time already.  From that perspective, the 1.0 version number is long
overdue.

Some key project goals, both technical (e.g. Windows wheels and continuous
integration) and organisational (a governance structure, code of conduct and a
roadmap), have been achieved recently.

Many of us are a bit perfectionist, and therefore are reluctant to call
something "1.0" because it may imply that it's "finished" or "we are 100%
happy with it".  This is normal for many open source projects, however that
doesn't make it right.  We acknowledge to ourselves that it's not perfect, and
there are some dusty corners left (that will probably always be the case).
Despite that, SciPy is extremely useful to its users, on average has high
quality code and documentation, and gives the stability and backwards
compatibility guarantees that a 1.0 label implies.


Some history and perspectives
-

- 2001: the first SciPy release
- 2005: transition to NumPy
- 2007: creation of scikits
- 2008: scipy.spatial module and first Cython code added
- 2010: moving to a 6-monthly release cycle
- 2011: SciPy development moves to GitHub
- 2011: Python 3 support
- 2012: adding a sparse graph module and unified optimization interface
- 2012: removal of scipy.maxentropy
- 2013: continuous integration with TravisCI
- 2015: adding Cython interface for BLAS/LAPACK and a benchmark suite
- 2017: adding a unified C API with scipy.LowLevelCallable; removal of
scipy.weave
- 2017: SciPy 1.0 release


**Pauli Virtanen** is SciPy's Benevolent Dictator For Life (BDFL).  He says:

*Truthfully speaking, we could have released a SciPy 1.0 a long time ago, so
I'm happy we do it now at long last. The project has a long history, and
during the years it has matured also as a software project.  I believe it has
well proved its merit to warrant a version number starting with unity.*

*Since its conception 15+ years ago, SciPy has largely been written by and for
scientists, to provide a box of basic tools that they need. Over time, the set
of people active in its development has undergone some rotation, and we have
evolved towards a somewhat more systematic approach to development.
Regardless, this underlying drive has stayed the same, and I think it will
also continue propelling the project forward in future. This is all good,
since not long after 1.0 comes 1.1.*

**Travis Oliphant** is one of SciPy's creators.  He says:

*I'm honored to write a note of congratulations to the SciPy developers and
the entire SciPy community for the release of SciPy 1.0.  This release
represents a dream of many that has been patiently pursued by a stalwart group
of pioneers for nearly 2 decades.  Efforts have been broad and consistent over
that time from many hundreds of people.  From initial discussions to efforts
coding and packaging to documentation efforts to extensive conference and
community building, the SciPy effort has been a global phenomenon that it has
been a privilege to participate in.*

*The idea of SciPy was already in multiple people’s minds in 1997 when I first
joined the Python community as a young graduate student who had just fallen in
love with the expressibility and extensibility of Python.  The internet was
just starting to bring together like-minded mathematicians and scientists in
nascent electronically-connected communities.  In 1998, there was a concerted
discussion on the matrix-SIG, python mailing list with people like Paul
Barrett, Joe Harrington, Perry Greenfield, Paul Dubois, Konrad Hinsen, David
Ascher, and others.  This discussion encouraged me in 1998 and 1999 to
procrastinate my PhD and spend a lot of time writing extension modules to
Python that mostly wrapped battle-tested Fortran and C-code making it
available to the Python user.  This work attracted the help of others like
Robert Kern, Pearu Peterson and Eric Jones who joined their efforts with mine
in 2000 so that by 2001, the first SciPy release was ready.  This was long
before Github simplified collaboration and input from others and the "patch"
command and email was how you helped a project improve.*

*Since that time, hundreds of people have spent an enormous amount of time
improving the SciPy library and the community surrounding this library has
dramatically grown. I stopped being able to participate actively in developing
the SciPy library around 2010.  Fortunately, at that time, Pauli Virtanen and
Ralf Gommers picked up the pace of development sup

Re: [Numpy-discussion] NumPy 1.14 branch.

2017-11-06 Thread Ralf Gommers
On Sat, Nov 4, 2017 at 3:56 PM, Charles R Harris 
wrote:

> Hi All,
>
> I'd like to branch NumPy 1.14 soon.
>

Sounds good.

> Before doing so, I'd like to make sure at a minimum that
>
> 1) Changes in array print formatting are done.
> 2) Proposed deprecations have been make.
>
> If there are other things that folks see as essential, now is the time to
> speak up.
>

Are we good on the pytest status? I see
https://github.com/numpy/numpy/pull/9386 is still open.

Ralf


> Chuck
>
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion
>
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Proposal of timeline for dropping Python 2.7 support

2017-11-06 Thread Ralf Gommers
On Mon, Nov 6, 2017 at 7:25 AM, Charles R Harris 
wrote:

> Hi All,
>
> Thought I'd toss this out there. I'm tending towards better sooner than
> later in dropping Python 2.7 support as we are starting to run up against
> places where we would like to use Python 3 features. That is particularly
> true on Windows where the 2.7 compiler is really old and lacks C99
> compatibility.
>

This is probably the most pressing reason to drop 2.7 support. We seem to
be expending a lot of effort lately on this stuff. I was previously
advocating being more conservative than the timeline you now propose, but
this is the pain point that I think gets me over the line.

> In any case, the timeline I've been playing with is to keep Python 2.7
> support through 2018, which given our current pace, would be for NumPy 1.15
> and 1.16. After that 1.16 would become a long term support release with
> backports of critical bug fixes up until the time that Python 2.7 support
> officially ends. In that timeline, NumPy 1.17 would drop support for 2.7.
>

And 3.4 at the same time or even earlier.

> That proposed schedule is subject to change pending developments and
> feedback.
>

+1


> The main task I think is needed before dropping 2.7 is better handling of
> unicode strings and bytes. There is the #4208 PR that makes a start on that.
>

Yep, at the very least we need one release that supports 2.7 *and* has
fixed all the IO issues on 3.x

Ralf


> If there are other things that folks think are essential, please mention
> them here. If nothing else, we can begin planning for the transition even
> if the schedule changes.
>
> Chuck
>
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@python.org
> https://mail.python.org/mailman/listinfo/numpy-discussion
>
>
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Proposal of timeline for dropping Python 2.7 support

2017-11-08 Thread Ralf Gommers
On Thu, Nov 9, 2017 at 12:15 PM, Nathaniel Smith  wrote:

> On Nov 8, 2017 16:51, "Matthew Brett"  wrote:
>
> Hi,
>
> On Wed, Nov 8, 2017 at 7:08 PM, Julian Taylor
>  wrote:
> > On 06.11.2017 11:10, Ralf Gommers wrote:
> >>
> >>
> >> On Mon, Nov 6, 2017 at 7:25 AM, Charles R Harris
> >> mailto:charlesr.har...@gmail.com>> wrote:
> >>
> >> Hi All,
> >>
> >> Thought I'd toss this out there. I'm tending towards better sooner
> >> than later in dropping Python 2.7 support as we are starting to run
> >> up against places where we would like to use Python 3 features. That
> >> is particularly true on Windows where the 2.7 compiler is really old
> >> and lacks C99 compatibility.
> >>
> >>
> >> This is probably the most pressing reason to drop 2.7 support. We seem
> >> to be expending a lot of effort lately on this stuff. I was previously
> >> advocating being more conservative than the timeline you now propose,
> >> but this is the pain point that I think gets me over the line.
> >
> >
> > Would dropping python2 support for windows earlier than the other
> > platforms be a reasonable approach?
> > I am not a big fan of dropping python2 support before 2020, but I
> > have no issue with dropping python2 support on windows earlier as it is
> > our largest pain point.
>
> I wonder about this too.  I can imagine there are a reasonable number
> of people using older Linux distributions on which they cannot upgrade
> to a recent Python 3,
>
>
> My impression is that this is increasingly rare, actually. I believe RHEL
> is still shipping 2.6 by default, which we've already dropped support for,
> and if you want RH python then they provide supported 2.7 and 3.latest
> through exactly the same channels. Ubuntu 14.04 is end-of-life in April
> 2019, so pretty irrelevant if we're talking about 2019 for dropping
> support, and 16.04 ships with 3.5. Plus with docker, conda, PPAs, etc.,
> getting a recent python is easier than its ever been.
>
> > but is that likely to be true for Windows?
>
> We'd have to make sure we could persuade pypi to give the older
> version for Windows, by default - I don't know if that is possible.
>
>
> Currently it's not – if pip doesn't see a Windows wheel, it'll try
> downloading and building an sdist. There's a mechanism for sdists to
> declare what version of python they support but (thanks to the jupyter
> folks for implementing this), but that's all. The effect is that if we
> release a version that drops support for py2 entirely, then 'pip install'
> on py2 will continue to work and give the last supported version, but if we
> release a version that drops py2 on Windows but keeps it on other platforms
> then 'pip install' on py2 on Windows will just stop working entirely.
>
> This is possible to fix – it's just software – but I'm not volunteering...
>

Given the release cycle of pip (slow) and the bandwidth required to
implement this, I think that this is likely a showstopper for
Windows-only-3.x-only.

Another consideration is that choices made by numpy tend to propagate to
the rest of the ecosystem, and support for Python versions that's
OS-independent is nicer than Windows special-casing.

And yet another is that when we do finally drop 2.7, I think we'd want to
get the full benefits of doing so. That's new 3.x features (@ in
particular), cleaning up lots of support code, etc.
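
As a reminder, @ is the PEP 465 matrix-multiplication operator, which only
exists on Python 3.5+. A tiny sketch:

import numpy as np

a = np.arange(6.0).reshape(2, 3)
b = np.arange(12.0).reshape(3, 4)
c = a @ b  # equivalent to np.matmul(a, b); a SyntaxError on Python 2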

For those reasons I think we should balance the pain and benefits of 2.7
support and just pick a date to drop it completely, not just on Windows.

Regarding http://www.python3statement.org/: I'd say that as long as there
are people who want to spend their energy on the LTS release (contributors
*and* enough maintainer power to review/merge/release), we should not
actively prevent them from doing that.

Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@python.org
https://mail.python.org/mailman/listinfo/numpy-discussion

