Re: [Python-Dev] PEP 539 v3: A new C API for Thread-Local Storage in CPython

2017-09-09 Thread Erik Bray
On Fri, Sep 8, 2017 at 4:37 PM, Nick Coghlan  wrote:
> On 8 September 2017 at 00:30, Masayuki YAMAMOTO
>  wrote:
>> Hi folks,
>>
>> I submit PEP 539 third draft for the finish. Thank you for all the advice
>> and the help!
>
> Thank you Erik & Yamamoto-san for all of your work on this PEP!
>
> The updates look good, so I'm happy to say as BDFL-Delegate that this
> proposal is now accepted :)

Thanks Nick!  It's great to have this issue resolved in a
carefully-considered, well-documented manner.

Best,
Erik


[Python-Dev] Clarifying Cygwin support in CPython

2017-11-08 Thread Erik Bray
Hi folks,

As some people here know I've been working off and on for a while to
improve CPython's support of Cygwin.  I'm motivated in part by a need
to have software working on Python 3.x on Cygwin for the foreseeable
future, preferably with minimal graft.  (As an incidental side-effect
Python's test suite--especially of system-level functionality--serves
as an interesting test suite for Cygwin itself too.)

This is partly what motivated PEP 539 [1], although that PEP had the
advantage of benefiting other POSIX-compatible platforms as well (and
in fact was fixing an aspect of CPython that made it unfriendly to
supporting other platforms).

As far as I can tell, the first commit to Python to add any kind of
support for Cygwin was made by Guido (committing a contributed patch)
back in 1999 [2].  Since then, bits and pieces have been added for
Cygwin's benefit over time, with varying degrees of impact in terms of
#ifdefs and the like (for the most part Cygwin does not require *much*
in the way of special support, but it does have some differences from
a "normal" POSIX-compliant platform, such as the possibility for
case-insensitive filesystems and executables that end in .exe).  I
don't know whether it's ever been "officially supported" but someone
with a longer memory of the project can comment on that.  I'm not sure
if it was discussed at all or not in the context of PEP 11.

I have personally put in a fair amount of effort already in either
fixing issues on Cygwin (many of these issues also impact MinGW), or
more often than not fixing issues in the CPython test suite on
Cygwin--these are mostly tests that are broken due to invalid
assumptions about the platform (for example, that there is always a
"root" user with uid=0; this is not the case on Cygwin).  In other
cases some tests need to be skipped or worked around due to
platform-specific bugs, and Cygwin is hardly the only case of this in
the test suite.
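
As a concrete sketch of the kind of fix I mean (the helper and test
names here are hypothetical, just for illustration), such a test can
guard the uid-0 assumption instead of hard-coding it:

    import pwd
    import unittest

    def root_user_exists():
        # Most POSIX systems have a passwd entry for uid 0, but Cygwin
        # typically does not, so look it up instead of assuming it.
        try:
            pwd.getpwuid(0)
        except KeyError:
            return False
        return True

    class ExampleOwnershipTest(unittest.TestCase):
        @unittest.skipUnless(root_user_exists(),
                             "requires a uid-0 user (absent on Cygwin)")
        def test_lookup_root(self):
            self.assertEqual(pwd.getpwuid(0).pw_uid, 0)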

I also have an experimental AppVeyor configuration for running the
tests on Cygwin [3], as well as an experimental buildbot (not
available on the internet, but working).  These currently rely on a
custom branch that includes fixes needed for the test suite to run to
completion without crashing or hanging (e.g.
https://bugs.python.org/issue31885).  It would be nice to add this as
an official buildbot, but I'm not sure if it makes sense to do that
until it's "green", or at least not crashing.  I have several other
patches to the tests toward this goal, and am currently down to ~22
tests failing.

Before I do any more work on this, however, it would be best to once
and for all clarify the support for Cygwin in CPython, as it has never
been "officially supported" nor unsupported--this way we can avoid
having this discussion every time a patch related to Cygwin comes up.
I could provide some arguments for why I believe Cygwin should be
supported, but before this gets too long I'd just like to float the
idea of having the discussion in the first place.  It's also not
exactly clear to me how to meet the standards in PEP 11 for supporting
a platform--in particular it's not clear when a buildbot is considered
"stable", or how to achieve that without getting necessary fixes
merged into the main branch in the first place.

Thanks,
Erik



[1] https://www.python.org/dev/peps/pep-0539/
[2] 
https://github.com/python/cpython/commit/717d1fdf2acbef5e6b47d9b4dcf48ef1829be685
[3] https://ci.appveyor.com/project/embray/cpython


Re: [Python-Dev] Clarifying Cygwin support in CPython

2017-11-09 Thread Erik Bray
On Wed, Nov 8, 2017 at 5:28 PM, Zachary Ware
 wrote:
> On Wed, Nov 8, 2017 at 8:39 AM, Erik Bray  wrote:
>> a platform--in particular it's not clear when a buildbot is considered
>> "stable", or how to achieve that without getting necessary fixes
>> merged into the main branch in the first place.
>
> I think in this context, "stable" just means "keeps a connection to
> the buildbot master and doesn't blow up when told to build" :).  As
> such, I'm ready to get you added to the fleet whenever you are.

"Doesn't blow up when told to build" is the tricky part, because there
are a few tests that are known to cause the test suite process to hang
until killed.  It's not clear to me whether, even with the --timeout
option, the test runner will kill hanging processes (I haven't
actually tried this though so I'll double-check, but I'm pretty sure
it does not).

So until at least those issues are resolved I'd hesitate to call it "stable".

Thanks,
Erik


[Python-Dev] Question on a seemingly useless doctest

2017-12-19 Thread Erik Bray
Hi all,

I have a ticket [1] that's hung up on a failure in one doctest in the
form of sage.doctest.sources.FileDocTestSource._test_enough_doctests.

This test has been there, it seems, for as long as the current
doctest framework has been in place, and nobody seems to have
questioned it.  Its expected output is generated from the Sage sources
themselves, and can change when tests are added to or removed from any
module (if any of those tests should be "skipped").  Over the years
the expected output of this test has just been updated as necessary.

But taking a closer look at the test--and I could be mistaken--it's
not even a useful test.  It's *attempting* to validate that the
doctest parser skips tests when it's supposed to.  But it performs
this validation by...implementing its own, less robust doctest parser,
and comparing the results of that to the results of the real doctest
parser.  Sometimes--in fact often--the comparison is wrong (as the
test itself acknowledges).

This doesn't seem to me a correct or useful way to validate the
doctest parser.  If there are cases that the real doctest parser
should be tested against, then unit tests/regression tests should be
written that simply test the real doctest parser against those cases
and check the results.  Having essentially a real doctest parser, and
a "fake" one that's incorrect doesn't make sense to me, unless there's
something about this I'm misunderstanding.

I would propose to just remove the test.  If there are any actual
regressions it's responsible for catching then more focused regression
tests should be written for those cases.
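
To make concrete what I mean by a focused test--this is just an
illustrative sketch using the stdlib doctest.DocTestParser, since
Sage's parser has its own API and its own skip markers--something
along these lines tests the real parser directly against a known
input and checks exactly what it extracted:

    import doctest

    SOURCE = '''
    >>> 2 + 2
    4
    >>> print("hello")
    hello
    '''

    # Run the real parser and check precisely what it extracted.
    examples = doctest.DocTestParser().get_examples(SOURCE)
    assert [ex.source.strip() for ex in examples] == ['2 + 2', 'print("hello")']
    assert [ex.want for ex in examples] == ['4\n', 'hello\n']

Cases that should be skipped would get their own inputs and their own
expected parser output, with no second parser in the loop.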

Erik


[1] https://trac.sagemath.org/ticket/24261#comment:24


Re: [Python-Dev] Question on a seemingly useless doctest

2017-12-19 Thread Erik Bray
Sorry, completely fat-fingered my autocomplete and sent it to the wrong list.

On Tue, Dec 19, 2017 at 12:12 PM, Erik Bray  wrote:
> Hi all,
>
> I have a ticket [1] that's hung up on a failure in one doctest in the
> form of sage.doctest.sources.FileDocTestSource._test_enough_doctests.
>
> This test has been there, it seems, for as long as the current
> doctest framework has been in place, and nobody seems to have
> questioned it.  Its expected output is generated from the Sage sources
> themselves, and can change when tests are added to or removed from any
> module (if any of those tests should be "skipped").  Over the years
> the expected output of this test has just been updated as necessary.
>
> But taking a closer look at the test--and I could be mistaken--it's
> not even a useful test.  It's *attempting* to validate that the
> doctest parser skips tests when it's supposed to.  But it performs
> this validation by...implementing its own, less robust doctest parser,
> and comparing the results of that to the results of the real doctest
> parser.  Sometimes--in fact often--the comparison is wrong (as the
> test itself acknowledges).
>
> This doesn't seem to me a correct or useful way to validate the
> doctest parser.  If there are cases that the real doctest parser
> should be tested against, then unit tests/regression tests should be
> written that simply test the real doctest parser against those cases
> and check the results.  Having essentially a real doctest parser, and
> a "fake" one that's incorrect doesn't make sense to me, unless there's
> something about this I'm misunderstanding.
>
> I would propose to just remove the test.  If there are any actual
> regressions it's responsible for catching then more focused regression
> tests should be written for those cases.
>
> Erik
>
>
> [1] https://trac.sagemath.org/ticket/24261#comment:24


Re: [Python-Dev] Heap allocate type structs in native extension modules?

2017-12-28 Thread Erik Bray
On Tue, Dec 26, 2017 at 3:00 PM, Benjamin Peterson  wrote:
> I imagine Cython already takes care of this?

This appears to have a distinct purpose, albeit not unrelated to
Cython.  The OP's program would generate boilerplate C code for
extension types, the rest of which would perhaps be implemented by hand
in C.  Cython does this as well to an extent, but the generated code
contains quite a bit of Cython-specific cruft and is not really meant
to be edited by hand or read by humans in most cases.

Anyways I don't think this answers the OP's question.

> On Tue, Dec 26, 2017, at 02:16, Hugh Fisher wrote:
>> I have a Python program which generates the boilerplate code for
>> native extension modules from a Python source definition.
>> (http://bitbucket.org/hugh_fisher/fullofeels if interested.)
>>
>> The examples in the Python doco and the "Python Essential Reference"
>> book all use a statically declared PyTypeObject struct and
>> PyType_Ready in the module init func, so I'm doing the same. Then
>> Python 3.5 added a check for statically allocated types inheriting
>> from heap types, which broke a couple of my classes. And now I'm
>> trying to add a __dict__ to native classes so end users can add their
>> own attributes, and this is turning out to be painful with static
>> PyTypeObject structs
>>
>> Would it be better to use dynamically allocated type structs in native
>> modules?


Re: [Python-Dev] Translating sample programs in documentation

2018-04-16 Thread Erik Bray
On Mon, Apr 16, 2018 at 4:49 AM, Shell Xu  wrote:
> Well, I'm not sure whether or not this is what you're looking for, but pep-8
> (https://www.python.org/dev/peps/pep-0008/) suggest like this:
>
> For Python 3.0 and beyond, the following policy is prescribed for the
> standard library (see PEP 3131): All identifiers in the Python standard
> library MUST use ASCII-only identifiers, and SHOULD use English words
> wherever feasible (in many cases, abbreviations and technical terms are used
> which aren't English). In addition, string literals and comments must also
> be in ASCII. The only exceptions are (a) test cases testing the non-ASCII
> features, and (b) names of authors. Authors whose names are not based on the
> Latin alphabet (latin-1, ISO/IEC 8859-1 character set) MUST provide a
> transliteration of their names in this character set.
>
> So, I guess translating symbols to Chinese is not going to help readers figure
> out what kind of code they should be writing...


That only applies to the Python stdlib itself.  Python's support for
Unicode identifiers is a feature in its own right, and there's certainly
nothing in PEP 8 against using them outside the standard library.

I think it's a great idea; I'm not sure how it works out technically
in terms of providing .po files for .rst documentation or if there's
some better mechanism for that...

Best,
E


> On Mon, Apr 16, 2018 at 12:41 AM, Xuan Wu
>  wrote:
>>
>> Excuse me if this was discussed before, but in French and Japanese
>> translations, all the sample programs seem to have identifiers in English
>> still. According to "PEP 545 -- Python Documentation Translations", as I
>> understand .po files are used for translations. May I ask if there's
>> technical restrictions causing translations being only applied to the text
>> parts?
>>
>> For example, here's the first sample program in 4.2:
>>
>> >>> # Measure some strings:
>> ... words = ['cat', 'window', 'defenestrate']
>> >>> for w in words:
>> ... print(w, len(w))
>> ...
>> cat 3
>> window 6
>> defenestrate 12
>>
>> Here's a possible translation in Chinese:
>>
>> >>> # 丈量一些字符串
>> ... 词表 = ['猫', '窗户', '丢出窗户']
>> >>> for 词 in 词表:
>> ... print(词, len(词))
>> ...
>> 猫 1
>> 窗户 2
>> 丢出窗户 4
>>
>> As you may notice the strings differ in size if they are translated
>> directly. Obviously that does add extra burden to review the new sample
>> programs to assure effectiveness and readability.
>> Any suggestion or comments are welcome.
>>
>>
>> Thanks,
>> Xuan.
>>
>
>
>
> --
> The joints have spaces between them, and the blade of the knife has no
> thickness; insert what has no thickness where there is space, and there is
> ample room for the blade to roam.
> blog: http://shell909090.org/
> twitter: @shell909090
> about.me: http://about.me/shell909090
>


Re: [Python-Dev] PEP 493: Redistributor guidance for Python 2.7 HTTPS

2015-07-06 Thread Erik Bray
On Mon, Jul 6, 2015 at 6:21 AM, Antoine Pitrou  wrote:
> On Mon, 6 Jul 2015 14:22:46 +1000
> Nick Coghlan  wrote:
>>
>> The main change from the last version discussed on python-ideas
>
> Was it discussed there? That list has become totally useless, I've
> stopped following it.

Considering that a useful discussion of a useful PEP occurred there
(not to mention other occasionally useful discussions) I'd say that
such a value judgment is not only unnecessary but also inaccurate.
That's fine if it's uninteresting to you and you don't want to follow
it, but let's please avoid judgments on entire mailing lists and, by
extension, the people holding conversations there.

Thanks,
Erik


Re: [Python-Dev] How far to go with user-friendliness

2015-07-20 Thread Erik Bray
On Tue, Jul 14, 2015 at 6:22 PM, Robert Collins
 wrote:
> For clarity, I think we should:
>  - remove the assret check, it is I think spurious.
>  - add a set of functions to the mock module that should be used in
> preference to Mock.assert*
>  - mark the Mock.assert* functions as PendingDeprecation
>  - in 3.6 move the PendingDeprecation to Deprecated
>  - in 3.7 remove the Mock.assert* functions and the check for method
> names beginning with assert entirely.

Hi all,

I'm just an onlooker, and haven't read every word of this thread.  In
fact I worry that it may be pointless to reply here rather than starting a
new thread.  I just wanted to make sure that the specific message I'm
replying to wasn't lost in the noise, because I think Robert's
suggestion makes vastly more sense than anything else I've seen here
(I came searching through the thread to see if anyone else suggested
this before I started a thread to do so).

I don't think it makes any sense to have magic assert_ methods on the
Mock object.  Not only does the "magic" clearly lead to too many
ambiguities and other problems--I think they make less sense from an
API standpoint in the first place.  Typically asserting something in a
test is not something an object *does*--a method.  More often we as
test writers assert something *about* an object.  The assertion is an
external tool meant to measure and introspect things about the system
under observation.  In this case, although Mock is itself a testing
tool, we're introspecting something about the Mock object as external
observers.

***Assertions on Mock objects should be implemented as stand-alone
functions (that happen to be used primarily on Mock objects as
input).***

Aside from, in my mind, making more sense philosophically, using
specialized assert functions for this has absolutely none of the
spelling ambiguities or other problems of the magic methods.
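
Just to sketch what I mean (the helper here is hypothetical, not an
existing mock API):

    from unittest import mock

    def assert_called_once_with(m, *args, **kwargs):
        # Introspect the mock from the outside rather than calling a
        # magic method *on* it.
        if m.call_count != 1:
            raise AssertionError(
                "expected exactly one call, got %d" % m.call_count)
        expected = mock.call(*args, **kwargs)
        if m.call_args != expected:
            raise AssertionError(
                "called with %r, expected %r" % (m.call_args, expected))

    m = mock.Mock()
    m("ping", retries=3)
    assert_called_once_with(m, "ping", retries=3)  # passes

A misspelled function name here is an immediate NameError rather than a
silently-passing no-op, which is exactly the failure mode the magic
methods are trying to guard against.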

I'm -0 on removing the assret_ methods unless no one is using them
yet.  I don't care if they're there as long as they're deprecated
along with the other magic methods of Mock.

Best,
Erik


Re: [Python-Dev] VS 2010 compiler

2015-09-28 Thread Erik Bray
On Fri, Sep 25, 2015 at 6:27 PM, Chris Barker - NOAA Federal
 wrote:
>
> You can use "Windows SDK for Windows 7 and .NET Framework 4".
>
> http://www.microsoft.com/en-us/download/details.aspx?id=8279
>
>
> Thanks. Last time I tried that route, it was for 64 bit py2.7. And it
> required some kludging of environment variables, and registry acces I don't
> have permission for.
>
> But it still may be the best bet. I'll give it a try when I have a chance.
>
> And this should be in the "official" docs...

For what it's worth, I've had good luck compiling *most* extension
modules on Windows using the gcc from MinGW-w64.

The big notable exception was that last time I tried compiling Numpy
with it I got a segfaulting Numpy.  But I never had a chance to
investigate why or if it's fixable.  My own extension modules work
fine on Windows when compiled in MinGW though.

Erik B.


> On Sat, Sep 26, 2015 at 12:24 AM, Chris Barker - NOAA Federal
>  wrote:
>>
>> As I understand it, the MS VS2010 compiler is required (or at least
>> best practice) for compiling Python extensions for the python.org
>> Windows builds of py 3.4 and ?[1]
>>
>> However, MS now makes it very hard (impossible?) to download VS2010
>> Express ( or Community, or whatever the free as in beer version is
>> called).
>>
>> I realize that this is not python-dev's responsibility, but if there
>> is any way to either document where it can be found, or put a bit of
>> pressure on MS to make it available, as they have for VS2008 and
>> py2.7, that would be great.
>>
>> Sorry to bug this list, I didn't know where else to reach out to.
>>
>> -Chris
>>
>> [1] it's actually pretty hard to find out which compiler version is
>> used for which python version. And has been for years. Would a patch
>> to the docs, probably here:
>>
>> https://docs.python.org/3.4/using/windows.html#compiling-python-on-windows
>>
>> Be considered?
>
>
>
>
> --
> INADA Naoki  
>
>


[Python-Dev] Issue with DLL import library installation in Cygwin

2016-04-22 Thread Erik Bray
Hi all,

I've been working on compiling/installing Python on Cygwin and have
hit upon an odd issue in the Makefile that seems to have been around
for as long as there's been Cygwin support in it.

When building Python on Cygwin, both a libpython-X.Y.dll and a
libpython-X.Y.dll.a are created.  The latter is an "import library"
consisting of stubs for functions in the DLL so that it can be linked
to statically when building, for example, extension modules.

The odd bit is that in the altbininstall target (see [1]) if the
$(DLLLIBRARY) variable is defined then only it is installed, while
$(LDLIBRARY) (which in this case references the import library) is
*not* installed, except in $(prefix)/lib/pythonX.Y/config, which is
not normally on the linker search path, or even included by
python-config --ldflags.  Therefore static linking to libpython fails,
unless the search path is explicitly modified, or a symlink is created
from $(prefix)/lib/pythonX.Y/config/libpython.dll.a to $(prefix)/lib.

In fact Cygwin's own package for Python manually creates the latter
symlink in its install script.  But it's not clear why Python's
Makefile doesn't install this file in the first place.  In other
words, why not install $LDLIBRARY regardless?
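
For reference, the mismatch is easy to see from Python itself.  This is
hypothetical output from a Cygwin build (exact names and paths vary by
version and configuration):

    >>> import sysconfig
    >>> sysconfig.get_config_var('LDLIBRARY')   # the import library
    'libpython3.5m.dll.a'
    >>> sysconfig.get_config_var('LIBPL')       # where it ends up installed
    '/usr/lib/python3.5/config-3.5m'

Nothing puts that config directory on the default linker search path,
hence the need for the symlink into $(prefix)/lib.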

Thanks,
Erik


[1] https://hg.python.org/cpython/file/496e094f4734/Makefile.pre.in#l1097


Re: [Python-Dev] Proposed: The Great Argument Clinic Conversion Derby

2014-01-06 Thread Erik Bray
On Sun, Jan 5, 2014 at 11:21 AM, Larry Hastings  wrote:
> Now, properly converting a function to work with Argument Clinic does not
> change its behavior.  Internally, the code performing argument parsing
> should be nigh-identical; it should call the same PyArg_Parse function, with
> the same arguments, and the implementation should perform the same work as a
> result.  The only externally observable change should be that
> inspect.signature() now produces a valid signature for the builtin; in all
> other respects Python should be unchanged.  No documentation should have to
> change, no tests should need to be modified, and absolutely no code should
> be broken as a result.  Converting a function to use Argument Clinic should
> be a blissfully low-risk procedure, and produce a pleasant,
> easier-to-maintain result.

Hi,

If it goes forward I would be willing to help out with the derby on a
few modules.  I haven't followed the Argument Clinic arguments closely
before now, so I don't know if this question has been addressed.  I
didn't see it mentioned in the docs anywhere, but will the policy be
to *prefer* renaming existing functions to the names generated by
clinic (the "_impl" names) or to override that to keep the existing
names?

I ask because some built-in functions are used internally by other
built-in functions.  I don't know how common this is but, for example,
fileio_read calls fileio_readall.  So if fileio_readall is renamed to
io_FileIO_readall_impl or whatever we need to also go through and fix
any references to fileio_readall.  Should be easy enough, but I wonder
if there are any broader side-effects of this.  Might it be safer for
the first round to keep the existing function names?

Erik


[Python-Dev] Update on Cygwin support (was: Clarifying Cygwin support in CPython)

2018-07-25 Thread Erik Bray
On Wed, Nov 8, 2017 at 3:39 PM Erik Bray  wrote:
>
> Hi folks,
>
> As some people here know I've been working off and on for a while to
> improve CPython's support of Cygwin.  I'm motivated in part by a need
> to have software working on Python 3.x on Cygwin for the foreseeable
> future, preferably with minimal graft.  (As an incidental side-effect
> Python's test suite--especially of system-level functionality--serves
> as an interesting test suite for Cygwin itself too.)
>
> This is partly what motivated PEP 539 [1], although that PEP had the
> advantage of benefiting other POSIX-compatible platforms as well (and
> in fact was fixing an aspect of CPython that made it unfriendly to
> supporting other platforms).
>
> As far as I can tell, the first commit to Python to add any kind of
> support for Cygwin was made by Guido (committing a contributed patch)
> back in 1999 [2].  Since then, bits and pieces have been added for
> Cygwin's benefit over time, with varying degrees of impact in terms of
> #ifdefs and the like (for the most part Cygwin does not require *much*
> in the way of special support, but it does have some differences from
> a "normal" POSIX-compliant platform, such as the possibility for
> case-insensitive filesystems and executables that end in .exe).  I
> don't know whether it's ever been "officially supported" but someone
> with a longer memory of the project can comment on that.  I'm not sure
> if it was discussed at all or not in the context of PEP 11.
>
> I have personally put in a fair amount of effort already in either
> fixing issues on Cygwin (many of these issues also impact MinGW), or
> more often than not fixing issues in the CPython test suite on
> Cygwin--these are mostly tests that are broken due to invalid
> assumptions about the platform (for example, that there is always a
> "root" user with uid=0; this is not the case on Cygwin).  In other
> cases some tests need to be skipped or worked around due to
> platform-specific bugs, and Cygwin is hardly the only case of this in
> the test suite.
>
> I also have an experimental AppVeyor configuration for running the
> tests on Cygwin [3], as well as an experimental buildbot (not
> available on the internet, but working).  These currently rely on a
> custom branch that includes fixes needed for the test suite to run to
> completion without crashing or hanging (e.g.
> https://bugs.python.org/issue31885).  It would be nice to add this as
> an official buildbot, but I'm not sure if it makes sense to do that
> until it's "green", or at least not crashing.  I have several other
> patches to the tests toward this goal, and am currently down to ~22
> tests failing.
>
> Before I do any more work on this, however, it would be best to once
> and for all clarify the support for Cygwin in CPython, as it has never
> been "officially supported" nor unsupported--this way we can avoid
> having this discussion every time a patch related to Cygwin comes up.
> I could provide some arguments for why I believe Cygwin should be
> supported, but before this gets too long I'd just like to float the
> idea of having the discussion in the first place.  It's also not
> exactly clear to me how to meet the standards in PEP 11 for supporting
> a platform--in particular it's not clear when a buildbot is considered
> "stable", or how to achieve that without getting necessary fixes
> merged into the main branch in the first place.
>
> Thanks,
> Erik
>
>
>
> [1] https://www.python.org/dev/peps/pep-0539/
> [2] 
> https://github.com/python/cpython/commit/717d1fdf2acbef5e6b47d9b4dcf48ef1829be685
> [3] https://ci.appveyor.com/project/embray/cpython

Apologies for responding to a months old post, but rather than repeat
myself verbatim I'll just mention that all of the above is still true
and relevant, and I am still interested in getting Python somewhere
closer to "stable" on Cygwin.

Part of the problem with my previous approach is that I was trying to
fix every last test failure before asking to add Cygwin to CPython's
CI fleet.  While I believe all failures *should* be fixed (or skipped
as appropriate) this is not practical to do in a short amount of time,
and not having CI implemented for a platform means new bugs are added
faster than we can fix the existing bugs.  For example, between 3.6
and 3.7 two new bugs have caused Python to be unbuildable on Cygwin:

https://bugs.python.org/issue34211
https://bugs.python.org/issue34212

This is in addition to an older issue that I was hoping to have fixed
in Python 3.7, as the PR was "green" for some time well before its
release:

https://github.com/python/cpython/pull

Re: [Python-Dev] Benchmarks why we need PEP 576/579/580

2018-07-25 Thread Erik Bray
On Sat, Jul 21, 2018 at 6:30 PM Jeroen Demeyer  wrote:
>
> Hello,
>
> I finally managed to get some real-life benchmarks for why we need a
> faster C calling protocol (see PEPs 576, 579, 580).
>
> I focused on the Cython compilation of SageMath. By default, a function
> in Cython is an instance of builtin_function_or_method (analogously,
> method_descriptor for a method), which has special optimizations in the
> CPython interpreter. But the option "binding=True" changes those to a
> custom class which is NOT optimized.
>
> I ran the full SageMath testsuite several times without and with
> binding=True to find out any significant differences. The most dramatic
> difference is multiplication for generic matrices. More precisely, with
> the following command:
>
> python -m timeit -s "from sage.all import MatrixSpace, GF; M =
> MatrixSpace(GF(9), 200).random_element()" "M * M"
>
> With binding=False, I got
> 10 loops, best of 3: 692 msec per loop
>
> With binding=True, I got
> 10 loops, best of 3: 1.16 sec per loop
>
> This is a big regression which should be gone completely with PEP 580.
>
> I should mention that this was done on Python 2.7.15 (SageMath is not
> yet ported to Python 3) but I see no reason why the conclusions
> shouldn't be valid for newer Python versions. I used SageMath 8.3.rc1
> and Cython 0.28.4.

I haven't fully caught up on the thread yet so this might already be a
moot point.  But just in case it isn't, the Python 3 port of Sage
works well enough (at least on my branch) that the above benchmark
works, and would probably be worth repeating there (it's currently
Python 3.6.1, but upgrading to 3.7 probably wouldn't break the example
benchmark either).


Re: [Python-Dev] Update on Cygwin support (was: Clarifying Cygwin support in CPython)

2018-07-31 Thread Erik Bray
On Mon, Jul 30, 2018 at 5:26 PM Nick Coghlan  wrote:
>
> On 26 July 2018 at 02:13, Erik Bray  wrote:
> > I think a new approach that might be more practical for actually
> > getting this platform re-supported, is to go ahead and add a CI build,
> > and just skip all known failing test modules.  This is what I've done
> > in a new PR to add a Cygwin build on AppVeyor:
> >
> > https://github.com/python/cpython/pull/8463
> >
> > This is far from ideal of course, and should not mean the platform is
> > "supported".  But now I and others can go through and fix the
> > remaining test failures, re-enable those modules in the CI
> > configuration, and actually obtain some meaningful results, which will
> > hopefully encourage the core committers to accept fixes for the
> > platform.
>
> I believe the PEP 538 & 540 locale handling tests are amongst those
> that are still a bit sketchy (or outright broken?) on Cygwin, and I
> think having an advisory CI bot would definitely help with that.
> (Cygwin/MinGW are an interesting hybrid that really highlight the fact
> that neither "POSIX implies not Windows" nor "Windows implies the
> Win32 API" are entirely valid assumptions)

Yes, I believe those tests are still a little broken, though the
improvements you last made to them should be helpful in getting them
passing.  I haven't looked at it in a few months.

Indeed, it makes for some interesting broken assumptions.  Another
example I've encountered recently: because Cygwin uses the posixpath
module, all handling of Windows-style paths is broken.  This is fine,
because in general a developer should *not* be using Windows paths on
Cygwin; POSIX paths only.  However, the fact remains that Cygwin does
(mostly) transparently support Windows paths at the system level, so
some things work.  But if a user runs a script that happens to be
written in Python, but passes Windows paths to it, say, as
command-line arguments, it may or may not work.  If the path is passed
directly to open(), no problem.  But if it goes through
os.path.abspath for example things blow up.
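
For instance, here is a hypothetical session on Cygwin (where os.path is
posixpath), assuming the current directory is /home/erik:

    >>> import os.path
    >>> # Cygwin's system calls accept the Windows-style path, so open()
    >>> # would be fine, but posixpath treats it as a relative POSIX path:
    >>> os.path.abspath(r'C:\Users\erik\data.txt')
    '/home/erik/C:\\Users\\erik\\data.txt'

which of course is not a path to anything the user meant.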

I'm undecided as to whether this is something that developers writing
applications that support Cygwin need to handle, or if this is
something that could work better on the Python end.  I lean
toward the former, but I also wonder if there isn't more that could be
done in the stdlib to improve this issue as well.  In the meantime I
wrote pycygwin [1] to help with these sorts of issues in my own
software.


> So your suggested approach seems like a plausible way forward to me.
>
> The main potentially viable alternative I see would be to set up the
> *buildbot* first, and then devote the custom builder branch to the
> task of Cygwin testing for a while:
> https://devguide.python.org/buildbots/#custom-builders
>
> However, I think the overall UX of that would be worse than going down
> the advisory CI path (especially since it wouldn't really help with
> the aspect of parallel development introducing new Cygwin failures).

Exactly.  And at least for starters I might have to push to
buildbot-custom, because without a few minimal fixes in place CPython
currently does not build successfully at all on Cygwin, which makes
the buildbot a little unhelpful.

But Zach is already in touch with me about getting a buildbot worker
set up anyways.  I agree it's still good to have, and will be more and
more useful as I get the requisite fixes merged...

Thanks,
E



[1] http://pycygwin.readthedocs.io/en/latest/


Re: [Python-Dev] A Subtle Bug in Class Initializations

2018-08-09 Thread Erik Bray
On Mon, Aug 6, 2018 at 8:11 PM Eddie Elizondo  wrote:
>
> Background:
>
> Through the implementation of an alternate runtime I've been poking around 
> some of the class initialization routines and I found out that there was a 
> subtle bug with PyType_Ready and the header initializer PyVarObject_HEAD_INIT.
>
>
>
> Looking through the codebase, I couldn't really find any pattern of when the 
> type should be defined within PyVarObject_HEAD_INIT. Sometimes it was 
> initialized to NULL (or 0), sometimes to PyType_Type (let's ignore Py_True and
> Py_False for now).
>
>
>
> From PyType_Ready it turns out that setting the value PyType_Type is never 
> actually needed outside of PyType_Type and PyBaseObject_Type. This is clear
> from the code:
>
> if (Py_TYPE(type) == NULL && base != NULL)
>     Py_TYPE(type) = Py_TYPE(base);
>
>
>
> Given that any PyTypeObject's base is of type PyType_Type, setting 
> PyVarObject_HEAD_INIT(&PyType_Type) is superfluous. Therefore, setting all
> static PyTypeObjects' ob_type to NULL should be a safe assumption to
> make.
>
>
>
> Uninitialized Types:
>
> A quick s/PyVarObject_HEAD_INIT(&PyType_Type/PyVarObject_HEAD_INIT(NULL/  
> shows that some objects do need to have their ob_type set from the outset, 
> violating the previous assumption. After writing a quick script, I found that 
> out of the ~300 PyVarObject_HEAD_INIT present in CPython, only these 14 types 
> segfaulted:
>
>
>
> PyByteArrayIter_Type
> PyBytesIter_Type
> PyDictIterKey_Type
> PyDictIterValue_Type
> PyDictIterItem_Type
> PyClassMethod_Type
> PyAsyncGen_Type
> PyListIter_Type
> PyListRevIter_Type
> PyODictIter_Type
> PyLongRangeIter_Type
> PySetIter_Type
> PyTupleIter_Type
> PyUnicodeIter_Type
>
>
>
>
>
> Bug:
>
> It turns out that these PyTypeObjects are never initialized through 
> PyType_Ready. However, they are used as fully initialized types. It is by 
> pure chance that they work without having to call the initializer on them.
> This, though, is undefined behavior. This not only might result in a weird
> future bug which is hard to chase down, but it also affects other runtimes, as
> this behavior depends on implementation details of CPython.
>
>
>
> This is a pervasive pattern that should be removed from the codebase and 
> ideally extensions should follow as well.
>
>
>
> Solution:
>
> Here are my proposed solutions in order from less controversial to most 
> controversial. Note that all of them I already tried in a local branch and 
> are working:
>
>
>
> 1) Initialize all uninitialized types.
>
>
>
> Example: 
> https://github.com/eduardo-elizondo/cpython/commit/bc53db3cf4e5a6923b0b1afa6181305553faf173
>
> 2) Move all PyVarObject_HEAD_INIT to NULL except PyType_Type, 
> PyBaseObject_Type and the booleans.
>
>
>
> 3) Special case the initialization of PyType_Type and PyBaseObject_Type 
> within PyType_Ready to now make all calls to PyVarObject_HEAD_INIT use NULL. 
> To enable this a small change within PyType_Ready is needed to initialize 
> PyType_Type and PyBaseObject_Type:

Coincidentally, I just wrote a long-ish blog post explaining in
technical details why PyVarObject_HEAD_INIT(&PyType_Type) pretty much
cannot work, at least for extension modules (it is not a problem in
the core library), on Windows (my post was focused on Cygwin but it is
a problem for Windows in general):
http://iguananaut.net/blog/programming/windows-data-import.html

The TL;DR is that it's not possible on Windows to initialize a struct
with a pointer to some other data that's found in another DLL (i.e.
&PyType_Type), unless it happens to be a function, as a special case.
But &PyType_Type obviously is not, so things break.

So I'm +1 for requiring passing NULL to PyVarObject_HEAD_INIT,
requiring PyType_Ready with an explicit base type argument, and maybe
(eventually) making PyVarObject_HEAD_INIT argumentless.


Re: [Python-Dev] A Subtle Bug in Class Initializations

2018-08-10 Thread Erik Bray
On Thu, Aug 9, 2018 at 7:21 PM Steve Dower  wrote:
>
> On 09Aug2018 0818, Erik Bray wrote:
> > On Mon, Aug 6, 2018 at 8:11 PM Eddie Elizondo  wrote:
> >> 3) Special case the initialization of PyType_Type and PyBaseObject_Type 
> >> within PyType_Ready to now make all calls to PyVarObject_HEAD_INIT use 
> >> NULL. To enable this a small change within PyType_Ready is needed to 
> >> initialize PyType_Type and PyBaseObject_Type:
> >
> > Coincidentally, I just wrote a long-ish blog post explaining in
> > technical details why PyVarObject_HEAD_INIT(&PyType_Type) pretty much
> > cannot work, at least for extension modules (it is not a problem in
> > the core library), on Windows (my post was focused on Cygwin but it is
> > a problem for Windows in general):
> > http://iguananaut.net/blog/programming/windows-data-import.html
> >
> > The TL;DR is that it's not possible on Windows to initialize a struct
> > with a pointer to some other data that's found in another DLL (i.e.
> > &PyType_Type), unless it happens to be a function, as a special case.
> > But &PyType_Type obviously is not, so things break.
>
> Great write-up! I think logically it should make sense that you cannot
> initialize a static value from a dynamically-linked library, but you've
> conclusively shown why that's the case. I'm not clear whether it's also
> the case on other OS's, but I don't see why it wouldn't be (unless they
> compile magic load-time resolution).

Thanks!  I'm not sure what you mean by "on other OS's" though.  Do you
mean other OS's that happen to use Windows-style PE/COFF binaries?
Because other than Windows I'm not sure what we care about there.

For ELF binaries, at least on Linux (and probably elsewhere), the
runtime loader can perform more sophisticated relocations when loading
a binary into memory, including relocating pointers in the binary's
.data section.  This allows it to initialize data in one executable
"A" with pointers to data in another library "B" *before* "A" is
considered fully loaded and executable.

So this problem never arises, at least on Linux.

> > So I'm +1 for requiring passing NULL to PyVarObject_HEAD_INIT,
> > requiring PyType_Ready with an explicit base type argument, and maybe
> > (eventually) making PyVarObject_HEAD_INIT argumentless.
>
> Since PyVarObject_HEAD_INIT currently requires PyType_Ready() in
> extension modules already, then don't we just need to fix the built-in
> types?
>
> As far as the "eventually" case, I'd hope that eventually extension
> modules are all using PyType_FromSpec() :)

+1 :)


Re: [Python-Dev] A Subtle Bug in Class Initializations

2018-08-13 Thread Erik Bray
On Fri, Aug 10, 2018 at 6:49 PM Steve Dower  wrote:
>
> On 10Aug2018 0354, Erik Bray wrote:
> > Thanks!  I'm not sure what you mean by "on other OS's" though.  Do you
> > mean other OS's that happen to use Windows-style PE/COFF binaries?
> > Because other than Windows I'm not sure what we care about there.
> >
> > For ELF binaries, at least on Linux (and probably elsewhere), the
> > runtime loader can perform more sophisticated relocations when loading
> > a binary into memory, including relocating pointers in the binary's
> > .data section.  This allows it to initialize data in one executable
> > "A" with pointers to data in another library "B" *before* "A" is
> > considered fully loaded and executable.
> >
> > So this problem never arises, at least on Linux.
>
> That's exactly what I meant. I simply didn't know how/whether other
> loaders handled this case :) I recognise it's nothing to do with the
> binary format and everything to do with whether the loader knows what to
> do or not.

Ah, that's not exactly what *I* meant, but you are also right: In
principle it shouldn't have anything to do with the binary format.
You could stuff a more sophisticated dynamic relocation section into a
PE/COFF binary too but Windows wouldn't know what to do with it.

So you're right that this kind of problem could affect other OSes, I
just have no idea if it does.

> >>> So I'm +1 for requiring passing NULL to PyVarObject_HEAD_INIT,
> >>> requiring PyType_Ready with an explicit base type argument, and maybe
> >>> (eventually) making PyVarObject_HEAD_INIT argumentless.
> >>
> >> Since PyVarObject_HEAD_INIT currently requires PyType_Ready() in
> >> extension modules already, then don't we just need to fix the built-in
> >> types?
> >>
> >> As far as the "eventually" case, I'd hope that eventually extension
> >> modules are all using PyType_FromSpec() :)
> >
> > +1 :)
>
> Is that just a +1 for PyType_FromSpec(), or are you agreeing that we
> only need to fix the built-in types?

Both! I think we should fix the built-in types, but it would be nice
if more extension modules used PyType_FromSpec, if nothing else
because I find it much more readable, and (I'm guessing) it's better
from the perspective of Victor's C-API work.  But I admit I'm not
fully versed in the downsides (if there are any).


Re: [Python-Dev] Workflow blocked on the 3.6 because of AppVeyor; who owns the AppVeyor project?

2018-09-05 Thread Erik Bray
On Wed, Sep 5, 2018 at 4:32 PM Paul Moore  wrote:
>
> On Wed, 5 Sep 2018 at 14:47, Zachary Ware  
> wrote:
> >
> > On Wed, Sep 5, 2018 at 6:23 AM Antoine Pitrou  wrote:
> > > On Wed, 5 Sep 2018 11:03:48 +0100
> > > Paul Moore  wrote:
> > > > On Wed, 5 Sep 2018 at 10:55, Victor Stinner  wrote:
> > > > > Who ows the "python" AppVeyor project?
> >
> > That seems to have fallen to me for the most part.
> >
> > > > > Can someone please give me the
> > > > > administrator permission on this project, so I will be able to invalid
> > > > > the build cache?
> > > >
> > > > I don't appear to have admin rights on Appveyor either.
> >
> > I've attempted to make a change that should give you both more access;
> > even odds on whether it did anything :).  I've never tried to use
> > their REST API, so I don't know whether it will help with that at all.
>
> I do indeed now seem to have admin access on Appveyor. Thanks for
> that. I guess I should therefore say that if anyone needs help with
> Appveyor stuff, feel free to ping me and save Zach from getting all
> the work :-)
>
> > > For some reason it seems to be located in a hidden directory
> > > (".github/appveyor.yml").  Not the most intuitive decision IMHO.
> > > Travis' own config file ".travis.yml" is still at repository root, which
> > > makes things more confusing.
> >
> > The idea there was to avoid proliferation of root-level dotfiles where
> > possible, but if we would rather keep it at the project root it's a
> > relatively simple change to make.
>
> When working via github on the web (which I was) rather than on a
> local checkout where I can search, putting it in a subdiretory is a
> bit less discoverable (made worse because there's nothing about the
> name ".github" that suggests it would have Appveyor files in it :-))
> I'd prefer it at the top level - but not enough to submit a PR for
> that at the moment, so I'm fine with it staying where it is.
>
> > For the actual issue at hand, the problem arises from doing builds on
> > 3.6 with both the VS2015 and VS2017 images.  Apparently something
> > built in `/externals` by the VS2015 build gets cached, which then
> > breaks the VS2017 build; I haven't tracked down how exactly that is
> > happening.  I think the preferred solution is probably to just drop
> > the VS2017 build on 3.6 on AppVeyor; VSTS runs on VS2017 and dropping
> > one of the builds from 3.6 will make AppVeyor significantly quicker on
> > that branch.
>
> Nice catch. I'd agree, it's probably not worth having both
> (particularly as, if Victor says, we have buildbots for the one
> Appveyor doesn't cover - but even if we don't I think VSTS has it
> covered).
>
> I presume you're suggesting keeping 2017 is so that we don't have
> stray 2015-built artifacts in the cache, which makes sense to me, and
> I have a mild preference for keeping the latest compiler, as that's
> likely the one that people will find easier to get. But 2015 is
> presumably the version the official 3.6 builds are made with, so
> there's an argument for keeping that one (although if we do that I
> guess we need to find a *different* way of fixing the cached artifact
> issue).
>
> tl; dr; I'm inclined to agree with you that just using VS2017 on
> Appveyor is the simplest option.

Hello,

Let me take this note as an opportunity to nag that I have a still
open pull request to add testing of Python on Cygwin to the AppVeyor
build, which in theory works quite well:
https://github.com/python/cpython/pull/8463
So +1 for dropping one build configuration from AppVeyor if that will
make it easier in the future to add this one :)

However, Victor has asked that as a prerequisite to adding a Cygwin
build to AppVeyor, we first have a relatively stable buildbot.  I had
thought maybe adding advisory CI on AppVeyor *first* would make
getting a stable buildbot easier, but I can see the argument either
way, so we have added said buildbot:
https://buildbot.python.org/all/#/builders/164
Unfortunately, for the last ~120 builds it has been all but useless
due to at least two
small, but long outstanding issues preventing 3.7.x from building on
Cygwin.  Both of those issues have proposed fixes pending review, both
of which have PRs linked to from my AppVeyor PR.

If anyone is interested in having a look at those I'd appreciate it,
thanks.  (One of them also got some review from Inada Naoki, but we
never agreed on concrete action items for making the patch
acceptable, and it has stalled again...)

Best,
E


Re: [Python-Dev] dear core-devs

2018-10-02 Thread Erik Bray
On Tue, Oct 2, 2018 at 3:53 AM Tres Seaver  wrote:
>
> -BEGIN PGP SIGNED MESSAGE-
> Hash: SHA1
>
> On 10/01/2018 06:41 PM, Michael Felt wrote:
>
> > And, while you may not give a damn about anything other than Windows,
> > macos and/or Linux - there are other platforms that would like a
> > stable Python.
>
> Michael,
>
> I can understand the frustration you feel:  you have been putting effort
> into a labor of love geting Python support on AIX (back?) into shape, and
> feel that your efforts are unappreciated, or worse, that they will be wasted.
>
> The key thing to realize about the various core developers (and the
> broader Python and open source communities) is that their attention is a
> heavily over-committed resource:  it isn't that folks here aren't
> benevolent toward your efforts, but rather that each of them (us?) makes
> decisions every day juggling which projects / tasks to give the minutes /
> hours we have available.  In the common case, the "triage" involves
> scratching an itch:  this bug affects me / my work, that feature would
> make my life / my employment simpler, etc.  Even where there are minutes
> available, the "is reviewing this feasible for me?" question kicks in.
>
> Because AIX is relatively narrow in the scope of folks it impacts, the
> average, overcommitted developer is likely to see a bug report, or even a
> pull request, which makes stuff build on AIX and say, "Hmm, I don't know
> enough to evaluate that one, I'll leave it to folks who do know (and by
> implication, who have some skin in the game)."  Even for more
> consumer-focused platforms, it has historically been harder to get
> attention for bugs / patches which affect only a single platform (Windows
> file locking semantics, or the Mac installer, etc.)
>
> One key way to get past that hurdle is to slice the size of each "thing"
> down as fine as possible:  e.g., a pull request adding a single "#ifdef
> AIX" block to one file.  Anything more than a screenful of diff is likely
> to trigger the "let someone else review it" pattern, whereas being able
> to scan the patch at a glance lets even a non-itchy reviewer decide,
> "well, at least it can't hurt anything, give it a shot."
>
> Once you've gotten a number of those small patches merged, you will find
> that you've built a relationship with the folks who have been reviewing
> them, and that they are more likely to pass them, and to review larger
> ones, at least in part because *you* will have learned more about what is
> needed in terms of code style, documentation, test coverage, etc., and
> *they* will have learned to trust your judgement.
>
> I'm sorry it isn't easier,

I have thought of writing an almost verbatim post w.r.t. my efforts to
get Cygwin re-supported (which was never officially un-supported
either).  Victor asked me to set up a buildbot for Cygwin as a
prerequisite to much else, which I have done [1].  But it has been
turning out broken build after broken build and is all but useless
since, even at the time of setting it up, I pointed out that there are
two major blocker issues [2] [3] that prevent an even
partially-working build.  Naoki Inada provided some review of the
first one a while ago, and while we had some (I think valid)
disagreement on how to proceed, I updated the issue description with a
checklist of issues he raised that need some clear direction on how
they should be resolved (since at least on some of them we disagreed).
I'd be happy to do pretty much whatever so long as I knew it was
meeting a core dev's requirements while also meeting my own
requirements.

Obviously I'm sympathetic to the limited time and attention of core
devs--I am a maintainer on several projects myself and I know how it
goes, and I have tried not to make too much of a fuss about it.  But
there's a chicken-egg problem in the specific area of platform
support, which feels a little different from just "I need my pet bug
fixed", for someone who is not already a core developer: In order to
make any progress on the issue we need at least one core dev who is
interested in the same platform.  But if we're the only ones willing
to do the work who know or care anything about that platform, how are
we supposed to progress in the first place?

I, like Michael Felt, have a number of fixes waiting in the wings but
can't really progress until a little bit of bare minimum ground work
is at least done to get us up and running.

Michael, if there are any PRs you want to point me to that I might be
able to help review please do.  I don't know anything about AIX either
and am not a core dev so I can't have a final say.  But I've been
hacking on CPython for a long time anyways, and might be able to help
at least with some initial review.


[1] https://buildbot.python.org/all/#/builders/164
[2] https://github.com/python/cpython/pull/4348
[3] https://github.com/python/cpython/pull/8712

Re: [Python-Dev] dear core-devs

2018-10-03 Thread Erik Bray
On Tue, Oct 2, 2018 at 6:41 PM Simon Cross
 wrote:
>
> Are there any core devs that Michael or Erik could collaborate with?
> Rather than rely on adhoc patch review from random core developers.
>
> Michael and Eric: Question -- are you interested in becoming core
> developers at least for the purposes of maintaining these platforms in
> future?

I would be for the purposes of said platform maintenance.  I believe I
already have some maintainer permissions on bpo for exactly this
reason.

That said, while I'm sure it would help, I'm not exactly sure what it
would solve either.  I believe strongly in code review, and just
having a "core developer" status does not necessarily free one from
responsibility for obtaining code review.

It also partly depends on the issue.  If it's a change that touches
other parts of the code in ways that could impact it beyond the narrow
scope of platform support, I believe it definitely should get a second
pair of eyes.  Unfortunately many of the outstanding patches I have
for review fall in that category.  Though in the future there will be
fewer like that.  The majority of work needed for Cygwin, at least, is
tweaking some areas of the tests that make assumptions that don't
necessarily hold on that platform.*

Thanks,
E


* For example, there are some tests that assume there is a user with
UID 0.  While UID 0 is reserved for a "superuser", I don't know that
there's any requirement that such a user *must* exist (on Cygwin it
does not :)


Re: [Python-Dev] dear core-devs

2018-10-03 Thread Erik Bray
On Tue, Oct 2, 2018 at 8:54 PM Michael Felt  wrote:
>
>
>
> On 10/2/2018 4:45 PM, Erik Bray wrote:
> > Michael, if there are any PRs you want to point me to that I might be
> > able to help review please do.
> A little trick I learned:
> https://github.com/python/cpython/pulls?q=is%3Aopen+is%3Apr+author%3Aaixtools+sort%3Aupdated-desc
> lists them all.

Cool, I'll have a look.

> What "flipped my switch" yesterday was discovering a PR that I was
> gifted (by an ex? core-dev) and put in the system back in January is now
> broken by a patch merged about two weeks ago. Worse, pieces of
> test_ctypes(bitfields) that previously worked when using __xlc__ seem to
> be broken. Which highlighted the "time pressure" of getting tests to
> pass so that regressions can be seen.

Yes, that can certainly happen.  I have many PRs floating around on
different projects that, you know, get stalled for months and months
and inevitably break.  It's extremely frustrating, but we've all been
there :)

> If you let me know what info you would need (I gave lots of debug info
> two years ago to get that initial fix).
>
> And, I guess the other "larger" change re: test_distutils. Also, some
> issues specific to xlc being different from gcc.
>
> Those two do not show on the gccfarm buildbot.
>
> Many thanks for the offer! I'll try to not take more than the hand offered!
> >   I don't know anything about AIX either
> > and am not a core dev so I can't have a final say.  But I've been
> > hacking on CPython for a long time anyways, and might be able to help
> > at least with some initial review.

I was also about to ask if you have a buildbot for AIX, and I see now
you have several, so that's a step in the right direction!
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] BDFL delegation for PEP 426 (PyPI metadata 1.3)

2013-02-15 Thread Erik Bray
On Sun, Feb 3, 2013 at 5:24 PM, Vinay Sajip  wrote:
> Éric Araujo  writes:
>
>> Looks like we agree that a basic tool able to bootstrap the packaging
>> story is needed :)
>
> Agreed. Just because distutils can't easily/reliably build things that are
> better built with SCons/WAF/tup/whatever, doesn't mean that we shouldn't have
> the ability to build pure-Python distributions and distributions including C
> libs and extensions, with the ability to extend easily by third-party tools. 
> It
> just needs to be done in a way which is easy to build on, so the included
> battery stays small and simple. Easier said than done, I know :-)
>
> Regards,
>
> Vinay Sajip

Sorry to revive an old-ish discussion--I'm just catching up on things.
But I just wanted to add that distutils is still pretty okay for
building reasonably complex projects.  Although it does not rise to
the level of complexity of Numpy or SciPy, the Astropy project
(https://github.com/astropy/astropy) has managed to put together a
pretty nice build system on top of mostly-plain distutils (it does use
distribute but primarily just for 2to3 support).


This has necessitated a number of hacks to overcome shortcomings and
bugs in distutils, but most of those shortcomings could probably be
fixed in distutils within the framework of a slightly lifted freeze.
But in any case I haven't found it worthwhile to switch to something
like bento when the batteries included in the stdlib have been mostly
Good Enough. Having fewer installation dependencies has also made it
significantly easier for non-advanced users to install. Even the
distribute requirement doesn't add too much overhead, as most users
have it on their systems by default now, and for those who don't
distribute_setup.py works okay.

TL;DR, strong -1 on the stdlib "getting out of the build business".
Also, as I think Nick already mentioned, one of the wins of
Setup-Requires-Dist is having a standard way to bring in extra build
requirements (such as bento), so if we have better support for a
feature like that it's not necessary to "bless" any preferred tool.

Erik
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Planning on removing cache invalidation for file finders

2013-03-02 Thread Erik Bray
On Sat, Mar 2, 2013 at 10:36 AM, Nick Coghlan  wrote:
> In addition, it may be appropriate for importlib to offer a
> "write_module" method that accepts (module name, target path,
> contents). This would:
>
> 1. Allow in-process caches to be invalidated implicitly and
> selectively when new modules are created
> 2. Allow importers to abstract write access in addition to read access
> 3. Allow the import system to complain at time of writing if the
> desired module name and target path don't actually match given the
> current import system state.

+1 to write_module().  This would be useful in general, I think.
Though perhaps the best solution to the original problem is to more
forcefully document: "If you're writing a module and expect to be able
to import it immediately within the same process, it's necessary to
manually invalidate the directory cache."
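
For concreteness (this is just an illustration, not text from the
docs, and the file name is made up), the blanket workaround looks
something like this, assuming the directory being written to is
already on sys.path:

    import importlib

    with open("generated_module.py", "w", encoding="utf-8") as f:
        f.write("VALUE = 42\n")

    importlib.invalidate_caches()   # clears the caches of *all* path finders
    import generated_module         # now guaranteed to see the new file
    print(generated_module.VALUE)

importlib.invalidate_caches() has existed since 3.3, which is exactly
when the manual step became necessary.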

I might go a little further and suggest adding a function to only
invalidate the cache for the relevant directory (the proposed
write_module() function could do this).  This can already be done with
something like:

dirname = os.path.dirname(module_filename)
sys.path_importer_cache[dirname].invalidate_caches()

But that's a bit onerous considering that this wasn't even necessary
before 3.3.  There should be an easier way to do this, as there's no
sense in invalidating all the directory caches if one is only writing
new modules to a specific directory or directories.
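
Something like the following (purely a sketch of the idea, not the
proposed importlib API) is what I have in mind--write the module, then
refresh only the finder cached for its directory:

    import os
    import sys

    def write_module(name, target_path, contents):
        # `name` is unused here; it is kept only to match the
        # (module name, target path, contents) signature proposed above.
        with open(target_path, "w", encoding="utf-8") as f:
            f.write(contents)
        # Assumes the directory is cached under its absolute path.
        dirname = os.path.dirname(os.path.abspath(target_path))
        finder = sys.path_importer_cache.get(dirname)
        if finder is not None:
            # Only this directory's cache is refreshed; every other
            # finder keeps its cache intact.
            finder.invalidate_caches()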

Erik
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Possible bug in class-init, lookin for mentors

2017-05-02 Thread Erik Bray
On Fri, Apr 21, 2017 at 12:09 PM, Justus Schwabedal
 wrote:
> Hi everyone,
>
> I possibly found a bug in class initialization and would like to fix it.
> Because it's my first journey to core-dev, I would really appreciate the
> help of a mentor that I may ask a few questions to get me up to speed.
>
> To my person, I have previously worked on larger projects in python, c, and
> c++ if that information helps, and I'm really curious to learn more about
> the interiors of the greatest interpreter known to wo-/men.
>
> Here comes the bug-producing example:
>
> `class Foo:
>     def __init__(self, bar=[]):
>         self.list = bar
>
> spam_1 = Foo()
> spam_2 = Foo()
>
> spam_1.list.append(42)
> print(spam_2.list)`
>
> At least I think it's a bug.  Maybe it's a feature..

Sorry to resurrect an old-ish thread; I haven't looked at Python-dev
in several weeks.  I just wanted to point out that while everyone on
this thread pointed out how this isn't a bug (clear to anyone who's
spent enough time with Python), we have here an experienced C/C++
developer who is interested in helping out on Python core devel, and
no one took him up on that offer.
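
For completeness (this aside is mine, not something quoted from the
thread): the behaviour comes from default argument values being
evaluated once, at function definition time, so the usual idiom is a
None sentinel:

    class Foo:
        def __init__(self, bar=None):
            self.list = [] if bar is None else bar

    spam_1 = Foo()
    spam_2 = Foo()
    spam_1.list.append(42)
    print(spam_2.list)   # prints [] -- each instance gets its own list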

Jus, for what it's worth, there are a slew of *actual* Python bugs to
be worked on--many languishing for years--due to lack of available
developer time.  You can have a good look at the (daunting) list of
bugs at the old bugs.python.org [1].  IIUC Python development is
slowly moving over to GitHub, but the issue list hasn't migrated yet
so that would still be the place to start.

If you find a bug that looks worth your time to try to fix, you should
probably follow up on the issue for that bug itself.  I can't make any
promises anyone will have time for mentorship, but I'd be willing to
point you in the right direction.  I'm not a core developer but I know
Python internals reasonably well and might be able to help *depending*
on the issue.

Best,
Erik

[1] http://bugs.python.org/
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Status of Python buildbots

2017-05-03 Thread Erik Bray
On Wed, May 3, 2017 at 10:22 AM, Victor Stinner
 wrote:
> Hi,
>
> I spent last week working on fixing buildbots:
>
>https://www.python.org/dev/buildbot/
>

Thanks!

> * Add more buildbots! Zachary Ware proposed to add a buildbot running
> "regen-all" to check that generated files are up to date.
>
> * Repeat ;-)

I have been saying for several months now that I would like to set up
a Cygwin buildbot--an important step in making that platform
supportable again.  I now have the infrastructure available to do so
(Windows VM on an OpenStack infrastructure at my university).  I
wanted to wait until the tests on Cygwin were more stable, but since
you allow unstable buildbots I could include it among them for now.
Is the buildbot setup documentation on the wiki page still accurate?

Thanks,
Erik
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 538: Coercing the legacy C locale to a UTF-8 based locale

2017-05-05 Thread Erik Bray
On Thu, May 4, 2017 at 6:25 PM, Antoine Pitrou  wrote:
> On Thu, 4 May 2017 11:24:27 +0900
> INADA Naoki  wrote:
>> Hi, Nick and all core devs who are interested in this PEP.
>>
>> I'm reviewing PEP 538 and I want to accept it in this month.
>> It will reduces much UnicodeError pains which server-side OPs facing.
>> Thank you Nick for working on this PEP.
>>
>> If you have something worrying about this PEP, please post a comment
>> soon.  If you don't have enough time to read entire this PEP, feel free to
>> ask a question about you're worrying.
>
> From my POV, it is problematic that the behaviour outlined in PEP 538
> (see Abstract section) varies depending on the adoption of another PEP
> (PEP 540).
>
> If we want to adopt PEP 538 before pronouncing on PEP 540, then PEP 538
> should remove all points conditional on PEP 540 adoption, and PEP 540
> should later be changed to adopt those removed points as PEP
> 540-specific changes.

This is kind of an aside, but regardless of the dependency
relationship between PEP 538 and 540, given that they go hand-in-hand
would it make sense to renumber them--e.g. have PEP 539 and PEP 540
trade places, since PEP 539 has nothing to do with this and is
awkwardly nestled between them?  Or would that only confuse matters at
this point?

Thanks,
Erik
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 539 (second round): A new C API for Thread-Local Storage in CPython

2017-08-31 Thread Erik Bray
On Thu, Aug 31, 2017 at 10:16 AM, Masayuki YAMAMOTO
 wrote:
> Hi python-dev,
>
> Since Erik started the PEP 539 thread on python-ideas, I've collected
> feedbacks in the discussion and pull-request, and tried improvement for the
> API specification and reference implementation, as the result I think
> resolved issues which pointed out by feedbacks.

Thanks Masayuki for taking the lead on updating the PEP.  I've been
off the ball with it for a while.  In particular the table summarizing
the changes is nice.   I just have a few minor changes to suggest
(typos and such) that I'll make in a pull request.
___
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Decorator syntax

2009-09-02 Thread Erik Bray
On Wed, Sep 2, 2009 at 10:35 AM, James Y Knight wrote:
> On Sep 2, 2009, at 6:15 AM, Rob Cliffe wrote:
>
>> So - the syntax restriction seems not only inconsistent, but pointless; it
>> doesn't forbid anything, but merely means we have to do it in a slightly
>> convoluted (unPythonesque) way.  So please, Guido, will you reconsider?
>
> Indeed, it's a silly inconsistent restriction. When it was first added I too
> suggested that any expression be allowed after the @, rather than having a
> uniquely special restricted syntax. I argued from consistency of grammar
> standpoint. But Guido was not persuaded. Good luck to you. :)
>
> Here's some of the more relevant messages from the thread back when the
> @decorator feature was first introduced:
> http://mail.python.org/pipermail/python-dev/2004-August/046654.html
> http://mail.python.org/pipermail/python-dev/2004-August/046659.html
> http://mail.python.org/pipermail/python-dev/2004-August/046675.html
> http://mail.python.org/pipermail/python-dev/2004-August/046711.html
> http://mail.python.org/pipermail/python-dev/2004-August/046741.html
> http://mail.python.org/pipermail/python-dev/2004-August/046753.html
> http://mail.python.org/pipermail/python-dev/2004-August/046818.html

I think Guido may have a point about not allowing any arbitrary
expression.  But I do think that if the syntax allows calls, it should
also at least support the itemgetter ([]) syntax, for which there
seems to be a demonstrable use case.  That, however, is just adding on
another special case, so it might be simpler to allow arbitrary
expressions.
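
To make the restriction concrete (a made-up example, reflecting the
decorator grammar as it stood at the time): only a dotted name,
optionally followed by a single call, was accepted after the @, so a
subscripted lookup had to be bound to a temporary name first:

    decorators = {"trace": lambda f: f}

    # @decorators["trace"]        # rejected by the restricted grammar
    _trace = decorators["trace"]  # the slightly convoluted workaround

    @_trace
    def func():
        return 42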
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Edits to Metadata 1.2 to add extras (optional dependencies)

2012-09-07 Thread Erik Bray
On Mon, Aug 27, 2012 at 10:56 AM, Daniel Holth  wrote:
> On Wed, Aug 15, 2012 at 10:49 AM, Daniel Holth  wrote:
>> I've drafted some edits to Metadata 1.2 with valuable feedback from
> ...
>> (full changeset on 
>> https://bitbucket.org/dholth/python-peps/changeset/537e83bd4068)
>
> Metadata 1.2 is nearly 8 years old and it's Accepted but not Final. Is
> it better to continue editing it, or create a new PEP for Metadata
> 1.3?

Somehow I completely overlooked this thread until now.  Thanks Daniel
for getting the ball rolling on this.  There have already been many
bytes spilled on metadata extensions, and although I agree it would be
enormously useful to build an extension mechanism into the metadata
format, I don't have much riding on that, or much more to add that
hasn't been said.

There hasn't been much said about Setup-Requires-Dist, so I'm guessing
it's uncontroversial.  But since that's sort of my hobbyhorse I
thought I would make a comment on it.  The thing I love about the
Setup-Requires-Dist feature is that, if properly supported by
different installers, it can free those installers from a fair bit of
responsibility.

For example, it greatly simplifies the thorny issue of "compilers".
The existing compiler support in distutils, while not without its
problems, does work in most cases for building common C-extensions.
distutils2 has already made some progress on cleaning up the interface
for compilers, and making it easier to register new compiler classes
that can be imported from an arbitrary package.  This allows projects
with special needs (such as Fortran compiler support) to ship their
own compiler class with the project.  Or if there's a good enough
third-party package that provides Fortran compiler support, projects
may use it in their build process.  Support for Setup-Requires-Dist
ensures that a third-party compiler package can be made available at
build-time.
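
To sketch what I mean (the field names below follow the Metadata 1.2
draft; the Fortran compiler package is imaginary), such a project
could simply declare its build-time needs alongside its runtime ones:

    Metadata-Version: 1.2
    Name: example-project
    Version: 0.1
    Requires-Dist: numpy
    Setup-Requires-Dist: fancy-fortran-compiler (>=1.0)

An installer that honors Setup-Requires-Dist can then fetch the
compiler package before invoking the build, without the stdlib having
to know anything about Fortran.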

What's great about this is that even if the stdlib still includes a
build system, it doesn't necessarily have to anticipate every possible
need for building every kind of project (it should, at a minimum, be
able to build pure-Python projects).  If someone wants to add MSVC2012
support, they can do that as a third-party package.  One could even
create "compilers" for other build systems like waf, or even provide
an entry point to meta-build systems like bento.  Am I making sense?

Erik
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] packaging location ?

2012-09-13 Thread Erik Bray
On Thu, Sep 13, 2012 at 5:38 AM, Antoine Pitrou  wrote:
> On Thu, 13 Sep 2012 11:14:17 +1000
> Nick Coghlan  wrote:
>> On Thu, Sep 13, 2012 at 8:43 AM, R. David Murray  
>> wrote:
>> > When the removal was being pondered, the possibility of keeping certain
>> > bits that were more ready than others was discussed.  Perhaps the best
>> > way forward is to put it back in bits, with the most finished (and PEP
>> > relevant) stuff going in first.  That might also give non-packaging
>> > people bite-sized-enough chunks to actually digest and help with.
>>
>> This is the plan I'm going to propose. The previous approach was to
>> just throw the entirety of distutils2 in there, but there are some
>> hard questions that doesn't address, and some use cases it doesn't
>> handle. So, rather than importing it wholesale and making the stdlib
>> the upstream for distutils2, I believe it makes more sense for
>> distutils2 to remain an independent project, and we cherry pick bits
>> and pieces for the standard library's new packaging module as they
>> stabilise.
>
> How is that going to be useful? Most people use distutils / packaging as
> an application, not a library. If you provide only a subset of
> the necessary features, people won't use packaging.

Third-party install/packaging software (pip, bento, even distribute)
can still gradually absorb any standard pieces added to the stdlib for
better interoperability and PEP compliance.  I'm still strongly in
favor of a `pysetup`-like command making it into Python too, but in
the meantime the top priority should be anything that supports better
consistency across existing projects.
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Edits to Metadata 1.2 to add extras (optional dependencies)

2012-09-14 Thread Erik Bray
On Fri, Sep 14, 2012 at 12:30 PM, Daniel Holth  wrote:
> Add to metadata 1.3:
>
> Description-File: README(\..+)?
>
> Meaning the description should be read from a file in the same
> directory as PKG-INFO or METADATA (including in the .dist-info
> directories) and we strongly recommend it be named as README.* and be
> utf-8 encoded text.
>
> Description: is the only multi-line field in the metadata. It is
> almost never needed at runtime. It would be great for performance and
> simplify the parser to just put it in another file.
>
> Mutually exclusive with Description.
>
> May beg for a Summary: tag with a one-line description.

Can we make Description-File multiple-use?  The meaning of this would
be that the Description is formed from concatenating each
Description-File in order.  That raises the question: Is ordering
guaranteed for multiple-use fields?

I ask, because distutils2 supports exactly such a feature, and I've
found it useful.  For example, if I have a README.rst and a
CHANGELOG.rst I can specify:

description-file =
    README.rst
    CHANGELOG.rst

Then the full description contains my readme and my changelog, which
look nice together on PyPI, but I prefer to keep as separate files in
the source.
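
Semantically, all I'm asking for is something like this (a sketch of
the intended behaviour, not distutils2's actual implementation):

    def build_description(paths):
        # Read each listed file in order and join them into the single
        # Description value.
        parts = []
        for path in paths:
            with open(path, encoding="utf-8") as f:
                parts.append(f.read().strip())
        return "\n\n".join(parts)

    # e.g. with the two files named above:
    description = build_description(["README.rst", "CHANGELOG.rst"])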

My only other concern is that if the value of this field can
theoretically be arbitrary, it could conflict with other .dist-info
files.  Does the .dist-info format allow subdirectories?  Placing
description-files in a subdirectory of .dist-info could be a
reasonable workaround.

Erik
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Edits to Metadata 1.2 to add extras (optional dependencies)

2012-09-14 Thread Erik Bray
On Fri, Sep 14, 2012 at 1:57 PM, Daniel Holth  wrote:
> On Fri, Sep 14, 2012 at 1:43 PM, Erik Bray  wrote:
>> On Fri, Sep 14, 2012 at 12:30 PM, Daniel Holth  wrote:
>>> Add to metadata 1.3:
>>>
>>> Description-File: README(\..+)?
>>>
>>> Meaning the description should be read from a file in the same
>>> directory as PKG-INFO or METADATA (including in the .dist-info
>>> directories) and we strongly recommend it be named as README.* and be
>>> utf-8 encoded text.
>>>
>>> Description: is the only multi-line field in the metadata. It is
>>> almost never needed at runtime. It would be great for performance and
>>> simplify the parser to just put it in another file.
>>>
>>> Mutually exclusive with Description.
>>>
>>> May beg for a Summary: tag with a one-line description.
>>
>> Can we make Description-File multiple-use?  The meaning of this would
>> be that the Description is formed from concatenating each
>> Description-File in order.  That raises the question: Is ordering
>> guaranteed for multiple-use fields?
>>
>> I ask, because distutils2 supports exactly such a feature, and I've
>> found it useful.  For example, if I have a README.rst and a
>> CHANGELOG.rst I can specify:
>>
>> description-file =
>>     README.rst
>>     CHANGELOG.rst
>>
>> Then the full description contains my readme and my changelog, which
>> look nice together on PyPI, but I prefer to keep as separate files in
>> the source.
>>
>> My only other concern is that if the value of this field can
>> theoretically be arbitrary, it could conflict with other .dist-info
>> files.  Does the .dist-info format allow subdirectories?  Placing
>> description-files in a subdirectory of .dist-info could be a
>> reasonable workaround.
>>
>> Erik
>
> The .dist-info design asks for every metadata file (the one in all
> caps, not any of the other metadata in .dist-info) to be parsed for
> many packaging operations that do not require the description, such as
> resolving the dependency graph of a package. Description-File would
> give an installer the option to pull Description: out into
> Description-File:. I would expect the concatenation to happen before
> this point.

I understand now.  In that case, why even allow flexibility in the
description file name?  Just make it description.txt, and make the
Description-File field a simple boolean indicator of whether or not a
description file exists?

Erik
___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com