Re: [Python-Dev] How do we tell if we're helping or hindering the core development process?

2015-07-24 Thread Nick Coghlan
On 23 Jul 2015 01:36, "Nikolaus Rath"  wrote:
>
> On Jul 22 2015, Nick Coghlan  wrote:
> > On 22 July 2015 at 13:23, Nikolaus Rath  wrote:
> >> If it were up to me, I'd focus all the resources of the PSF on reducing
> >> this backlog - be that by hiring some core developers to work full-time
> >> on just the open bugtracker issues, or by financing development of
> >> better code review and commit infrastructure.
> >
> > Ah, but the PSF can't do that without infringing on python-dev's
> > autonomy - switching to my PSF Director's hat, while we'd certainly be
> > prepared to help with funding a credible grant proposal for something
> > like the Twisted technical fellowship, we wouldn't *impose* help that
> > the core developers haven't asked for.
>
> I don't understand. If I were to hire a core developer myself to work on
> this (theoretically, I have no plans to do that), would that also be
> infringing python-dev's authority? If so, how is that different from me
> doing the work? If not, why is it different if the PSF decides to hire
> someone?

When somebody else pays someone to work on core development, it's quite
clear that that's a private employment matter between that developer and
whoever hires them.

By contrast, the PSF also has to consider the potential impact on
motivation levels for all the current volunteers we *don't* hire, as well
as ensuring that expectations are appropriately aligned between everyone
involved in the process. I think that's more likely to work out well for
all concerned if the process of requesting paid help in keeping the issue
tracker backlog under control is initiated *from* the core development
community, rather than being externally initiated by the PSF Board.

> >> The current situation looks like a downward spiral to me. New
> >> contributors are frustrated and leave because they feel their
> >> contribution is not welcome, and core developers get burned out by
> >> the gigantic backlog and the interaction with frustrated patch
> >> submitters - thus further reducing the available manpower.
> >
> > We actually still have a lot of paid core developer (and potential
> > core developer) time locked up in facilitating the Python 2 -> 3
> > migration, as we didn't fully appreciate the extent to which Python
> > had been adopted in the Linux ecosystem and elsewhere until folks
> > started seeking help upgrading.
>
> Interesting. Is this information available publically somewhere? I'm
> curious what exactly is being worked on.

There are a couple of links for Ubuntu & Fedora porting status at
https://wiki.python.org/moin/Python3LinuxDistroPortingStatus

Canonical & Red Hat between them have several people working on that, and
upgrades for a large proportion of the enterprise Linux world are gated
behind that effort.

The PyCon US sponsor list then provides a decent hint as to the scale of
what still needs to be ported behind corporate firewalls:
https://us.pycon.org/2015/sponsors/

It definitely qualifies as interesting times :)

Cheers,
Nick.


[Python-Dev] Building python 2.7.10 for Windows from source

2015-07-24 Thread Mark Kelley
I have been using Python for some time but it's been a decade since
I've tried to build it from source, back in the 2.4 days.  Things seem
to have gotten a little more complicated now.

I've read through the PCBuild/README file and got most stuff
compiling.  I find it a little odd that there are special instructions
for building the release version of Tcl/Tk.  Is that what the
developers actually do when they cut a release, or is there some
other, top-level script that does this automatically? It just seems
odd.

Anyhow, my specific question is around the distutils wininst stubs,
provided as binaries in the release tarball.  Where can I find the
source files that those binaries are built from?

Many thanks,
Mark.


[Python-Dev] Benchmark Results for Python Default 2015-07-24

2015-07-24 Thread lp_benchmark_robot
Hi Internals, 

This is the first message from Intel's language optimization team.
We would like to provide the Python internals developer community
with a daily service which will monitor latest committed patches
performance regressions against well known workloads.
Our aim is to run a multitude of workloads as well as real-life scenarios
which the community considers relevant. The service will send daily bulletins
containing latest measurements for daily variations and variations against
latest stable release run on our Intel-enabled servers.

The community's feedback is very important for us. For any questions,
comments or suggestions you can also contact us on our mailing list
l...@lists.01.org. You can also check our website: https://www.01.org/lp


Results for project python_default-nightly, build date 2015-07-24 09:02:02
commit:         3bbd0cbfe836511dd3e05fcc30ffb5bdbfe686ea
revision date:  2015-07-24 07:43:44
environment:    Haswell-EP
cpu:            Intel(R) Xeon(R) CPU E5-2699 v3 @ 2.30GHz, 2x18 cores, stepping 2, LLC 45 MB
mem:            128 GB
os:             CentOS 7.1
kernel:         Linux 3.10.0-229.4.2.el7.x86_64

Note: Baseline results were generated using release v3.4.3, with hash
b4cbecbc0781e89a309d03b60a1f75f8499250e6 from 2015-02-25 12:15:33+00:00


     benchmark       unit    change since    change since
                             last run        v3.4.3

:-)  django_v2       sec       1.12735%        7.47953%
:-(  pybench         sec      -0.53822%       -2.40216%
:-(  regex_v8        sec       0.61774%       -2.32010%
:-|  nbody           sec       1.75860%       -0.76206%
:-)  json_dump_v2    sec       2.13422%       -0.56930%


Our lab does a nightly source pull and build of the Python project and measures
performance changes against the previous stable version and the previous nightly
measurement. This is provided as a service to the community so that quality
issues with current hardware can be identified quickly.

Intel technologies' features and benefits depend on system configuration and may
require enabled hardware, software or service activation. Performance varies
depending on system configuration.  No license (express or implied, by estoppel
or otherwise) to any intellectual property rights is granted by this document.
Intel disclaims all express and implied warranties, including without
limitation, the implied warranties of merchantability, fitness for a particular
purpose, and non-infringement, as well as any warranty arising from course of
performance, course of dealing, or usage in trade.  This document may contain
information on products, services and/or processes in development. Contact your
Intel representative to obtain the latest forecast, schedule, specifications and
roadmaps.  The products and services described may contain defects or errors
known as errata which may cause deviations from published specifications.
Current characterized errata are available on request.

(C) 2015 Intel Corporation.


Re: [Python-Dev] Building python 2.7.10 for Windows from source

2015-07-24 Thread Zachary Ware
On Jul 24, 2015 8:30 AM, "Mark Kelley"  wrote:
>
> I have been using Python for some time but it's been a decade since
> I've tried to build it from source, back in the 2.4 days.  Things seem
> to have gotten a little more complicated now.
>
> I've read through the PCBuild/README file and got most stuff
> compiling.  I find it a little odd that there are special instructions
> for building the release version of Tcl/Tk.  Is that what the
> developers actually do when they cut a release, or is there some
> other, top-level script that does this automatically? It just seems
> odd.

That used to be standard procedure, yes. However, I just recently
backported the project files from 3.5, which include project files for
building Tcl/Tk and Tix in both Debug and Release configurations, so I may
have missed some stuff that could be removed from PCbuild/readme.txt. You
do need some extra stuff to build 2.7 with its new project files, though
(which I know is now covered in readme.txt). There hasn't been a release
with those project files yet, though; they're just in the hg repo.

> Anyhow, my specific question is around the distutils wininst stubs,
> provided as binaries in the release tarball.  Where can I find the
> source files that those binaries are built from?

I believe the source for those is in PC/bdist_wininst/, or some very
similar path.

Hope this helps,
--
Zach
(On a phone)


Re: [Python-Dev] Benchmark Results for Python Default 2015-07-24

2015-07-24 Thread Serhiy Storchaka

On 24.07.15 15:34, lp_benchmark_robot wrote:

Hi Internals,

This is the first message from Intel's language optimization team.
We would like to provide the Python internals developer community
with a daily service which will monitor latest committed patches
performance regressions against well known workloads.
Our aim is to run a multitude of workloads as well as real-life scenarios
which the community considers relevant. The service will send daily bulletins
containing latest measurements for daily variations and variations against
latest stable release run on our Intel-enabled servers.

The community's feedback is very important for us. For any questions,
comments or suggestions you can also contact us on our mailing list
l...@lists.01.org. You can also check our website: https://www.01.org/lp


It is cool! Thank you, it's what we need.

But perhaps it would be better to post these reports on
python-check...@python.org instead of the Python-Dev list, with Reply-To set 
to python-dev@python.org.




Re: [Python-Dev] PEP 447 (type.__getdescriptor__)

2015-07-24 Thread Nick Coghlan
On 23 July 2015 at 03:12, Steve Dower  wrote:
> Terry Reedy wrote:
>> On 7/22/2015 3:25 AM, Ronald Oussoren wrote:
>>> Hi,
>>>
>>> Another summer with another EuroPython, which means it’s time again to
>>> try to revive PEP 447…
>>>
>>> I’ve just pushed a minor update to the PEP and would like to get some
>>> feedback on this, arguably fairly esoteric, PEP.
>>
>> Yeh, a bit too esoteric for most of us to review. For instance, it is not
>> obvious to me, not familiar with internal details, after reading the intro,
>> why a custom __getattribute__ is not enough and why __getdescriptor__ would
>> be needed. If Guido does not want to review this, you need to find a PEP
>> BDFL for this.
>>
>> There are two fairly obvious non-esoteric questions:
>>
>> 1. How does this impact speed (updated section needed)?
>
> Agreed, this is important. But hopefully it's just a C indirection (or better 
> yet, a null check) for objects that don't override __getdescriptor__.
>
>> 2. Is this useful, that you can think of, for anything other than connecting 
>> to
>> Objective C?
>
> There are other object models that would benefit from this, but I don't 
> recall that we came up with uses other than "helps proxy to objects where 
> listing all members eagerly is expensive and/or potentially incorrect". Maybe 
> once you list all the operating systems that are now using dynamic 
> object-oriented APIs rather than flat APIs (Windows, iOS, Android, ... 
> others?) this is good enough?

"better bridging to other languages and runtimes" is a good enough
rationale for me, although I also wonder if it might be useful for
making some interesting COM and dbus based API wrappers.

Ronald, could you dig up a reference to the last thread (or threads)
on this? My recollection is that we were actually pretty happy with
it, and it was just set aside through lack of time to push it through
to completion.

Regards,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Benchmark Results for Python Default 2015-07-24

2015-07-24 Thread Nick Coghlan
On 24 July 2015 at 23:55, Serhiy Storchaka  wrote:
> On 24.07.15 15:34, lp_benchmark_robot wrote:
>> The community's feedback is very important for us. For any questions,
>> comments or suggestions you can also contact us on our mailing list
>> l...@lists.01.org. You can also check our website: https://www.01.org/lp
>
> It is cool! Thank you, it's what we need.

Indeed!

> But perhaps it would be better to post these reports on
> python-check...@python.org instead of Python-Dev list, with Reply-To set to
> python-dev@python.org.

Aye, python-checkins is a better option for automated daily posts.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] PEP 447 (type.__getdescriptor__)

2015-07-24 Thread Ronald Oussoren

> On 24 Jul 2015, at 16:17, Nick Coghlan  wrote:
> 
> On 23 July 2015 at 03:12, Steve Dower  wrote:
>> Terry Reedy wrote:
>>> On 7/22/2015 3:25 AM, Ronald Oussoren wrote:
 Hi,
 
 Another summer with another EuroPython, which means it’s time again to
 try to revive PEP 447…
 
 I’ve just pushed a minor update to the PEP and would like to get some
 feedback on this, arguably fairly esoteric, PEP.
>>> 
>>> Yeh, a bit too esoteric for most of us to review. For instance, it is not
>>> obvious to me, not familiar with internal details, after reading the intro,
>>> why a custom __getattribute__ is not enough and why __getdescriptor__ would
>>> be needed. If Guido does not want to review this, you need to find a PEP
>>> BDFL for this.
>>> 
>>> There are two fairly obvious non-esoteric questions:
>>> 
>>> 1. How does this impact speed (updated section needed)?
>> 
>> Agreed, this is important. But hopefully it's just a C indirection (or 
>> better yet, a null check) for objects that don't override __getdescriptor__.
>> 
>>> 2. Is this useful, that you can think of, for anything other than 
>>> connecting to
>>> Objective C?
>> 
>> There are other object models that would benefit from this, but I don't 
>> recall that we came up with uses other than "helps proxy to objects where 
>> listing all members eagerly is expensive and/or potentially incorrect". 
>> Maybe once you list all the operating systems that are now using dynamic 
>> object-oriented APIs rather than flat APIs (Windows, iOS, Android, ... 
>> others?) this is good enough?
> 
> "better bridging to other languages and runtimes" is a good enough
> rationale for me, although I also wonder if it might be useful for
> making some interesting COM and dbus based API wrappers.
> 
> Ronald, could you dig up a reference to the last thread (or threads)
> on this? My recollection is that we were actually pretty happy with
> it, and it was just set aside through lack of time to push it through
> to completion.

I’ll do some digging in my archives. From what I recall you and Steve were 
positive the last time around and others didn’t have much to add at the time.

FWIW Guido was positive about the idea, but would really like to see up-to-date
benchmark results and some specific micro-benchmarks to check whether the change
has a negative performance impact.

I do have an API design question now that I’m working on this again: the PEP
proposed to add a __getdescriptor__ method to the metatype; that is, you’d
define it as:

    class MyMeta(type):
        def __getdescriptor__(self, name): ...

    class MyType(object, metaclass=MyMeta):
        pass

This doesn’t match how other special slots are done, in particular __new__. I’d
like to switch the definition to:

    class MyType:
        @classmethod
        def __getdescriptor__(cls, name): ...

I have two questions about that: (1) is this indeed a better interface, and (2)
should users explicitly use the classmethod decorator, or would it be better to
match the behaviour of __new__ by leaving it out? Personally I do think
that this is a better interface, but I’m not sure about requiring the decorator.
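
For reference, __new__ is implicitly converted to a static method even though
it receives the class as its first argument, so no decorator is needed there.
A tiny illustration (class name invented for the example):

    class Spam:
        def __new__(cls):              # no @staticmethod required
            print("creating", cls.__name__)
            return super().__new__(cls)

    Spam()                             # prints "creating Spam"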

Ronald

P.S. I’m fighting with refcounting between sessions: forward-porting the patch 
for this PEP seems to have introduced a refcount problem. Nothing that cannot 
be fixed during the sprints, though.


> 
> Regards,
> Nick.
> 
> -- 
> Nick Coghlan   |   ncogh...@gmail.com    |   
> Brisbane, Australia


Re: [Python-Dev] Benchmark Results for Python Default 2015-07-24

2015-07-24 Thread Brett Cannon
Should we discuss whether these are the benchmarks we want daily reports on (you
can see what the benchmark suite has at
https://hg.python.org/benchmarks/file/2979f5ce6a0c/perf.py#l2243 )? I
personally would prefer dropping pybench and replacing it with a startup
measurement.

On Fri, Jul 24, 2015, 07:23 Nick Coghlan  wrote:

> On 24 July 2015 at 23:55, Serhiy Storchaka  wrote:
> > On 24.07.15 15:34, lp_benchmark_robot wrote:
> >> The community's feedback is very important for us. For any questions,
> >> comments or suggestions you can also contact us on our mailing list
> >> l...@lists.01.org. You can also check our website: https://www.01.org/lp
> >
> > It is cool! Thank you, it's what we need.
>
> Indeed!
>
> > But perhaps it would be better to post these reports on
> > python-check...@python.org instead of Python-Dev list, with Reply-To
> set to
> > python-dev@python.org.
>
> Aye, python-checkins is a better option for automated daily posts.
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] PEP 447 (type.__getdescriptor__)

2015-07-24 Thread Brett Cannon
 On Fri, Jul 24, 2015, 07:43 Ronald Oussoren  wrote:

On 24 Jul 2015, at 16:17, Nick Coghlan  wrote:

On 23 July 2015 at 03:12, Steve Dower  wrote:

 Terry Reedy wrote:

 On 7/22/2015 3:25 AM, Ronald Oussoren wrote:

  Hi,

Another summer with another EuroPython, which means it’s time again to
try to revive PEP 447…

I’ve just pushed a minor update to the PEP and would like to get some
feedback on this, arguably fairly esoteric, PEP.

   Yeh, a bit too esoteric for most of us to review. For instance, it is not
obvious to me, not familiar with internal details, after reading the intro, why
a custom __getattribute__ is not enough and why __getdescriptor__ would be
needed. If Guido does not want to review this, you need to find a PEP BDFL for
this.

There are two fairly obvious non-esoteric questions:

1. How does this impact speed (updated section needed)?

  Agreed, this is important. But hopefully it's just a C indirection (or
better yet, a null check) for objects that don't override __getdescriptor__.

 2. Is this useful, that you can think of, for anything other than
connecting to
Objective C?

  There are other object models that would benefit from this, but I don't
recall that we came up with uses other than "helps proxy to objects where
listing all members eagerly is expensive and/or potentially incorrect".
Maybe once you list all the operating systems that are now using dynamic
object-oriented APIs rather than flat APIs (Windows, iOS, Android, ...
others?) this is good enough?

  "better bridging to other languages and runtimes" is a good enough
rationale for me, although I also wonder if it might be useful for
making some interesting COM and dbus based API wrappers.

Ronald, could you dig up a reference to the last thread (or threads)
on this? My recollection is that we were actually pretty happy with
it, and it was just set aside through lack of time to push it through
to completion.

 I’ll do some digging in my archives. From what I recall you and Steve were
positive the last time around and others didn’t have much to add at the
time.

FWIW Guido was positive about the idea, but would really like to see up to
date benchmark results and some specific micro benchmarking to see if the
change has negative performance impact.

I do have an API design question now that I’m working on this again: the PEP
proposed to add a __getdescriptor__ method to the metatype; that is, you’d
define it as:

    class MyMeta(type):
        def __getdescriptor__(self, name): ...

    class MyType(object, metaclass=MyMeta):
        pass

This doesn’t match how other special slots are done, in particular __new__.
I’d like to switch the definition to:

    class MyType:
        @classmethod
        def __getdescriptor__(cls, name): ...

I have two questions about that: (1) is this indeed a better interface, and
(2) should users explicitly use the classmethod decorator, or would it be
better to match the behaviour of __new__ by leaving it out?
Personally I do think that this is a better interface, but I’m not sure
about requiring the decorator.


Leave the decorator out, like __new__; otherwise people are bound to forget
it and have a hard time debugging why their code doesn't work.

-Brett


Ronald

P.S. Fighting with refcounting between sessions, forward porting of the
patch for this PEP seems to have introduced a refcount problem. Nothing
that cannot be fixed during the sprints though.


Regards,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Building python 2.7.10 for Windows from source

2015-07-24 Thread Steve Dower
Those files should be under the PC folder.

Building 3.5 and onwards is a much more pleasant experience, and many of those 
improvements have been backported for 2.7.11.

Cheers,
Steve

Top-posted from my Windows Phone

From: Mark Kelley
Sent: 7/24/2015 6:30
To: Python-Dev@python.org
Subject: [Python-Dev] Building python 2.7.10 for Windows from source

I have been using Python for some time but it's been a decade since
I've tried to build it from source, back in the 2.4 days.  Things seem
to have gotten a little more complicated now.

I've read through the PCBuild/README file and got most stuff
compiling.  I find it a little odd that there are special instructions
for building the release version of Tcl/Tk.  Is that what the
developers actually do when they cut a release, or is there some
other, top-level script that does this automatically? It just seems
odd.

Anyhow, my specific question is around the distutils wininst stubs,
provided as binaries in the release tarball.  Where can I find the
source files that those binaries are built from?

Many thanks,
Mark.


Re: [Python-Dev] PEP 447 (type.__getdescriptor__)

2015-07-24 Thread Nick Coghlan
On 25 July 2015 at 00:50, Brett Cannon  wrote:
> Leave the decorator out like __new__, otherwise people are bound to forget
> it and have a hard time debugging why their code doesn't work.

I'd actually advocate for keeping this as a metaclass method, rather
than making it available to any type instance. The key thing to
consider for me is "What additional power does making it a method on
the class itself grant to mixin types?"

With PEP 487, the __init_subclass__ proposal only grants mixins the
power to implicitly run additional code when new subclasses are
defined. They have no additional ability to influence the behaviour of
the specific class adding the mixin into the inheritance hierarchy.

With PEP 447, as currently written, a mixin that wants to alter how
descriptors are looked up will be able to do so implicitly as long as
there are no other custom metaclasses in the picture. As soon as there
are *two* custom metaclasses involved, you'll get an error at
definition time and have to sort out how you want the metaclass
inheritance to work and have a chance to notice if there are two
competing __getdescriptor__ implementations.
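
(For anyone who hasn't hit it: the definition-time error referred to here is the
standard "metaclass conflict" TypeError. A minimal illustration, with made-up
metaclass names:

    class MetaA(type): pass
    class MetaB(type): pass

    class A(metaclass=MetaA): pass
    class B(metaclass=MetaB): pass

    try:
        class C(A, B): pass        # two unrelated custom metaclasses
    except TypeError as exc:
        print(exc)                 # "metaclass conflict: ..."

so any competing hooks have to be reconciled explicitly before the combined
class can even be created.)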

However, if __getdescriptor__ moves to being a class method on object
rather than an instance method on type, then you'll lose that
assistance from the metaclass checker - if you have two classes in
your MRO with mutually incompatible __getdescriptor__ implementations,
you're likely to be in for a world of pain as you try to figure out
the source of any related bugs.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


[Python-Dev] Summary of Python tracker Issues

2015-07-24 Thread Python tracker

ACTIVITY SUMMARY (2015-07-17 - 2015-07-24)
Python tracker at http://bugs.python.org/

To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.

Issues counts and deltas:
  open    4957 (+10)
  closed 31511 (+43)
  total  36468 (+53)

Open issues with patches: 2254 


Issues opened (34)
==================

#23591: Add IntFlags
http://bugs.python.org/issue23591  reopened by r.david.murray

#24619: async/await parser issues
http://bugs.python.org/issue24619  reopened by yselivanov

#24653: Mock.assert_has_calls([]) is surprising for users
http://bugs.python.org/issue24653  reopened by rbcollins

#24657: CGIHTTPServer module discard continuous '/' letters from param
http://bugs.python.org/issue24657  opened by takayuki

#24658: open().write() fails on 2 GB+ data (OS X)
http://bugs.python.org/issue24658  opened by lebigot

#24659: dict() built-in fails on iterators with a "keys" attribute
http://bugs.python.org/issue24659  opened by christian.barcenas

#24661: CGIHTTPServer: premature unescaping of query string
http://bugs.python.org/issue24661  opened by johnseman

#24665: CJK support for textwrap
http://bugs.python.org/issue24665  opened by fgallaire

#24666: Buffered I/O does not take file position into account when rea
http://bugs.python.org/issue24666  opened by ericpruitt

#24667: OrderedDict.popitem()/__str__() raises KeyError
http://bugs.python.org/issue24667  opened by xZise

#24668: Deprecate 0 as a synonym for 0
http://bugs.python.org/issue24668  opened by steven.daprano

#24670: os.chdir breaks result of os.path.abspath(__file__) and os.pat
http://bugs.python.org/issue24670  opened by LordBlick

#24671: idlelib 2.7: finish converting print statements
http://bugs.python.org/issue24671  opened by terry.reedy

#24672: shutil.rmtree failes on non ascii filenames
http://bugs.python.org/issue24672  opened by Steffen Kampmann

#24673: distutils/_msvccompiler does not remove /DLL during link(CComp
http://bugs.python.org/issue24673  opened by James Salter

#24674: pyclbr not recursively showing classes in packages
http://bugs.python.org/issue24674  opened by worenklein

#24681: Put most likely test first in set_add_entry()
http://bugs.python.org/issue24681  opened by rhettinger

#24682: Add Quick Start: Communications section to devguide
http://bugs.python.org/issue24682  opened by willingc

#24683: Type confusion in json encoding
http://bugs.python.org/issue24683  opened by pkt

#24684: socket.getaddrinfo(host) doesn't ensure that host.encode() ret
http://bugs.python.org/issue24684  opened by pkt

#24685: collections.OrderedDict collaborative subclassing
http://bugs.python.org/issue24685  opened by eric.frederich

#24686: zipfile is intolerant of extra bytes
http://bugs.python.org/issue24686  opened by Devin Fisher

#24689: Add tips for effective online communication to devguide
http://bugs.python.org/issue24689  opened by willingc

#24691: out of memory in distutils.upload with large files
http://bugs.python.org/issue24691  opened by Jan.Stürtz

#24692: types.coroutines() idempotence documentation
http://bugs.python.org/issue24692  opened by seirl

#24693: zipfile: change RuntimeError to more appropriate exception typ
http://bugs.python.org/issue24693  opened by serhiy.storchaka

#24696: Don't use None as sentinel for traceback
http://bugs.python.org/issue24696  opened by Drekin

#24697: Add CoroutineReturn and CoroutineExit builtin exceptions for c
http://bugs.python.org/issue24697  opened by yselivanov

#24698: get_externals.bat script fails
http://bugs.python.org/issue24698  opened by Alex Budovski

#24699: TemporaryDirectory is cleaned up twice
http://bugs.python.org/issue24699  opened by Ilya.Kulakov

#24700: array compare is hideously slow
http://bugs.python.org/issue24700  opened by swanson

#24705: sysconfig._parse_makefile doesn't expand ${} vars appearing be
http://bugs.python.org/issue24705  opened by doko

#24706: poplib: Line too long error causes knock-on failure to retriev
http://bugs.python.org/issue24706  opened by Chris Smowton

#24707: Assertion failed in pymonotonic_new
http://bugs.python.org/issue24707  opened by berker.peksag



Most recent 15 issues with no replies (15)
==========================================

#24707: Assertion failed in pymonotonic_new
http://bugs.python.org/issue24707

#24706: poplib: Line too long error causes knock-on failure to retriev
http://bugs.python.org/issue24706

#24696: Don't use None as sentinel for traceback
http://bugs.python.org/issue24696

#24693: zipfile: change RuntimeError to more appropriate exception typ
http://bugs.python.org/issue24693

#24691: out of memory in distutils.upload with large files
http://bugs.python.org/issue24691

#24673: distutils/_msvccompiler does not remove /DLL during link(CComp
http://bugs.python.org/issue24673

#24666: Buffered I/O does not take file position into account when rea
http://bugs.python.org/issue24666

#24657: CGIHTTPServer module discard continuous '/' letters from param

Re: [Python-Dev] PEP 447 (type.__getdescriptor__)

2015-07-24 Thread Ronald Oussoren

> On 24 Jul 2015, at 17:29, Nick Coghlan  wrote:
> 
> On 25 July 2015 at 00:50, Brett Cannon  wrote:
>> Leave the decorator out like __new__, otherwise people are bound to forget
>> it and have a hard time debugging why their code doesn't work.
> 
> I'd actually advocate for keeping this as a metaclass method, rather
> than making it available to any type instance. The key thing to
> consider for me is "What additional power does making it a method on
> the class itself grant to mixin types?”

To be honest, I hadn’t considered mixin types yet. 

> 
> With PEP 487, the __init_subclass__ proposal only grants mixins the
> power to implicitly run additional code when new subclasses are
> defined. They have no additional ability to influence the behaviour of
> the specific class adding the mixin into the inheritance hierarchy.
> 
> With PEP 447, as currently written, a mixin that wants to alter how
> descriptors are looked up will be able to do so implicitly as long as
> there are no other custom metaclasses in the picture. As soon as there
> are *two* custom metaclasses involved, you'll get an error at
> definition time and have to sort out how you want the metaclass
> inheritance to work and have a chance to notice if there are two
> competing __getdescriptor__ implementations.
> 
> However, if __getdescriptor__ moves to being a class method on object
> rather than an instance method on type, then you'll lose that
> assistance from the metaclass checker - if you have two classes in
> your MRO with mutually incompatible __getdescriptor__ implementations,
> you're likely to be in for a world of pain as you try to figure out
> the source of any related bugs.

That’s a good point, and something that will move something that I’ve 
wanted to look into forward on my list: the difference between a
classmethod and a method on the class defined through a metaclass.

The semantics I’d like to have is that __getdescriptor__ is a local decision,
defining __getdescriptor__ for a class should only affect that class and its
subclass, and shouldn’t affect how superclasses are handled by __getattribute__.
That is something that can be done by defining __getdescriptor__ on a metaclass,
and AFAIK requires active cooperation when using a @classmethod.

It should be possible to demonstrate the differences in a pure Python
prototype. 
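
For example, a toy sketch of the difference (invented names, and only an
approximation of the real lookup that lives inside object.__getattribute__):

    class Meta(type):
        def __getdescriptor__(cls, name):
            print("custom lookup on", cls.__name__, "for", name)
            return cls.__dict__.get(name)

    def lookup(klass, name):
        # Walk the MRO; each class is consulted via *its own* metaclass hook,
        # so the behaviour stays local to classes whose metaclass defines it.
        for c in klass.__mro__:
            hook = getattr(type(c), "__getdescriptor__", None)
            if hook is not None:
                found = hook(c, name)          # custom per-class behaviour
            else:
                found = c.__dict__.get(name)   # default: plain dict lookup
            if found is not None:
                return found
        raise AttributeError(name)

    class Base:                                # metaclass is plain `type`
        spam = "base spam"

    class Derived(Base, metaclass=Meta):
        pass

    print(lookup(Derived, "spam"))   # only Derived uses the custom hook

With a plain @classmethod on the class, the most derived override would
presumably be asked about every entry in the MRO, superclasses included, which
is where the active cooperation comes in.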

Ronald

> 
> Cheers,
> Nick.
> 
> -- 
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Benchmark Results for Python Default 2015-07-24

2015-07-24 Thread Victor Stinner
Hi,

I don't know if it's related, but at EuroPython I saw a new website
which can also help:

http://pybenchmarks.org/

Victor


[Python-Dev] PyCapsule_Import semantics, relative imports, module names etc.

2015-07-24 Thread John Dennis
While porting several existing CPython extension modules that form a package 
to be 2.7- and 3.x-compatible, the existing PyCObject_* API was replaced with 
PyCapsule_*. This introduced some issues on which the existing CPython docs 
are silent. I'd like clarification on a few issues and wish to raise some 
questions.


1. Should an extension module name as provided to PyModule_Create (Py3) 
or Py_InitModule3 (Py2) be fully package-qualified, or just the module 
name? I believe it's just the module name (see item 5 below). Yes/no?


2. PyCapsule_Import does not adhere to the general import semantics. The 
module name must be fully qualified, relative imports are not supported.


3. PyCapsule_Import requires the package (e.g. __init__.py) to import 
*all* of its submodules which utilize the PyCapsule mechanism, 
preventing lazy on-demand loading. This is because PyCapsule_Import only 
imports the top-level module (e.g. the package). From there it iterates 
over each of the module names in the module path. However, the parent 
module (e.g. globals) will not contain an attribute for the submodule 
unless it's already been loaded. If the submodule has not been loaded 
into the parent, PyCapsule_Import throws an error instead of trying to 
load the submodule. The only apparent solution is for the package to 
load every possible submodule, whether required or not, just to avoid a 
loading error. The inability to load modules on demand seems like a 
design flaw and a change in semantics from the prior use of 
PyImport_ImportModule in combination with PyCObject. [One of the nice 
features of normal import loading is that it sets the submodule name in the 
parent; the fact that this step is omitted is what causes PyCapsule_Import to 
fail unless all submodules are unconditionally loaded.] Shouldn't 
PyCapsule_Import utilize PyImport_ImportModule?


4. Relative imports seem much more useful for cooperating submodules in 
a package as opposed to fully qualified package names. Being able to 
import a C_API from the current package (the package I'm a member of) 
seems much more elegant and robust for cooperating modules, but this 
semantic isn't supported (in fact the leading-dot syntax completely 
confuses PyCapsule_Import; the docs should clarify this).


5. The requirement that a module specify its name as unqualified when 
it is initializing, but then also use a fully qualified package 
name for PyCapsule_New, both of which occur inside the same 
initialization function, seems like an odd inconsistency (documentation 
clarification would help here). Also, depending on your point of view, 
package names could be considered a deployment/packaging decision; a 
module obtains its fully qualified name by virtue of its position in 
the filesystem, something the module will not be aware of at compile 
time, another reason why relative imports make sense. Note the identical 
comment regarding _Py_PackageContext in modsupport.c (Py2) and 
moduleobject.c (Py3) regarding how a module obtains its fully qualified 
package name (see item 1).


Thanks!

--
John


Re: [Python-Dev] Status on PEP-431 Timezones

2015-07-24 Thread ISAAC J SCHWABACHER
Well, I was going to stay silent, but math is something I can do without 
wasting anyone's time or embarrassing myself. I don't think this mail answers 
Lennart's concerns, but I do want to get it out there to compete with the 
comment in `datetime.py`. I apologize if the LaTeX density is too high; I don't 
trust that my e-mail client would transmit the message faithfully were I to 
render it myself.

I disagree with the view Tim had of time zones when he wrote that comment (and 
that code). It sounds like he views US/Eastern and US/Central as time zones 
(which they are), but thinks of the various America/Indiana zones as switching 
back and forth between them, rather than being time zones in their own right. I 
think the right perspective is that a time zone *is* the function that its 
`fromutc()` method implements, although of course we need additional 
information in order to actually compute (rather than merely mathematically  
define) its inverse. Daylight Saving Time is a red herring, and assumptions 2 
and 4 in that exposition are just wrong from this point of view. In the worst 
case, Asia/Riyadh's two years of solar time completely shatter these 
assumptions.

I'm convinced that the right viewpoint on this is to view local time and UTC 
time each as isomorphic to $\RR$ (i.e., effectively as UNIX timestamps, minus 
the oft-violated guarantee that timestamps are in UTC), and to consider the 
time zone as
    \[ fromutc : \RR \to \RR. \]
(Leap seconds are a headache for this perspective, but it can still support 
them with well-placed epicycles.) Then our assumptions (inspired by zoneinfo) 
about the nature of this map are as follows:

* $fromutc$ is piecewise defined, with each piece being continuous and strictly 
monotonic increasing.  Let us call the set of discontinuities $\{ utc_i \in \RR 
| i \in \ZZ \}$, where the labels are in increasing order, and define 
$fromutc_i$ to be the $i$-th  piece. (The theoretical treatment doesn't suffer 
if there are only finitely many discontinuities, since we can place additional 
piece boundaries at will where no discontinuities exist; obviously, an 
implementation would not take this view.) 
* The piece $fromutc_i : [utc_i, utc_{i+1}) \to [local_{start, i}, local_{end, 
i})$ and its inverse, which we will call $fromlocal_i$, are both readily 
computable.  In particular, this means that $local_{start, i} = fromutc(utc_i)$ 
and $local_{end, i}$ is the limit of $fromutc(t)$ as $t$ approaches $utc_{i+1}$ 
from the left, and that these values are known.  Note that the (tzfile(5))[1] 
format and (zic(8)[2]) both assume that $fromutc_i$ is of the form $t \mapsto t 
+ off_i$, where $off_i$ is a constant.  This assumption is true in practice, 
but is stronger than we actually need.
* The sequences $\{ local_{start, i} | i \in \ZZ \}$ and $\{ local_{end, i} | i 
\in \ZZ \}$ are strictly increasing, and $local_{end, i-1} < local_{start, 
i+1}$ for all $i \in \ZZ$.  This final condition is enough to guarantee that 
the preimage of any local time under $fromutc$ contains at most two UTC times.  
This assumption would be violated if, for example, some jurisdiction decided to 
fall back two hours by falling back one hour and then immediately falling back 
a second hour.  I recommend the overthrow of any such jurisdiction and its 
(annexation by the Netherlands)[3].

Without the third assumption, it's impossible to specify a UTC time by a (local 
time, time zone, DST flag) triple since there may be more than two UTC times 
corresponding to the same local time, and computing $fromlocal$ becomes more 
complicated, but the problem can still be solved by replacing the DST flag by 
an index into the preimage.  (Lennart, I think this third assumption is the 
important part of your "no changes within 48 hours of each other" assumption, 
which is violated by Asia/Riyadh. Is it enough?)

Once we take this view, computing $fromutc(t)$ is trivial: find $i$ with $utc_i 
\le t < utc_{i+1}$ by binary search (presumably optimized to an $O(1)$ average 
case by using a good initial guess), and compute $fromutc_i(t)$.

Computing $fromlocal(t)$ is somewhat more difficult.  The first thing to 
address is that, as written, $fromlocal$ is not a function; in order to make it 
one, we need to pass it more information.  We could define $fromlocal(t, i) = 
fromlocal_i(t)$, but that's too circular to be useful.  Likewise with my 
(silly) earlier proposal to store $(local, offset)$ pairs-- then $fromlocal(t, 
off) = t - off$.  What we really need is a (partial, which is better than 
multi-valued!) function $fromlocal : \RR \times \{True, False\} \to \RR$ that 
takes a local time and a DST flag and returns a UTC time.  We define 
$fromlocal(local, flag)$ to be the first $utc \in \RR$ such that $fromutc(utc) 
= local$ when $flag$ is $True$ and the last such $utc$ when $flag$ is $False$.  
(Our implementation will presumably also allow $flag$ to be $None$, in which 
case we require $utc$ to be unique.)
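
A toy sketch of this machinery (invented transition data, and assuming each
piece has the constant-offset form $t \mapsto t + off_i$ mentioned above):

    import bisect

    UTC_BOUNDARIES = [0, 1000000, 2000000]   # utc_0, utc_1, utc_2
    OFFSETS = [-18000, -14400, -18000]       # off_0, off_1, off_2 (seconds)

    def fromutc(t):
        # Locate the piece containing t, then apply that piece's offset.
        i = max(bisect.bisect_right(UTC_BOUNDARIES, t) - 1, 0)
        return t + OFFSETS[i]

    def fromlocal(local, flag=True):
        # Collect every piece whose local-time image contains `local`, then
        # let the flag pick the first (True) or last (False) UTC preimage.
        candidates = []
        for i, off in enumerate(OFFSETS):
            start = UTC_BOUNDARIES[i]
            end = UTC_BOUNDARIES[i + 1] if i + 1 < len(OFFSETS) else float("inf")
            t = local - off
            if start <= t < end:
                candidates.append(t)
        if not candidates:
            return None              # local time skipped by a forward jump
        return min(candidates) if flag else max(candidates)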

Re: [Python-Dev] Status on PEP-431 Timezones

2015-07-24 Thread Mark Lawrence

On 25/07/2015 00:06, ISAAC J SCHWABACHER wrote:


I got to "Daylight Saving Time is a red herring," and stopped reading.

--
My fellow Pythonistas, ask not what our language can do for you, ask
what you can do for our language.

Mark Lawrence



Re: [Python-Dev] Status on PEP-431 Timezones

2015-07-24 Thread Tim Peters
[ISAAC J SCHWABACHER ]
> ...
> I disagree with the view Tim had of time zones when he wrote that comment
> (and that code). It sounds like he views US/Eastern and US/Central as time
> zones (which they are), but thinks of the various America/Indiana zones as
> switching back and forth between them, rather than being time zones in their
> own right

You can think of them any way you like.  The point of the code was to
provide a simple & efficient way to convert from UTC to local time in
all "time zones" in known actual use at the time; the point of the
comment was to explain the limitations of the code.  Although, as
Alexander noted, the stated assumptions are stronger than needed.

> I think the right perspective is that a time zone *is* the function that its
> `fromutc()` method implements,

Fine by me ;-)

> although of course we need additional information in order to actually
> compute (rather than merely mathematically define) its inverse. Daylight 
> Saving
> Time is a red herring,

Overstated.  DST is in fact the _only_ real complication in 99.99% of
time zones (perhaps even 99.9913% ;-) ).  As the docs say, if you have
some crazy-ass time zone in mind, fine, that's why fromutc() was
exposed (so your crazy-ass tzinfo class can override it).

> and assumptions 2 and 4

Nitpick:  4 is a consequence of 2, not an independent assumption.

> in that exposition are just wrong from this point of view.

As above, there is no particular POV in this code:  just a specific
fromutc() implementation, comments that explain its limitations, and
an invitation in the docs to override it if it's not enough for your
case.

> In the worst case, Asia/Riyadh's two years of solar time completely shatter
> these assumptions.

Sure.  But, honestly, who cares?  Riyadh Solar Time was so
off-the-wall that even the Saudis gave up on it 25 years ago (after a
miserable 3-year experiment with it).  "Practicality beats purity".

> [eliding a more-general view of what time zones "really" are]

I'm not eliding it because I disagree with it, but because time zones
are political constructions.  "The math" we make up may or may not be
good enough to deal with all future political abominations; for
example:

> ...
> This assumption would be violated if, for example, some jurisdiction
> decided to fall back two hours by falling back one hour and then
> immediately falling back a second hour.  I recommend the overthrow
> of any such jurisdiction and its (annexation by the Netherlands)[3].

That's not objectively any more bizarre than Riyadh Solar Time.
Although, if I've lived longer than you, I may be more wary about the
creative stupidity of political schemes ;-)


> ... (Lennart, I think this third assumption is the important part of your "no
> changes within 48 hours of each other" assumption,

The "48 hours" bit came from Alexander.  I'm personally unclear on
what Lennart's problems are.

> ...
> All of these computations can be accomplished by searches of ordered lists
> and applications of $fromlocal_i$.

Do you have real-world use cases in mind beyond supporting
long-abandoned Riyadh Solar time?

> ...
> With this perspective, arithmetic becomes "translate to UTC, operate, 
> translate
> back", which is as it should be.

There _was_ a POV in the datetime design about that:  no, that's not
how it should be.  Blame Guido ;-)  If I add, say, 24 hours to noon
today, I want to get noon tomorrow, and couldn't care less whether DST
started or stopped (or any other political adjustment was made) in
between.  For that reason, it was wholly intentional that datetime +
timedelta treats datetime as "naive".  If that's not what someone
wants, fine, but then they don't want Python's datetime arithmetic
BTW, there's no implication that they're "wrong" for wanting something
different; what would be wrong is insisting that datetime's POV is
"wrong".  Both views are valid and useful, depending on the needs of
the application.  One had to be picked as the built-in behavior, and
"naive" won.

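To make the naive-arithmetic point concrete (dates chosen to straddle the
2015 US spring-forward, purely as an illustration):

    from datetime import datetime, timedelta

    noon = datetime(2015, 3, 7, 12, 0)    # naive local noon, day before DST starts
    print(noon + timedelta(hours=24))     # 2015-03-08 12:00:00 -- still noon

The political transition in between simply doesn't enter into it.
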
> ...
> But IIUC what Lennart is complaining about

I don't, and I wish he would be more explicit about what "the
problem(s)" is(are).

> is the fact that the DST flag isn't part of and can't be embedded into a 
> local time,
> so it's impossible to fold the second parameter to $fromlocal$ into $t$.  
> Without
> that, a local time isn't rich enough to designate a single point in time and 
> the
> whole edifice breaks.

You can blame Guido for that too ;-) , but in this case I disagree(d)
with him:  Guido was overly (IMO) annoyed that the only apparent
purpose of a struct tm's tm_isdst flag was to disambiguate local
times in a relative handful of cases.  His thought:  an entire bit
just for that?!  My thought:  get over it, it's one measly bit.

my-kingdom-for-bit-ingly y'rs  - tim

Re: [Python-Dev] Status on PEP-431 Timezones

2015-07-24 Thread Alexander Belopolsky
On Fri, Jul 24, 2015 at 9:39 PM, Tim Peters  wrote:

> > But IIUC what Lennart is complaining about
>
> I don't, and I wish he would be more explicit about what "the
> problem(s)" is(are).
>
> > is the fact that the DST flag isn't part of and can't be embedded into a
> local time,
> > so it's impossible to fold the second parameter to $fromlocal$ into
> $t$.  Without
> > that, a local time isn't rich enough to designate a single point in time
> and the
> > whole edifice breaks.
>
> You can blame Guido for that too ;-) , but in this case I disagree(d)
> with him:  Guido was overly (IMO) annoyed that the only apparent
> purpose of a struct tm's tm_isdst flag was to disambiguate local
> times in a relative handful of cases.  His thought:  an entire bit
> just for that?!  My thought:  get over it, it's one measly bit.


IIUC, Lennart came to the (wrong, IMHO) conclusion that one bit is not enough
and that you must either keep datetime in UTC or store the UTC offset with
the datetime.

My position is that one bit is enough to disambiguate local time in all
sane situations, but the name "isdst" is misleading because discontinuities
in the UTC-to-Local function (from now on called L(t)) may be due to causes
other than DST transitions.

The math here is very simple: there are two kinds of discontinuities: you
either move the local clock forward by a certain amount or you move it back.
Let's call these (unimaginatively) discontinuities of the first and second
kind.

When you have a discontinuity of the first kind, you have a range of values u
for which the equation u = L(t) has no solution for t.  However, if we
linearly extrapolate L(t) from before the discontinuity forward, we get a
linear function Lb(t) and we can solve u = Lb(t) for any value of u.  The
problem, however, is that we can also extend L(t) linearly from the time
after the discontinuity to all times and get another function La(t), which
also allows us to solve the equation u = La(t) for all times.  Without user
input, there is no way to tell which solution she expects.  This is the 1 bit
of information that we need.

The situation with a discontinuity of the second kind is similar, but even
simpler.  Here, u = L(t) has two solutions and we need 1 bit of information
to disambiguate.
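
A concrete toy example of the second kind, with invented numbers rather than
any real zone (the offset drops from 3600 to 0 at UTC t = 1000):

    def L(t):
        return t + (3600 if t < 1000 else 0)

    u = 2000                 # a local time inside the repeated interval
    t_first  = u - 3600      # = -1600, falls before the discontinuity
    t_second = u - 0         # =  2000, falls after it
    assert L(t_first) == L(t_second) == u

Both candidates are legitimate preimages of u; the extra bit says which of the
two extrapolations (the "before" one or the "after" one) the user meant.  For a
discontinuity of the first kind the same two extrapolations each yield a
candidate even though neither actually lies on L, so the bit plays the same
role there.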


Re: [Python-Dev] Status on PEP-431 Timezones

2015-07-24 Thread Lennart Regebro
And I would want to remind everyone again that this is not a question
of the problem being impossible. It's just really complex to get right
in all cases, and that always having the UTC timestamp around gets rid
of most of that complexity.

On Sat, Jul 25, 2015 at 3:39 AM, Tim Peters  wrote:
> [ISAAC J SCHWABACHER ]
>> ...
>> I disagree with the view Tim had of time zones when he wrote that comment
>> (and that code). It sounds like he views US/Eastern and US/Central as time
>> zones (which they are), but thinks of the various America/Indiana zones as
>> switching back and forth between them, rather than being time zones in their
>> own right
>
> You can think of them anyway you like.  The point of the code was to
> provide a simple & efficient way to convert from UTC to local time in
> all "time zones" in known actual use at the time; the point of the
> comment was to explain the limitations of the code.  Although, as
> Allexander noted, the stated assumptions are stronger than needed.
>
>> I think the right perspective is that a time zone *is* the function that its
>> `fromutc()` method implements,
>
> Fine by me ;-)
>
>> although of course we need additional information in order to actually
>> compute (rather than merely mathematically define) its inverse. Daylight 
>> Saving
>> Time is a red herring,
>
> Overstated.  DST is in fact the _only_ real complication in 99.99% of
> time zones (perhaps even 99.9913% ;-) ).  As the docs say, if you have
> some crazy-ass time zone in mind, fine, that's why fromutc() was
> exposed (so your; crazy-ass tzinfo class can override it).
>
>> and assumptions 2 and 4
>
> Nitpick:  4 is a consequence of 2, not an independent assumption.
>
>> in that exposition are just wrong from this point of view.
>
> As above, there is no particular POV in this code:  just a specific
> fromutc() implementation, comments that explain its limitations, and
> an invitation in the docs to override it if it's not enough for your
> case.
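
For reference, the override hook being pointed at looks roughly like the
sketch below.  The zone name and its fixed +3:00 offset are invented
purely for illustration; the only point is that astimezone() routes the
UTC-to-local conversion through the target tzinfo's fromutc():

    from datetime import datetime, timedelta, timezone, tzinfo

    class OddballZone(tzinfo):
        """Hypothetical zone, used only to show where a custom rule would live."""
        OFFSET = timedelta(hours=3)

        def utcoffset(self, dt):
            return self.OFFSET

        def dst(self, dt):
            return timedelta(0)

        def tzname(self, dt):
            return "ODD"

        def fromutc(self, dt):
            # astimezone() passes in the UTC clock reading re-tagged with
            # this tzinfo; whatever UTC -> local rule the zone needs goes here.
            return dt + self.OFFSET

    aware_utc = datetime(2015, 7, 25, 12, 0, tzinfo=timezone.utc)
    print(aware_utc.astimezone(OddballZone()))   # 2015-07-25 15:00:00+03:00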
>
>> In the worst case, Asia/Riyadh's two years of solar time completely shatter
>> these assumptions.
>
> Sure.  But, honestly, who cares?  Riyadh Solar Time was so
> off-the-wall that even the Saudis gave up on it 25 years ago (after a
> miserable 3-year experiment with it).  "Practicality beats purity".
>
>> [eliding a more-general view of what time zones "really" are]
>
> I'm not eliding it because I disagree with it, but because time zones
> are political constructions.  "The math" we make up may or may not be
> good enough to deal with all future political abominations; for
> example:
>
>> ...
>> This assumption would be violated if, for example, some jurisdiction
>> decided to fall back two hours by falling back one hour and then
>> immediately falling back a second hour.  I recommend the overthrow
>> of any such jurisdiction and its (annexation by the Netherlands)[3].
>
> That's not objectively any more bizarre than Riyadh Solar Time.
> Although, if I've lived longer than you, I may be more wary about the
> creative stupidity of political schemes ;-)
>
>
>> ... (Lennart, I think this third assumption is the important part of your "no
>> changes within 48 hours of each other" assumption,
>
> The "48 hours" bit came from Alexander.  I'm personally unclear on
> what Lennart's problems are.
>
>> ...
>> All of these computations can be accomplished by searches of ordered lists
>> and applications of $fromlocal_i$.
>
> Do you have real-world use cases in mind beyond supporting
> long-abandoned Riyadh Solar time?
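
Whatever the use case, the "searches of ordered lists" idea quoted above
is easy to sketch.  The transition data below is made up (it roughly
mimics US/Eastern in 2015) and the helper name is invented; the point is
only that mapping a local reading back to UTC against a sorted
transition list yields zero, one, or two candidates:

    from datetime import datetime, timedelta

    # (UTC instant of the transition, offset before it, offset after it)
    TRANSITIONS = [
        (datetime(2015, 3, 8, 7, 0),  timedelta(hours=-5), timedelta(hours=-4)),
        (datetime(2015, 11, 1, 6, 0), timedelta(hours=-4), timedelta(hours=-5)),
    ]

    def candidate_utcs(local):
        """Return every UTC instant t whose local reading equals `local`."""
        # Turn the transition list into half-open segments of constant offset.
        segments, start = [], datetime.min
        for when, before, after in TRANSITIONS:
            segments.append((start, when, before))
            start = when
        segments.append((start, datetime.max, after))

        hits = []
        for seg_start, seg_end, offset in segments:
            t = local - offset              # the only possible t in this segment
            if seg_start <= t < seg_end:
                hits.append(t)
        return hits

    print(candidate_utcs(datetime(2015, 7, 1, 12, 0)))   # one answer (normal case)
    print(candidate_utcs(datetime(2015, 11, 1, 1, 30)))  # two answers (fall-back)
    print(candidate_utcs(datetime(2015, 3, 8, 2, 30)))   # no answers (spring gap)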
>
>> ...
>> With this perspective, arithmetic becomes "translate to UTC, operate,
>> translate back", which is as it should be.
>
> There _was_ a POV in the datetime design about that:  no, that's not
> how it should be.  Blame Guido ;-)  If I add, say, 24 hours to noon
> today, I want to get noon tomorrow, and couldn't care less whether DST
> started or stopped (or any other political adjustment was made) in
> between.  For that reason, it was wholly intentional that datetime +
> timedelta treats datetime as "naive".  If that's not what someone
> wants, fine, but then they don't want Python's datetime arithmetic.
> BTW, there's no implication that they're "wrong" for wanting something
> different; what would be wrong is insisting that datetime's POV is
> "wrong".  Both views are valid and useful, depending on the needs of
> the application.  One had to be picked as the built-in behavior, and
> "naive" won.
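
A sketch of the contrast being drawn here, with zone data from the
third-party pytz package (the dates and zone are only an example, not
from this thread):

    from datetime import datetime, timedelta
    import pytz

    eastern = pytz.timezone('US/Eastern')

    # Wall-clock ("naive") arithmetic: noon plus 24 hours is noon tomorrow,
    # regardless of what happened to the clocks overnight.
    noon = datetime(2015, 3, 7, 12, 0)   # naive local time, day before spring-forward
    print(noon + timedelta(hours=24))    # 2015-03-08 12:00:00

    # Elapsed-time arithmetic: convert to UTC, add, convert back.  The result
    # reads 13:00 on the local clock because one wall-clock hour was skipped.
    noon_utc = eastern.localize(noon).astimezone(pytz.utc)
    print((noon_utc + timedelta(hours=24)).astimezone(eastern))
    # 2015-03-08 13:00:00-04:00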
>
>> ...
>> But IIUC what Lennart is complaining about
>
> I don't, and I wish he would be more explicit about what "the
> problem(s)" is(are).
>
>> is the fact that the DST flag isn't part of and can't be embedded into
>> a local time, so it's impossible to fold the second parameter to
>> $fromlocal$ into $t$.  Without that, a local time isn't rich enough to
>> designate a single point in time and the whole edifice breaks.
>
> You can blame Guido for that too ;-) , but in this case I disagree(d)
> with him:  Guido was overly (IMO) an

Re: [Python-Dev] Status on PEP-431 Timezones

2015-07-24 Thread Tim Peters
[Tim]
> Sure.  But, honestly, who cares?  Riyadh Solar Time was so
> off-the-wall that even the Saudis gave up on it 25 years ago (after a
> miserable 3-year experiment with it).  "Practicality beats purity".

Heh.  It's even sillier than that - the Saudis never used "Riyadh
Solar Time", and it's been removed from release 2015e of the tz
database:

https://www.ietf.org/timezones/data/NEWS
Release 2015e - 2015-06-13 10:56:02 -0700
...
The files solar87, solar88, and solar89 are no longer distributed.
They were a negative experiment - that is, a demonstration that
tz data can represent solar time only with some difficulty and error.
Their presence in the distribution caused confusion, as Riyadh
civil time was generally not solar time in those years.

Looking back, Paul Eggert explained more in 2013, but it took this
long for the patch to land:

http://comments.gmane.org/gmane.comp.time.tz/7717
> did Saudi Arabia really use this as clock time?

Not as far as I know, for civil time.  There was some use
for religious purposes but it didn't use the approximation
in those files.

These files probably cause more confusion than they're worth,
so I'll propose a couple of patches to remove them, in two followup
emails.  I haven't pushed these patches to the experimental
github version.

The position of the sun is vital to establishing prayer times in
Islam, but that's got little to do with civil time in Islamic
countries.  And Olson didn't take his "Riyadh Solar Time" rules from
the Saudis, he made up the times himself:  "Times were computed using
formulas in the U.S. Naval Observatory's Almanac for Computers
1987[89]".  The formulas only produced approximations, and then
rounded to 5-second boundaries because the tz data format didn't have
enough bits.

So, as a motivating example, it's hard to get less compelling:  Riyadh
Solar is a wholly artificial "time zone" made up by a time zone wonk
to demonstrate some limitations of the tz database he maintained.
Although I expect he could have done so just as effectively by writing
a brief note about it ;-)


Re: [Python-Dev] Status on PEP-431 Timezones

2015-07-24 Thread Tim Peters
[Lennart Regebro ]
> And I would want to remind everyone again that this is not a question
> of the problem being impossible. It's just really complex to get right
> in all cases, and that always having the UTC timestamp around gets rid
> of most of that complexity.

Could you please be explicit about what "the problem" is?  Everyone
here is guessing at what you think "the problem" is.


Re: [Python-Dev] Status on PEP-431 Timezones

2015-07-24 Thread Tim Peters
[Tim]
>> The formulas only produced approximations, and then
>> rounded to 5-second boundaries because the tz data format didn't have
>> enough bits.

[ISAAC J SCHWABACHER ]
> Little known fact: if you have a sub-minute-resolution UTC offset when a
> leap second hits, it rips open a hole in the space-time continuum and
> you find yourself in New Netherlands.

Tell me about it!  Last time that happened I had to grow stinking
tulips for 3 years to get enough money to sail back home.  I'll never
use a sub-minute-resolution UTC offset again ;-)


Re: [Python-Dev] Status on PEP-431 Timezones

2015-07-24 Thread ISAAC J SCHWABACHER
> From: Tim Peters 
> Sent: Saturday, July 25, 2015 00:14
> To: ISAAC J SCHWABACHER
> Cc: Alexander Belopolsky; Lennart Regebro; Python-Dev
> Subject: Re: [Python-Dev] Status on PEP-431 Timezones

> [Tim]
> >> The formulas only produced approximations, and then
> >> rounded to 5-second boundaries because the tz data format didn't have
> >> enough bits.

> [ISAAC J SCHWABACHER ]
> > Little known fact: if you have a sub-minute-resolution UTC offset when a
> > leap second hits, it rips open a hole in the space-time continuum and
> > you find yourself in New Netherlands.

> Tell me about it!  Last time that happened I had to grow stinking
> tulips for 3 years to get enough money to sail back home.  I'll never
> use a sub-minute-resolution UTC offset again ;-)

I meant this one: https://what-if.xkcd.com/54/ :)

ijs


Re: [Python-Dev] Status on PEP-431 Timezones

2015-07-24 Thread ISAAC J SCHWABACHER
> The formulas only produced approximations, and then
> rounded to 5-second boundaries because the tz data format didn't have
> enough bits.

Little known fact: if you have a sub-minute-resolution UTC offset when a leap 
second hits, it rips open a hole in the space-time continuum and you find 
yourself in New Netherlands.

ijs




Re: [Python-Dev] Status on PEP-431 Timezones

2015-07-24 Thread Lennart Regebro
On Sat, Jul 25, 2015 at 7:12 AM, Tim Peters  wrote:
> [Lennart Regebro ]
>> And I would want to remind everyone again that this is not a question
>> of the problem being impossible. It's just really complex to get right
>> in all cases, and that always having the UTC timestamp around gets rid
>> of most of that complexity.
>
> Could you please be explicit about what "the problem" is?  Everyone
> here is guessing at what you think "the problem" is.

The problem is that it is exceedingly complicated to get all the
calculations back and forth between local time and UTC to be correct
at all times and for all cases. It really doesn't get more specific
than that. I don't remember which exact problem it was that made me
decide that this was not the correct solution and that we should use
UTC internally, but I don't think that matters, because I'm also sure
that it was not the last such case, as I was far from the end of
adding test cases.

Once again I'm sure it's not impossible to somehow come up with an
implementation and an API that can do this based on local time, but
once again I am of the opinion that it is the wrong thing to do. We
should switch to using UTC internally, because that will make
everything so much simpler.

I am in no way against other people implementing this PEP, but I think
you will end up with very complex code that will be hard to maintain.

There really is a reason every other datetime implementation I know of
uses UTC internally, and there really is a reason why everyone always
recommends storing datetimes in UTC, with the time zone or offset stored
separately.

//Lennart
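
As a small sketch of the storage convention Lennart is pointing at
(pytz assumed; the record layout and zone are illustrative, not from
this thread): persist the instant in UTC plus the zone name, and only
convert at the edges:

    from datetime import datetime
    import pytz

    # What gets persisted: the instant in UTC and the user's IANA zone name.
    record = {
        'when_utc': datetime(2015, 7, 25, 5, 12, tzinfo=pytz.utc),
        'zone': 'Europe/Warsaw',
    }

    # Only for display (or when parsing user input) do we touch local time.
    local = record['when_utc'].astimezone(pytz.timezone(record['zone']))
    print(local)   # 2015-07-25 07:12:00+02:00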