Dennis Sweeney added the comment:
Since this isn't quite related to the original issue, I opened bpo-45806 to
discuss.
Dennis Sweeney added the comment:
I got a segfault in a similar location:
static PyObject *
offer_suggestions_for_name_error(PyNameErrorObject *exc)
{
    PyObject *name = exc->name; // borrowed reference
    PyTracebackObject *traceback = (PyTracebackObject *) exc->traceback; // borrowed reference
Dennis Sweeney added the comment:
Here's a shorter reproducer, not involving unittest:
import sys
try:
    aab
except:
    exc_type, exc_value, tb = sys.exc_info()
    exc_value.with_traceback(None)
    raise ZeroDivisionError()
Dennis Sweeney added the comment:
Even shorter reproducer:
-
try:
    aab
except BaseException as E:
    E.with_traceback(None)
    raise ZeroDivisionError()
-
Bisection points to the initial implementation of suggestions.c
Change by Dennis Sweeney :
--
keywords: +patch
pull_requests: +27833
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/29590
Dennis Sweeney added the comment:
https://github.com/python/cpython/pull/29605 was just opened as a backport
--
nosy: +Dennis Sweeney
Dennis Sweeney added the comment:
This snippet occurs a couple of times in ceval.c (BINARY_SUBSCR_GETITEM and
CALL_FUNCTION_PY_SIMPLE):
new_frame->previous = frame;
frame = cframe.current_frame = new_frame;
new_frame->depth = frame->depth + 1;   // note: `frame` was just reassigned to new_frame, so this reads new_frame's own depth rather than the caller's
Dennis Sweeney added the comment:
I got a crash on Windows in Objects/genobject.c:
void
_PyGen_Finalize(PyObject *self)
{
    PyGenObject *gen = (PyGenObject *)self;
    PyObject *res = NULL;
    PyObject *error_type, *error_value, *error_traceback;
    if (gen->gi_xframe == N
Dennis Sweeney added the comment:
I think the import is irrelevant (luckily). This still crashes:
async def f():
    pass

frame = f().cr_frame
frame.clear()
Dennis Sweeney added the comment:
Even without garbage-collecting the coroutine, we get a failed assertion in
debug mode (but no crash with the assertion removed):
Python 3.11.0a2+ (heads/main:c8c21bdd19, Nov 21 2021, 13:58:01) [MSC v.1929 64
bit (AMD64)] on win32
Type "help",
New submission from Dennis Sweeney :
Some specialization statistics:
https://gist.github.com/sweeneyde/49cc3a9d074d56cf095cb0a42d13d7a4
Including 3 opcodes: COMPARE_OP_INT, COMPARE_OP_FLOAT, and COMPARE_OP_STR
(equality only) seems to give pretty good specialization numbers, better than
Change by Dennis Sweeney :
--
keywords: +patch
pull_requests: +27972
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/29734
Dennis Sweeney added the comment:
I think the PR fixed one case, but the other case (when coro is kept around)
still fails an assertion:
Python 3.11.0a2+ (heads/main:734ed35383, Nov 29 2021, 19:29:25) [MSC v.1929 64
bit (AMD64)] on win32
Type "help", "copyright", "
Dennis Sweeney added the comment:
This is consistent with the docstrings of the methods:
-
>>> help(Counter.__iadd__)
Help on function __iadd__ in module collections:
__iadd__(self, other)
Inplace add from another counter, keeping only positive counts.
Change by Dennis Sweeney :
--
resolution: -> fixed
stage: patch review -> resolved
status: open -> closed
New submission from Dennis Sweeney :
I'd love to know if there is something I'm doing wrong, but with recent
deepfreeze changes, my Visual Studio 2019 setup has been having trouble building
things. It seems PCBuild/build.bat works perfectly and all tests pass and
everything, but usi
Dennis Sweeney added the comment:
I think I've been getting similar issues for a little while that I've been able
to work around until now. If I git checkout 3.10 and then hit the "Local
Windows Debugger" button, I get this:
Build started...
1>-- Build start
Dennis Sweeney added the comment:
Sure enough, if I go to "Choose Default Apps by File Type" in Windows settings
and associate ".py" --> "Notepad", then Visual Studio keeps opening Notepad
windows during build.
Can the Visual Studio build configurations
Dennis Sweeney added the comment:
Actually, it seems $(PythonForBuild) is already used everywhere; it's just that
$(PythonForBuild) is giving the empty string when VS uses it.
On the other hand, PCBuild/build.bat sets $(PythonForBuild) by calling out to
find_python.bat, and that works
Dennis Sweeney added the comment:
bpo-46009 is about the same behavior change.
Dennis Sweeney added the comment:
I believe https://bugs.python.org/issue44376 added a special case for 2nd and
3rd powers, and that's the 3.10/3.11 difference in the speed of x**2, not ceval
optimizations.
--
nosy: +Dennis Sweeney
New submission from Dennis Sweeney :
Although the implementation of the heapq.merge function uses an underlying heap
structure, its behavior centers on iterators. For this reason, I believe there
should either be an alias to this function in the itertools module or at least
a recipe in the
Dennis Sweeney added the comment:
The following seems like a short, readable recipe for itertools.
--
Added file: https://bugs.python.org/file48748/merge_recipe.py
Dennis Sweeney added the comment:
Disregard merge_recipe.py: it would skip over a value that had already been
retrieved from the iterator when the loop finished.
Change by Dennis Sweeney :
--
keywords: +patch
pull_requests: +16903
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/17422
Change by Dennis Sweeney :
--
pull_requests: +17174
pull_request: https://github.com/python/cpython/pull/17729
Dennis Sweeney added the comment:
PR 17729 is a C implementation of a non-recursive "flattening" of the
recursive-lazy-mergesort algorithm into a tournament whose state is a tree of
losers of comparisons.
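
For orientation, here is a rough pure-Python sketch of the recursive lazy-mergesort idea that the PR flattens. It is illustrative only, not the code from PR 17729, and the helper names are made up.

def _merge2(a, b):
    # Lazily merge two sorted iterators, preferring `a` on ties (stable).
    a, b = iter(a), iter(b)
    try:
        x = next(a)
    except StopIteration:
        yield from b
        return
    for y in b:
        while x <= y:
            yield x
            try:
                x = next(a)
            except StopIteration:
                yield y
                yield from b
                return
        yield y
    yield x
    yield from a

def merge(*iterables):
    # Split the inputs in half and recursively merge the two halves, so each
    # item rises through about log2(k) two-way merges.
    n = len(iterables)
    if n == 0:
        return iter(())
    if n == 1:
        return iter(iterables[0])
    return _merge2(merge(*iterables[:n // 2]), merge(*iterables[n // 2:]))

print(list(merge([1, 4, 7], [2, 5, 8], [3, 6, 9])))   # [1, 2, 3, 4, 5, 6, 7, 8, 9]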
New submission from Dennis Sweeney :
Similar to https://bugs.python.org/issue39453, but with deques:
Python 3.9.0a3+:
>>> from collections import deque
>>> class A:
...     def __eq__(self, other):
...         L.clear()
...         return NotImplemented
...
>>> L =
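
The quoted example is cut off at ``L =``; a reconstructed minimal sketch of the same class of problem (mutating the deque from __eq__ during a membership test) would look something like the following. This is illustrative, not necessarily the exact original.

from collections import deque

class A:
    def __eq__(self, other):
        L.clear()                 # mutate the deque while it is being searched
        return NotImplemented

L = deque([A()])
try:
    print(3 in L)                 # the membership test calls A.__eq__, which empties L mid-iteration
except RuntimeError as e:
    print(e)                      # a fixed deque detects or tolerates the mutation instead of crashing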
Change by Dennis Sweeney :
--
keywords: +patch
pull_requests: +17796
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/18421
Dennis Sweeney added the comment:
Should there be a similar generic test case in test.seq_test?
Change by Dennis Sweeney :
--
pull_requests: +17803
pull_request: https://github.com/python/cpython/pull/18427
Dennis Sweeney added the comment:
I think the behavior of gcd() == 0 is correct, but it should be documented,
because it isn't completely obvious.
Arguments for gcd() == 0:
- Preserves the invariant gcd(itertools.chain(iterables)) ==
gcd(itertools.starmap(gcd, iterables)) in the case
Dennis Sweeney added the comment:
Correction: gcd(itertools.chain(iterables)) == gcd(*map(gcd, iterables))
Dennis Sweeney added the comment:
Correction correction: returning zero preserves the invariant below.
from math import gcd as GCD
from functools import reduce
from itertools import starmap, chain

def gcd(*args):
    return reduce(GCD, args, 0)

iterables = [[10, 20, 30
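
Written out in full (the lists below are illustrative, since the message above is truncated), the invariant holds for empty iterables precisely because gcd() returns 0:

from math import gcd as GCD
from functools import reduce
from itertools import chain

def gcd(*args):
    # With no arguments this returns 0, the identity element for gcd.
    return reduce(GCD, args, 0)

iterables = [[10, 20, 30], [15, 25], []]
flat = gcd(*chain.from_iterable(iterables))       # gcd of all the values at once
nested = gcd(*(gcd(*it) for it in iterables))     # gcd of the per-iterable gcds
assert flat == nested == 5                        # the empty iterable contributes gcd() == 0, and gcd(x, 0) == x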
New submission from Dennis Sweeney :
Should something like the following go in the standard library, most likely in
the math module? I know I had to use such a thing before pow(a, -1, b) worked,
but Bezout is more general. And many of the easy stackoverflow implementations
of CRT congruence
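
For context, the helper under discussion is usually some flavor of the extended Euclidean algorithm; a minimal sketch (not a proposed stdlib API):

def extended_gcd(a, b):
    """Return (g, x, y) such that a*x + b*y == g == gcd(a, b)."""
    old_r, r = a, b
    old_x, x = 1, 0
    old_y, y = 0, 1
    while r:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_x, x = x, old_x - q * x
        old_y, y = y, old_y - q * y
    return old_r, old_x, old_y

# Example: a modular inverse, which pow(a, -1, m) now computes directly.
g, x, _ = extended_gcd(3, 11)
assert g == 1 and (3 * x) % 11 == 1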
New submission from Dennis Sweeney :
The following tiny change:
diff --git a/Objects/listobject.c b/Objects/listobject.c
index 3c39c6444b..3ac03b71d0 100644
--- a/Objects/listobject.c
+++ b/Objects/listobject.c
@@ -2643,8 +2643,7 @@ list_richcompare(PyObject *v, PyObject *w, int op
Change by Dennis Sweeney :
--
keywords: +patch
pull_requests: +18004
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/18638
Dennis Sweeney added the comment:
I made the requested changes to reflect that this is for code cleanliness
rather than strictly for performance.
However, it appears that Visual Studio on Windows 10 was not doing the
optimization one might expect. In particular, here is the disassembly
Dennis Sweeney added the comment:
> Hmm, Is this build on release mode?
Yes--in debug mode, each Py_INCREF is these 8 instructions:
78BD071E mov ecx,dword ptr [_Py_RefTotal (79039700h)]
78BD0724 add ecx,1
78BD0727 mov dword
Dennis Sweeney added the comment:
> Debug mode is not meaningful.
> Visual Studio will optimize fully on release mode.
Sorry if I wasn't clear--the original assembly difference I posted in
(https://bugs.python.org/msg362665) was indeed using the "release" build
c
New submission from Dennis Sweeney :
Following discussion here (
https://mail.python.org/archives/list/python-id...@python.org/thread/RJARZSUKCXRJIP42Z2YBBAEN5XA7KEC3/
), there is a proposal to add new methods str.cutprefix and str.cutsuffix to
alleviate the common misuse of str.lstrip and
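
A pure-Python rendering of the proposed semantics (the feature eventually landed in 3.9 as str.removeprefix/str.removesuffix; the function names below are only for illustration):

def cutprefix(s: str, prefix: str) -> str:
    # Remove the prefix if present; otherwise return the string unchanged.
    return s[len(prefix):] if s.startswith(prefix) else s

def cutsuffix(s: str, suffix: str) -> str:
    # Remove the suffix if present; an empty suffix removes nothing.
    return s[:len(s) - len(suffix)] if suffix and s.endswith(suffix) else s

assert cutprefix("test_foo", "test_") == "foo"
assert cutprefix("foo", "test_") == "foo"      # unlike lstrip, no other characters are stripped
assert cutsuffix("foo.txt", ".txt") == "foo"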
Change by Dennis Sweeney :
--
keywords: +patch
pull_requests: +18292
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/18939
New submission from Dennis Sweeney :
It seems that `.join` methods typically return the type of the separator on
which they are called:
>>> bytearray(b" ").join([b"a", b"b"])
bytearray(b'a b')
>>> b" ".jo
Change by Dennis Sweeney :
--
pull_requests: +18304
pull_request: https://github.com/python/cpython/pull/18953
Dennis Sweeney added the comment:
This is not a duplicate: issue16397 concerned
" ".join([US("a"), US("b")])
while this one concerns the return value and acceptable parameters for
UserString.join().
Change by Dennis Sweeney :
--
resolution: duplicate ->
status: closed -> open
Change by Dennis Sweeney :
--
resolution: -> wont fix
stage: patch review -> resolved
status: open -> closed
Dennis Sweeney added the comment:
Yes:
>>> x = "A"*10**6
>>> x.cutprefix("B") is x
True
>>> x.cutprefix("") is x
True
>>> y = b"A"*10**6
>>> y.cutprefix(b"B
Dennis Sweeney added the comment:
First, as I posted at
https://github.com/python/cpython/pull/17729#issuecomment-571864662, there is a
theoretical advantage of fewer comparisons in all cases, and the new algorithm
would be especially dominant when one iterable keeps winning. (I'm giv
Dennis Sweeney added the comment:
The existing Python implementation is benefiting from the C accelerators for
heapify and heapreplace. When forcing pure python using test.support, I get
these results:
.\python.bat -m pyperf timeit -s "from random import random; from collections
i
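
For reference, "forcing pure python using test.support" typically means blocking the _heapq accelerator with import_fresh_module. A sketch, assuming the test package is installed (the helper moved under test.support.import_helper in newer releases):

try:
    from test.support.import_helper import import_fresh_module   # 3.10+
except ImportError:
    from test.support import import_fresh_module                 # older releases

import heapq
py_heapq = import_fresh_module("heapq", blocked=["_heapq"])
print(heapq.heapify)      # the C-accelerated heapify (a built-in function)
print(py_heapq.heapify)   # the pure-Python fallback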
Dennis Sweeney added the comment:
My suspicion about PyPy was confirmed (my PyPy here is Python 3.6.1
(784b254d6699, Apr 16 2019, 12:10:48) [PyPy 7.1.1-beta0 with MSC v.1910 32 bit]
on win32). In what follows, "heapq2.py" had exactly the `class merge` Python
implementation fro
Dennis Sweeney added the comment:
If no one has started, I can draft such a PEP.
Dennis Sweeney added the comment:
Here is a draft PEP -- I believe it needs a Core Developer sponsor now?
--
Added file: https://bugs.python.org/file48983/pep-.rst
Change by Dennis Sweeney :
Added file: https://bugs.python.org/file48989/pep-.rst
Change by Dennis Sweeney :
Removed file: https://bugs.python.org/file48983/pep-.rst
Dennis Sweeney added the comment:
https://github.com/python/peps/pull/1332
Dennis Sweeney added the comment:
Just posted it.
Dennis Sweeney added the comment:
I think this question is about types in C, apart from any Python C API.
According to https://docs.python.org/3/c-api/arg.html#numbers, the specifier is
c: (bytes or bytearray of length 1) -> [char]
so you should be able to write to a C variable of t
Dennis Sweeney added the comment:
The trouble is that itertools.product accepts iterators, and there is no
guaranteed way of "restarting" an arbitrary iterator in Python. Consider:
>>> a = iter([1,2,3])
>>> b = iter([4,5,6])
>>> next(a)
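
To finish the truncated example (the values are illustrative): once part of an iterator has been consumed, product() only ever sees what is left, and there is no general way to rewind it.

import itertools

a = iter([1, 2, 3])
b = iter([4, 5, 6])
next(a)                                  # a is now partially consumed
print(list(itertools.product(a, b)))     # [(2, 4), (2, 5), (2, 6), (3, 4), (3, 5), (3, 6)]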
Change by Dennis Sweeney :
--
versions: +Python 3.9 -Python 3.7
New submission from Dennis Sweeney :
I get the following intermittent failure when running the tests on Master on
Windows 10.
=
=
=
PS C:\...\cpython> .\python.bat -m unittest
Dennis Sweeney added the comment:
I disabled indexing and antivirus and I didn't see anything else obvious that
would access the files, but I'm probably missing something -- I get the same
intermittent failure when I build from the source at the 3.8.2 release, but not
on a cop
Change by Dennis Sweeney :
--
resolution: -> works for me
stage: -> resolved
status: open -> closed
Dennis Sweeney added the comment:
I replicated this behavior. This looks like the relevant loop in pystrhex.c:
for (i=j=0; i < arglen; ++i) {
    assert((j + 1) < resultlen);
    unsigned char c;
    c = (argbuf[i] >> 4) & 0x0f;
    retbuf[j++] = Py_hexdigi
Change by Dennis Sweeney :
--
keywords: +patch
pull_requests: +18930
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/19594
Dennis Sweeney added the comment:
== Master ==
.\python.bat -m pyperf timeit -s "import random, math;
data=random.getrandbits(8*10_000_000).to_bytes(10_000_000, 'big')" "temp =
data.hex(); '\n'.join(temp[n:n+128] for n in range(0, len(temp
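
The second half of the benchmark is cut off above; presumably the comparison is against the separator support that bytes.hex() has had since 3.8, e.g.:

data = bytes(range(8))
print(data.hex())           # '0001020304050607'
print(data.hex(" ", 4))     # '00010203 04050607'  (positive group sizes count from the right)
print(data.hex("_"))        # '00_01_02_03_04_05_06_07'  (default is one byte per group)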
Dennis Sweeney added the comment:
I'm personally -0 for underscores -- they might slightly improve readability of
the function name in isolation but may also add confusion about which methods
have underscores. Only one out of the 45 non-dunder str methods has an
underscore righ
Dennis Sweeney added the comment:
Oops -- I now see the message on Python-Dev.
Dennis Sweeney added the comment:
There's a failure here:
https://buildbot.python.org/all/#/builders/64/builds/656
Failed subtests:
test_killed_child -
test.test_concurrent_futures.ProcessPoolSpawnProcessPoolExecutorTest
Traceback (most recent call
Dennis Sweeney added the comment:
> `Mapping.__reversed__` exists
While ``'__reversed__' in dir(Mapping)`` is true, that unfortunately does not
mean that it is a real callable method:
from collections.abc import Mapping
class Map(Mapping):
def __geti
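
Completing the illustration (the quoted class is cut off above; this is a minimal stand-in):

from collections.abc import Mapping

print('__reversed__' in dir(Mapping))   # True
print(Mapping.__reversed__)             # None: deliberately set to None to block reversed()

class Map(Mapping):
    def __getitem__(self, key): raise KeyError(key)
    def __iter__(self): return iter(())
    def __len__(self): return 0

try:
    reversed(Map())
except TypeError as e:
    print(e)                             # e.g. "'Map' object is not reversible"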
Dennis Sweeney added the comment:
Thanks for reaching out! This is about test failures, not problems with the
installation process, correct? I took a look at the failures:
==
ERROR: test_add_file_after_2107
Change by Dennis Sweeney :
--
components: +Tests -Installation
title: Errors during make test python 3.8.2 -> OS-related test failures on
Linux in Python 3.8.2
type: compile error -> behavior
New submission from Dennis Sweeney :
Since bytes.hex() was added in 3.5, we should be able to make the following
change:
diff --git a/Lib/secrets.py b/Lib/secrets.py
index a546efbdd4..1dd8629f52 100644
--- a/Lib/secrets.py
+++ b/Lib/secrets.py
@@ -13,7 +13,6 @@ __all__
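
A hedged sketch of the before/after being proposed (the real code builds on secrets.token_bytes; the function bodies here paraphrase it):

import binascii
from secrets import token_bytes

def token_hex_old(nbytes=None):
    # The existing implementation goes through binascii.
    return binascii.hexlify(token_bytes(nbytes)).decode('ascii')

def token_hex_new(nbytes=None):
    # bytes.hex() (available since 3.5) gives the same result directly.
    return token_bytes(nbytes).hex()

assert len(token_hex_new(16)) == 32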
Change by Dennis Sweeney :
--
keywords: +patch
pull_requests: +19070
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/19749
Change by Dennis Sweeney :
--
title: Small Refactoring: Use the bytes.hex() in secrets.token_hex() -> Small
Refactoring: Use bytes.hex() in secrets.token_hex()
Dennis Sweeney added the comment:
git bisect says that this was fixed here:
commit b94dbd7ac34dc0c79512656eb17f6f07e09fca7a
Author: Pablo Galindo
Date: Mon Apr 27 18:35:58 2020 +0100
bpo-40334: Support PyPARSE_DONT_IMPLY_DEDENT in the new parser (GH-19736)
--
nosy: +Dennis
Change by Dennis Sweeney :
--
keywords: +patch
nosy: +Dennis Sweeney
nosy_count: 1.0 -> 2.0
pull_requests: +19171
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/19855
Dennis Sweeney added the comment:
I can submit a PR. Just making sure I understand, is this essentially the
desired behavior change?
import weakref
import functools
if 0:
    from test.support import import_fresh_module
    functools = import_fresh_module('functools', blocked=['
Change by Dennis Sweeney :
--
keywords: +patch
pull_requests: +19253
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/19938
Dennis Sweeney added the comment:
For some more ideas for features or APIs, you could look at:
https://docs.sympy.org/latest/modules/ntheory.html or
http://doc.sagemath.org/html/en/reference/rings_standard/sage/arith/misc.html
for an absolute upper bound.
If there's to be a minimal n
Dennis Sweeney added the comment:
I think the behavior is consistent between tuple and an empty subclass:
>>> from typing import List
>>> class T(tuple):
...     pass
== Empty tuple/T ==
>>> List[()]
Traceback (most recent call last):
Change by Dennis Sweeney :
--
nosy: +gvanrossum, levkivskyi
Dennis Sweeney added the comment:
As Serhiy suggested, keeping the algorithm but moving the Python implementation
to be a generator again (as I recently changed in PR 18427) gives another
performance boost (although this unrolling is many lines of code).
Timing the C implementation
Dennis Sweeney added the comment:
The attached recursive_merge.py should be much less ugly and still somewhat
performant.
It should be the same algorithm as that PR, just written recursively rather
than iteratively.
I got some text files from http://www.gwicks.net/dictionaries.htm and
Change by Dennis Sweeney :
Removed file: https://bugs.python.org/file48747/iter_merge.py
Change by Dennis Sweeney :
Removed file: https://bugs.python.org/file49156/recursive_merge.py
Change by Dennis Sweeney :
Removed file: https://bugs.python.org/file48748/merge_recipe.py
Change by Dennis Sweeney :
Added file: https://bugs.python.org/file49164/tournament_heap.py
Change by Dennis Sweeney :
Added file: https://bugs.python.org/file49165/losers.py
Change by Dennis Sweeney :
Added file: https://bugs.python.org/file49166/recursive_merge.py
Dennis Sweeney added the comment:
It seems to me that the code sprawl mostly comes from the separate handling of
the four keyed/unkeyed and forward/reverse cases, which as far as I can tell
requires a branch in the innermost loop if not unrolled into separate cases. I
think
Change by Dennis Sweeney :
Removed file: https://bugs.python.org/file49165/losers.py
Change by Dennis Sweeney :
Added file: https://bugs.python.org/file49167/losers.py
Dennis Sweeney added the comment:
I mostly like new_merge.py too, especially the dynamic reduction of the tree.
However, it looks like ``list(merge([2],[1],[1]))`` currently fails, and I
think what's missing is the following in the sibling-promotion:
+ if sibling.left i
Dennis Sweeney added the comment:
This might be the expected behavior. See https://bugs.python.org/issue25222
If you already caught a RecursionError and you keep recursing anyway, once you
go 50 levels beyond sys.getrecursionlimit(), the interpreter crashes regardless
of what is `except`ed
Dennis Sweeney added the comment:
sys.getrecursionlimit() returns whatever was passed to the most recent call of
sys.setrecursionlimit(...), with some system default (here 1000).
Catching a RecursionError might be fine sometimes, but the issue is that
Program 1 catches a RecursionError *and
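
The safe pattern is to catch the RecursionError and stop recursing, rather than continuing deeper inside the handler; for example:

import sys

def depth(n=0):
    try:
        return depth(n + 1)
    except RecursionError:
        return n          # stop here; recursing further inside the handler is what crashes

print(sys.getrecursionlimit())   # the system default, typically 1000
print(depth())                   # a value somewhat below the limit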
Dennis Sweeney added the comment:
Maybe you're looking for re.fullmatch:
https://docs.python.org/3/library/re.html#re.fullmatch
--
nosy: +Dennis Sweeney
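
For comparison (illustrative):

import re

print(re.match(r"\d+", "123abc"))       # matches: match() only anchors at the start
print(re.fullmatch(r"\d+", "123abc"))   # None: the whole string must match
print(re.fullmatch(r"\d+", "123"))      # a match object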
Dennis Sweeney added the comment:
I'll add that for 98% of the use cases of a linked list (where you just want
fast access at the ends), you can use a `collections.deque` instead, and it
will be faster (fewer dereferences) and more memory-efficient (fewer pointers
to store).
I
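
A quick illustration of the deque operations that cover the usual linked-list use case (O(1) appends and pops at both ends):

from collections import deque

d = deque([2, 3, 4])
d.appendleft(1)                   # O(1) at the left end
d.append(5)                       # O(1) at the right end
print(d.popleft(), d.pop(), d)    # 1 5 deque([2, 3, 4])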
Dennis Sweeney added the comment:
Why not just fix bogus_code_obj.py?
Something like this (using the replace method) would make it more future-proof
to similar changes in the code object constructor signature (and be more
readable!):
import dis
POP_TOP = dis.opmap['POP_TOP'
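
The quoted snippet is truncated; here is a hedged sketch of the suggested CodeType.replace() approach (names and the opcode choice are illustrative, and the bogus code object is built but deliberately not executed):

import dis

def f():
    pass

POP_TOP = dis.opmap['POP_TOP']
# Derive a deliberately bogus code object from a real one; replace() keeps the
# rest of the constructor arguments in sync with whatever Python version runs this.
bogus = f.__code__.replace(co_code=bytes([POP_TOP, 0] * 7))
print(bogus.co_code)
# exec(bogus) would pop from an empty stack, which is the kind of crash the script is meant to demonstrate.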
Dennis Sweeney added the comment:
For convenience, attached is a quick and dirty Tkinter GUI that lets you step
through the Crochemore/Perrin Algorithm on your choice of inputs, just for
play/discovery.
A good illustration of the memory for periodic needles can be found by testing