[issue9870] compile and nested scopes

2010-09-16 Thread Sergey

New submission from Sergey :

See the attached tmp1.py.
It is very simple, but it doesn't work: it raises
NameError: global name 'arg' is not defined
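
The attached tmp1.py is not reproduced in the archive. Below is a minimal
sketch (an assumption about what the script does, inferred only from the error
message; the names are made up) of code that produces exactly this error: when
compiled code is exec'd with two different dicts for globals and locals,
top-level assignments land in the locals dict, but a function defined by that
code looks its free names up in globals.

    src = ("arg = 42\n"
           "def use_arg():\n"
           "    return arg\n"        # looked up in globals, never set there
           "print(use_arg())\n")
    code = compile(src, "tmp1.py", "exec")
    exec(code, {}, {})               # NameError: global name 'arg' is not defined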

--
components: None
files: tmp1.py
messages: 116515
nosy: webcubator
priority: normal
severity: normal
status: open
title: compile and nested scopes
type: behavior
versions: Python 2.7, Python 3.1
Added file: http://bugs.python.org/file18900/tmp1.py




[issue32632] Mock does not create deepcopy of mutable args

2018-01-22 Thread Sergey

New submission from Sergey :

MagicMock allows checking the parameters of calls by using "assert_has_calls".
However, it fails if an argument has a mutable type and was changed in place
before the second call.

An example is provided in the attached file.

In "func1" the value in "data" changes on each iteration, and as a result
call_args_list contains two identical calls.
In "func2" the variable "data" is generated by the function "get_dict", and in
this case call_args_list contains the correct values.

Obviously this happens because class _Call
(https://github.com/python/cpython/blob/3.5/Lib/unittest/mock.py#L1929) does
not create a deepcopy of the call args/kwargs.

Would it be correct to add deepcopy logic to mock.py? Or maybe it's wrong to
use logic like in "func1"?
I see only one disadvantage of using deepcopy: it will become slower.
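
A hedged reconstruction of the scenario (the attached test.py is not shown
here; the names func1, func2 and get_dict follow the description above, their
bodies are assumptions):

    from unittest import mock

    def get_dict(i):
        return {'value': i}

    def func1(consumer):
        data = {'value': 0}
        for i in range(2):
            data['value'] = i      # same dict object, mutated in place
            consumer(data)

    def func2(consumer):
        for i in range(2):
            consumer(get_dict(i))  # fresh dict for every call

    m1, m2 = mock.MagicMock(), mock.MagicMock()
    func1(m1)
    func2(m2)
    print(m1.call_args_list)       # both recorded calls show {'value': 1}
    print(m2.call_args_list)       # records {'value': 0} and {'value': 1}
    # m1.assert_has_calls([mock.call({'value': 0}), mock.call({'value': 1})])
    # would therefore fail, even though func1 really made those calls.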

--
components: Tests
files: test.py
messages: 310476
nosy: michael.foord, skraynev
priority: normal
severity: normal
status: open
title: Mock does not create deepcopy of mutable args
versions: Python 3.5
Added file: https://bugs.python.org/file47401/test.py




[issue18305] [patch] Fast sum() for non-numbers

2013-06-25 Thread Sergey

New submission from Sergey:

Problem
===
Code:
  sum([[1,2,3]]*100, [])
takes forever to complete.

Suggestion
==
Patch the sum() function so that it does not create 100 copies of the result,
but creates just one. The attached patch does that.

Before patch:
$ ./python -mtimeit --setup="x=[[1,2,3]]*1" "sum(x,[])"
10 loops, best of 3: 915 msec per loop

After patch:
$ ./python -mtimeit --setup="x=[[1,2,3]]*1" "sum(x,[])"
1000 loops, best of 3: 469 usec per loop

That's roughly a 2000x speedup! :)

Details
===
Built-in sum function could look like this:
  def sum(seq, start = 0):
      for item in seq:
          start += item
      return start

But that would be bad, because in cases like:
  empty = []
  result = sum(list_of_lists, empty)
content of "empty" would be modified.
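
A quick illustration of that point (a sketch, not part of the patch):

    def iadd_sum(seq, start):
        for item in seq:
            start += item          # in-place: no copies, but mutates start
        return start

    empty = []
    result = iadd_sum([[1, 2], [3]], empty)
    print(empty)                   # [1, 2, 3] -- the caller's list was changed
    print(result is empty)         # True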

So instead it looks like this:
  def sum(seq, start = 0):
      for item in seq:
          start = start + item
      return start
It creates a copy of the entire partial result on EVERY "start + item".

While instead it could look like this:
  def sum(seq, start = 0):
      if seq:
          start = start + seq[0]   # the only copy of the partial result
          for item in seq[1:]:
              start += item
      return start
That is, create just ONE copy, and then keep adding to it in place.

That's what attached patch is doing.

An alternative is something like this:
  import copy
  def sum(seq, start = 0):
      start = copy.copy(start)
      for item in seq:
          start += item
      return start
But I'm not sure how to make a copy of an arbitrary object yet.

--
components: Interpreter Core
files: fastsum.patch
keywords: patch
messages: 191896
nosy: Sergey
priority: normal
severity: normal
status: open
title: [patch] Fast sum() for non-numbers
type: performance
versions: Python 2.7
Added file: http://bugs.python.org/file30705/fastsum.patch




[issue18305] [patch] Fast sum() for non-numbers

2013-06-28 Thread Sergey

Sergey added the comment:

> The issue of quadratic performance of sum(sequences, null_seq) is known

I hope it's never too late to fix some bugs... :)

> sum([[1,2,3]]*n, []) == [1,2,3]*n == list(chain.from_iterable([[1,2,3]]*n))

But if you already have a list of lists and you need to join the lists
together, you have only these two options:
1. sum(list_of_lists, [])
2. list(chain.from_iterable(list_of_lists))
And using sum() is much more obvious than using itertools, which most people
may not (and don't have to) even know about.
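
For example, both spellings give the same flattened result:

    from itertools import chain

    list_of_lists = [[1, 2], [3], [4, 5]]
    flat = [1, 2, 3, 4, 5]
    assert sum(list_of_lists, []) == flat
    assert list(chain.from_iterable(list_of_lists)) == flat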

When someone who is not a Python guru thinks about this, she would think: "so,
I'll just add the lists together, let's write a for-loop... Oh, wait, that's
what sum() does, it adds things, and Python is dynamically typed, so sum()
should work for everything". That's how I was thinking, and that's how most
people would think, I guess...

I was very surprised to find out about that bug.

> 1. People *will* move code that depends on the internal optimization to 
> pythons that do not have it.

It looks like this bug is CPython-specific; others (Jython, IronPython...)
don't have it, so people will move code that depends on the internal
optimization to other Pythons that DO have it. :)

> 2. It discourages people from carefully thinking about whether they actually 
> need a concrete list or merely the iterator for a virtual list.

Hm... Currently people can also pass an iterator to sum() or a list to
itertools. Nothing changes...

> I agree with Terry. CPython deliberately disallow use sum() with lists of 
> strings.

Isn't that exactly because of this bug? I mean, if this bug gets fixed, sum()
would be as fast as join(), or maybe even faster, right? So the string
restriction could be dropped later, but that would be a separate bug report.
Anyway, the bug is there not just for strings; it also happens for lists, or
for any other non-numeric objects that can be added.

PS: I was prepared for my patch not being accepted, and I'm actually thinking
about another way of doing it (I just don't know how to get a copy of an
arbitrary PyObject in C yet). But I thought the idea itself was great: finally
making sum() fast without any trade-offs, what could be better? The patch
works at least for 2.7, 3.3 and hg-tip, and can easily be ported to any other
version. I did not expect to get such a cold shoulder. :(

--
versions: +Python 2.7 -Python 3.4




[issue18305] [patch] Fast sum() for non-numbers

2013-07-02 Thread Sergey

Sergey added the comment:

> I don't know about IronPython, but Jython and PyPy have same behavior as 
> CPython.

Right. I was wrong, at least Jython and PyPy suffer from this bug too.

> I think that it is worthwhile to discuss the idea at first in the 
> Python-Ideas mailing list [1].

Ok, wrote to Python-Ideas, thank you.

--




[issue18305] [patch] Fast sum() for non-numbers

2013-07-04 Thread Sergey

Sergey added the comment:

This patch implements another solution to the same bug. Instead of changing
the default code, it adds a special-case optimization for lists, tuples and
strings, similar to the two special cases for numbers that are already there.

=== Lists ===
No patch:
  $ ./python -mtimeit --setup="x=[[1,2,3]]*1" "sum(x,[])"
  10 loops, best of 3: 885 msec per loop
fastsum.patch:
  $ ./python -mtimeit --setup="x=[[1,2,3]]*1" "sum(x,[])"
  1000 loops, best of 3: 524 usec per loop
fastsum-special.patch:
  $ ./python -mtimeit --setup="x=[[1,2,3]]*1" "sum(x,[])"
  1000 loops, best of 3: 298 usec per loop
Result: 3000 times faster.

=== Tuples ===
No patch:
  $ ./python -mtimeit --setup="x=[(1,2,3)]*1" "sum(x,())"
  10 loops, best of 3: 585 msec per loop
fastsum.patch:
  $ ./python -mtimeit --setup="x=[(1,2,3)]*1" "sum(x,())"
  10 loops, best of 3: 585 msec per loop
fastsum-special.patch:
  $ ./python -mtimeit --setup="x=[(1,2,3)]*1" "sum(x,())"
  1000 loops, best of 3: 536 usec per loop
Result: 1000 times faster.

=== Strings ===
No patch (just string check removed):
  $ ./python -mtimeit --setup="x=['abc']*10" "sum(x,'')"
  10 loops, best of 3: 1.52 sec per loop
fastsum.patch (+ string check removed):
  $ ./python -mtimeit --setup="x=['abc']*10" "sum(x,'')"
  10 loops, best of 3: 1.52 sec per loop
fastsum-special.patch
  $ ./python -mtimeit --setup="x=['abc']*10" "sum(x,'')"
  10 loops, best of 3: 27.8 msec per loop
join:
  $ ./python -mtimeit --setup="x=['abc']*10" "''.join(x)"
  1000 loops, best of 3: 1.66 msec per loop
Result: 50 times faster, but still constantly slower than join.

NB:
The string part of this patch is a proof of concept, just for tests, to
demonstrate that sum() can really be O(n) for strings. It was written for
Python 2.7.5, won't work for Python 3, and is not expected to be applied to
Python 2, as it changes the behavior of sum(), allowing it to sum strings.

But if the string part is stripped from the patch, it works for both Python 2
and Python 3, and does not change existing behavior, except for making sum()
faster.

--
Added file: http://bugs.python.org/file30769/fastsum-special.patch




[issue18305] [patch] Fast sum() for non-numbers

2013-07-11 Thread Sergey

Sergey added the comment:

Steven D'Aprano noticed that there's an obscure case of:

>>> class A(list):
... def __add__(self, other):
... return A(super(A,self).__add__(other))
... def __radd__(self, other):
... return A(other) + self
...

Where:
>>> type( [1] + A([2]) )
<class '__main__.A'>

Is different from:

>>> type( [1].__add__(A([2])) )
<class 'list'>

To keep such undefined behavior unchanged, I updated
fastsum-special-tuplesandlists.patch to use a stricter type check.

In case somebody would like to test it, the patch is attached. It should work 
for both Python 2.7 and Python 3.3, and should introduce no behavior change.

--
Added file: 
http://bugs.python.org/file30897/fastsum-special-tuplesandlists.patch




[issue18305] [patch] Fast sum() for non-numbers

2013-07-12 Thread Sergey

Sergey added the comment:

> This "optimisation" is a semantic change. It breaks backward
> compatibility in cases where a = a + b and a += b do not result
> in the name a having the same value. In particular this breaks
> backward compatibility for numpy users.

I didn't know that. Then I guess the original fastsum.patch can't be used.
Since this is not the first time someone has suggested using __add__ +
__iadd__ in sum(), I suggest extending the existing warning with your example,
so that future developers are not tempted to follow the same approach.
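
For reference, here is the semantic difference in question, shown with plain
lists (a small sketch, not from the patch; for numpy arrays the consequences
are the same kind of aliasing surprise):

    a = [1, 2]
    alias = a
    a = a + [3]        # rebinds a to a new list; alias still sees [1, 2]
    print(alias)       # [1, 2]

    a = [1, 2]
    alias = a
    a += [3]           # extends the existing list in place
    print(alias)       # [1, 2, 3]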

The patch fastsum-iadd_warning.patch is attached and can be applied to 2.7.5,
3.3.2 and hg-tip.

Apart from this patch there are still 3 options remaining (a special case in
sum() for some types; a general interface for sequence-like types; individual
optimisation for individual types) that are yet to be discussed. An example
patch with a special case for lists and tuples is attached.

--
Added file: http://bugs.python.org/file30904/fastsum-iadd_warning.patch




[issue18305] [patch] Fast sum() for non-numbers

2013-07-14 Thread Sergey

Sergey added the comment:

The attached fasttuple.py is a proof-of-concept implementation of a tuple that
reuses the same data storage when possible. Its usage looks similar to
built-in tuples:
  from fasttuple import ft
  a = ft([1,2])
  b = a + ft([3,4])
  c = b + ft([5,6])
  d = b + ft([7,8])
  d += ft([9])
  d = ft([0]) + d + ft([0])
  print(a, b, c, d)

An interesting side-effect of this implementation is a faster __add__ operator:

Python 2.7.5:
  Adding 10 of fasttuples
  took 0.23242688179 seconds
  Adding 10 of built-in tuples
  took 25.2749021053 seconds

Python 3.3.2:
  Adding 10 of fasttuples
  took 0.2883174419403076 seconds
  Adding 10 of built-in tuples
  took 25.487935066223145 seconds

(see test() function in fasttuple.py)

This is just a proof of concept; it can be improved in various ways. A similar
optimization can be applied to lists.
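
For readers without the attachment, here is a minimal Python sketch of the
shared-storage idea described above (the class name and details are
illustrative, not the actual fasttuple.py):

    class SharedTuple:
        """Concatenation reuses the backing list while the left operand still
        owns its tail, so repeated t = t + other does not recopy the prefix."""

        def __init__(self, items=(), _store=None, _length=None):
            if _store is None:
                _store = list(items)
                _length = len(_store)
            self._store = _store      # backing list, possibly shared
            self._length = _length    # how much of it belongs to this value

        def __add__(self, other):
            other = list(other)
            if len(self._store) == self._length:
                # We own the tail: extend the shared store in place.
                self._store.extend(other)
                return SharedTuple(_store=self._store,
                                   _length=len(self._store))
            # Someone already extended the store past us: fall back to a copy.
            return SharedTuple(self._store[:self._length] + other)

        def __iter__(self):
            return iter(self._store[:self._length])

        def __len__(self):
            return self._length

        def __repr__(self):
            return 'SharedTuple(%r)' % (list(self),)

    a = SharedTuple([1, 2])
    b = a + [3, 4]        # extends a's storage in place: no copy of [1, 2]
    c = b + [5, 6]        # still no copy
    d = b + [7, 8]        # b no longer owns the tail, so this one copies
    print(a, b, c, d)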

--
Added file: http://bugs.python.org/file30917/fasttuple.py




[issue18305] [patch] Fast sum() for non-numbers

2013-07-14 Thread Sergey

Sergey added the comment:

> I guess such implementation of tuple will increase memory usage
> and creation time which are critical important for tuples.

On the contrary, it will reduce memory usage and creation time compared to
regular tuples, because in cases like:
  c = a + b
you do not have to spend time and memory allocating and copying the elements
of "a".

The only case where it could use more memory is if you explicitly delete "c"
after that operation. But this can be solved too: the internal storage can be
resized to a smaller value when its tail elements are no longer used.

This idea can be improved in many ways. For example, it's possible to
implement __add__ in C so that it requires no additional memory at all. But
this is just a proof of concept, and I was trying to keep it simple so the
idea is easier to understand.

--




[issue18305] [patch] Fast sum() for non-numbers

2013-07-15 Thread Sergey

Sergey added the comment:

> This is not a common case. A common case is creating short tuples and keeping 
> a lot of tuples in memory.

> For fast += you need keep not only a size of tuple, but also a size of 
> allocated memory. It's a cause of sys.getsizeof([1, 2]) > sys.getsizeof((1, 
> 2)).

Agreed. But is it worth worrying about? How much RAM could it take? E.g. a run
of the Python test suite uses ~3 tuples at peak. So if we add e.g. 32 bytes to
each tuple, then the test suite would use 106MB of RAM instead of 105MB. 1%? ;)

> You shouldn't optimize a rare case at the cost of regression in common usage.

This could optimize many cases, like instant tuple copy, or instant conversions 
of lists to tuples and back, if this idea is extended to lists. It may also 
work for strings.

--




[issue19496] Website link

2013-11-04 Thread Sergey

New submission from Sergey:

Wrong link to the Windows help file on the Python 2.7.6 RC 1 web page.
It currently points to:
http://www.python.org/ftp/python/2.7.6/python275.chm

And should be:
http://www.python.org/ftp/python/2.7.6/python276rc1.chm

--
components: Build
messages: 202154
nosy: MolotoFF
priority: normal
severity: normal
status: open
title: Website link
versions: Python 2.7




[issue29678] email.Message.get_params decodes only first one header value

2017-02-28 Thread Sergey

New submission from Sergey:

The email.Message class has a get_params() method that can decode (unquote)
header values in compliance with RFC 2231 and RFC 2047. But if an email
message contains multiple headers with the same key, it cannot be used to
decode any header other than the first.
In my application I would then like to be able to use:
   headers = message.items()
   for key, value in headers:
       cleanValue = message.get_params(value=value)
       print(key, cleanValue)
Also have posted question on stackoverflow:
http://stackoverflow.com/questions/42502312/python-3-email-package-how-decode-given-header-value
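
A hedged sketch of the limitation (the X-Test header name is made up for
illustration): get_params() goes through Message.get(), which returns only the
first header with a given name, and there is no variant that accepts an
already extracted header value, which is what the snippet above asks for.

    from email.message import Message

    msg = Message()
    msg['X-Test'] = "text/plain; name*=utf-8''first"
    msg['X-Test'] = "text/plain; name*=utf-8''second"

    # Only the first X-Test header can ever be decoded this way:
    print(msg.get_params(header='X-Test'))
    # The second value is reachable via get_all(), but only as a raw string:
    print(msg.get_all('X-Test')[1])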

--
components: email
messages: 288720
nosy: barry, pi314159, r.david.murray
priority: normal
severity: normal
status: open
title: email.Message.get_params decodes only first one header value
type: behavior
versions: Python 3.6




[issue29678] email.Message.get_params decodes only first one header value

2017-02-28 Thread Sergey

Changes by Sergey :


--
type: behavior -> enhancement




[issue13688] ast.literal_eval fails on octal numbers

2011-12-31 Thread Sergey Dorofeev

New submission from Sergey Dorofeev :

Python 3.2.2 (default, Nov 16 2011, 10:58:44) [C] on sunos5
Type "help", "copyright", "credits" or "license" for more information.
>>> import ast
>>> ast.literal_eval('10')
10
>>> ast.literal_eval('0x10')
16
>>> ast.literal_eval('010')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/python322/lib/python3.2/ast.py", line 48, in literal_eval
    node_or_string = parse(node_or_string, mode='eval')
  File "/opt/python322/lib/python3.2/ast.py", line 36, in parse
    return compile(source, filename, mode, PyCF_ONLY_AST)
  File "<unknown>", line 1
    010
      ^
SyntaxError: invalid token

--
components: Library (Lib)
messages: 150414
nosy: fidoman
priority: normal
severity: normal
status: open
title: ast.literal_eval fails on octal numbers
versions: Python 3.2




[issue13688] ast.literal_eval fails on octal numbers

2011-12-31 Thread Sergey Dorofeev

Sergey Dorofeev  added the comment:

This is a Python 3 feature: octal literals must be written as 0o10.
I need to rebuild my data file :(
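
For completeness, the Python 3 spelling parses fine:

    import ast
    print(ast.literal_eval('0o10'))   # 8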

--
resolution:  -> rejected
status: open -> closed




[issue11269] cgi.FieldStorage forgets to unquote field names when parsing multipart/form-data

2011-02-25 Thread Sergey Schetinin

Sergey Schetinin  added the comment:

It does work (Python 2.7.1 here):

>>> import cgi
>>> cgi.parse_header('Content-Disposition: form-data; name=""%22"')
('Content-Disposition: form-data', {'name': '"%22'})
>>> cgi.parse_header('Content-Disposition: form-data; name="\\"%22"')
('Content-Disposition: form-data', {'name': '"%22'})

However, as the unescaping is done with sequential .replace() calls, one can
construct a header that unescapes incorrectly:

>>> cgi.parse_header('Content-Disposition: form-data; name=""%22"')
('Content-Disposition: form-data', {'name': '"%22'})

Which should be:
('Content-Disposition: form-data', {'name': '\\"%22'})

That probably doesn't matter anyway.

--




[issue11269] cgi.FieldStorage forgets to unquote field names when parsing multipart/form-data

2011-02-25 Thread Sergey Schetinin

Sergey Schetinin  added the comment:

I wanted to add that the fact that browsers encode the field names in the page
encoding does not change the fact that they should escape the header according
to RFC 2047.

The percent-encoding used in the field name has nothing to do with
multipart/form-data, header encoding, or even HTML attribute value escaping.
There's no reason for Chrome to percent-escape the quotation mark in the field
name; my use of the percent sign in the field name is only to show that Chrome
does not escape the percent sign itself, so there's no way to recover the data
from the header sent by Chrome.

I imagine there could be a non-ASCII field name that, when encoded in some
encoding, produces something SQL-injection-like: '"; other="xx"'. That string
would make the header parse into something completely different. With IE8 and
FF 3.6 it looks like this would be very simple. The same applies to uploaded
file names too, so it's not just a matter of choosing sane field names.

That's all a browsers' problem though.

--




[issue11269] cgi.FieldStorage forgets to unquote field names when parsing multipart/form-data

2011-02-25 Thread Sergey Schetinin

Sergey Schetinin  added the comment:

I don't think that's a security issue, just something that would break with 
certain filenames. And I said that it's a browsers' problem in the sense that 
it can only be fixed / caught on their side.

--




[issue12098] Child process running as debug

2011-05-17 Thread Sergey Mezentsev

New submission from Sergey Mezentsev :

I run this code:
"""
from multiprocessing import Pool

def myfunc(x):
    assert False
    #if __debug__: print 'debug'
    return x - 1

if __name__ == '__main__':
    pool = Pool(processes=1)
    it = pool.imap(myfunc, xrange(5)) # or imap_unordered, map
    print it.next()

python -O myscript.py
"""

myfunc() always raises AssertionError, even though I run the script with the
"-O" (optimization) option.

Interpreter is:
"""
Python 2.6.6 (r266:84297, Aug 24 2010, 18:46:32) [MSC v.1500 32 bit (Intel)] on 
win32
"""

Thanks!
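
A small sketch that makes the difference visible when run with -O (the later
patch to Popen.get_command_line() in this thread targets exactly this: on
Windows the spawned child's command line does not carry the parent's -O flag,
so __debug__ stays True in the worker):

    import sys
    from multiprocessing import Pool

    def child_optimize(_):
        return sys.flags.optimize

    if __name__ == '__main__':
        print('parent optimize: %d' % sys.flags.optimize)
        pool = Pool(processes=1)
        print('child optimize:  %d' % pool.map(child_optimize, [None])[0])
        pool.close()
        pool.join()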

--
components: Interpreter Core, Library (Lib), Windows
messages: 136178
nosy: thebits
priority: normal
severity: normal
status: open
title: Child process running as debug
type: behavior
versions: Python 2.6




[issue12098] Child process running as debug

2011-05-17 Thread Sergey Mezentsev

Sergey Mezentsev  added the comment:

In my system (Windows 7 (64) SP1, Python 2.6.6 32-bit) I have:
"""
d:\temp>python -O pool.py
('parent optimize?', 1)
('child', 4712, 'optimize?', 0)
(Traceback (most recent call last):
'  File "new.py", line 14, in 
childpool.map(myfunc, xrange(2)) # or imap_unordered, map'
,   File "C:\Python26\lib\multiprocessing\pool.py", line 148, in map
4712, 'optimize?return self.map_async(func, iterable, chunksize).get()
'  File "C:\Python26\lib\multiprocessing\pool.py", line 422, in get
, 0)
raise self._value
AssertionError: assert False
"""

--




[issue12098] Child process running as debug on Windows

2011-05-18 Thread Sergey Mezentsev

Sergey Mezentsev  added the comment:

I created a patch for Popen.get_command_line() ('2.6' and 'default' branches).

I don't know how to write a test: the sys.flags structure is read-only.

--
keywords: +patch
Added file: http://bugs.python.org/file22021/Issue12098.branch-2.6.patch




[issue12098] Child process running as debug on Windows

2011-05-18 Thread Sergey Mezentsev

Changes by Sergey Mezentsev :


Added file: http://bugs.python.org/file22022/Issue12098.branch-default.patch




[issue12098] Child process running as debug on Windows

2011-05-20 Thread Sergey Mezentsev

Changes by Sergey Mezentsev :


Removed file: http://bugs.python.org/file22021/Issue12098.branch-2.6.patch




[issue12098] Child process running as debug on Windows

2011-05-20 Thread Sergey Mezentsev

Changes by Sergey Mezentsev :


Removed file: http://bugs.python.org/file22022/Issue12098.branch-default.patch




[issue12098] Child process running as debug on Windows

2011-05-20 Thread Sergey Mezentsev

Changes by Sergey Mezentsev :


Added file: http://bugs.python.org/file22041/Issue12098.branch-2.6.patch




[issue12098] Child process running as debug on Windows

2011-05-20 Thread Sergey Mezentsev

Changes by Sergey Mezentsev :


Added file: http://bugs.python.org/file22042/Issue12098.branch-default.patch




[issue12098] Child process running as debug on Windows

2011-05-20 Thread Sergey Mezentsev

Sergey Mezentsev  added the comment:

I updated the patch: added a test and removed the arguments for a frozen
interpreter.

--




[issue10880] do_mkvalue and 'boolean'

2011-01-11 Thread Sergey Shepelev

Sergey Shepelev  added the comment:

Here's a patch against 2.6:


--- a/Python/modsupport.c   Tue Aug 24 18:19:58 2010 +0200
+++ b/Python/modsupport.c   Tue Jan 11 23:50:40 2011 +0300
@@ -459,6 +459,16 @@
         return v;
     }
 
+    case '?':
+    {
+        int n;
+        n = va_arg(*p_va, int);
+        if (n == 0)
+            Py_RETURN_FALSE;
+        else
+            Py_RETURN_TRUE;
+    }
+
     case ':':
     case ',':
     case ' ':

--
nosy: +temoto




[issue11269] cgi.FieldStorage forgets to unquote field names when parsing multipart/form-data

2011-02-21 Thread Sergey Schetinin

New submission from Sergey Schetinin :

Tested on Python 2.7, but probably affects all versions. Test case is attached.

The reason this went unnoticed until now is that browsers are very
conservative when quoting field names, so most field names are unchanged by
the quoting.

Related bug in WebOb: https://bitbucket.org/ianb/webob/issue/2

--
components: Library (Lib)
files: tcgi.py
messages: 128950
nosy: mlk
priority: normal
severity: normal
status: open
title: cgi.FieldStorage forgets to unquote field names when parsing 
multipart/form-data
type: behavior
versions: Python 2.7
Added file: http://bugs.python.org/file20819/tcgi.py




[issue11269] cgi.FieldStorage forgets to unquote field names when parsing multipart/form-data

2011-02-21 Thread Sergey Schetinin

Sergey Schetinin  added the comment:

And here's a patch.

--
keywords: +patch
Added file: http://bugs.python.org/file20820/cgi-patch.patch




[issue11269] cgi.FieldStorage forgets to unquote field names when parsing multipart/form-data

2011-02-21 Thread Sergey Schetinin

Sergey Schetinin  added the comment:

I've dug into the RFCs and tested various browsers.

RFC 2388 (the one defining multipart/form-data) says: 

Field names originally in non-ASCII character sets may be encoded
within the value of the "name" parameter using the standard method
described in RFC 2047.

RFC 2047 in turn defines the coding sometimes seen in email headers 
("=?iso-8859-1?q?this is some text?=").

That means that this report is invalid. And I was misled by the bug that 
belongs to Google Chrome (which is the browser I was doing initial testing 
with).

I tested this with the following html form:


Test


Here are the headers submitted by various browsers:

IE 8: 
  Content-Disposition: form-data; name=""%22"
Firefox 4.0b11:
  Content-Disposition: form-data; name="\"%22"
Chrome 9:
  Content-Disposition: form-data; name="%22%22"

And the Chrome one is the one clearly invalid.

cgi still does no decoding of parameters as per RFC 2047, but browsers do not 
use that encoding for non-ASCII field names anyway (they just put the field 
names in UTF-8), so that might be unnecessary.

Please close this bug at your own judgement.

--




[issue3909] Building PDF documentation from tex files

2008-10-15 Thread Sergey Lipnevich

Sergey Lipnevich <[EMAIL PROTECTED]> added the comment:

I don't know what the proper procedure is (reopening or filing a new
ticket), but I see this problem with Sphinx 0.5dev-20081015 and Python
2.6 on Windows XP. The change recommended by Winfried in msg73874 seems
to fix it (patch attached). The version of MiKTeX:

MiKTeX-pdfTeX 2.7.3147 (1.40.9) (MiKTeX 2.7)
Copyright (C) 1982 D. E. Knuth, (C) 1996-2006 Han The Thanh
TeX is a trademark of the American Mathematical Society.

--
keywords: +patch
nosy: +sergey
Added file: http://bugs.python.org/file11805/sphinx.sty-issue3909.diff




[issue47240] Python 3.x built for ppc+ppc64 errs on: No module named 'msvcrt', '_posixsubprocess'

2022-04-06 Thread Sergey Fedorov

New submission from Sergey Fedorov :

While adding definitions for an additional universal-binary option (ppc+ppc64)
is rather straightforward (following the existing examples in the source
code), and Python 3.x after patching does build as universal for those two
arches, trying to install any Python module fails with the following:

```
--->  Building py39-curl
Traceback (most recent call last):
  File 
"/opt/local/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/subprocess.py",
 line 73, in 
import msvcrt
ModuleNotFoundError: No module named 'msvcrt'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File 
"/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_macports_release_tarballs_ports_python_py-curl/py39-curl/work/pycurl-7.44.1/setup.py",
 line 11, in 
import glob, os, re, sys, subprocess
  File 
"/opt/local/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/subprocess.py",
 line 78, in 
import _posixsubprocess
ModuleNotFoundError: No module named '_posixsubprocess'
Command failed: 
/opt/local/Library/Frameworks/Python.framework/Versions/3.9/bin/python3.9 
setup.py docstrings
Exit code: 1
```

Separately, both ppc and ppc64 are totally fine: they build and work, just not
together.
At the same time `python27` with a similar patch builds as universal
(ppc+ppc64) and works normally, so the problem somehow arises only on 3.x
versions.

P.S. I am aware that Darwin PowerPC is of interest to very few users and even
fewer developers, so I do not expect upstream to fix this in the code.
However, I will greatly appreciate any advice that points to what I should fix
on my end. If I can fix Python for ppc+ppc64, it will benefit MacPorts users,
among whom there are people actively using Leopard.
Relevant ticket on MacPorts: https://trac.macports.org/ticket/64916
Such a fix may also benefit Linux PPC users on G5 machines.

I request moderators not to dismiss & close the issue, if possible. We don’t 
have Python experts on Macports, and all fixes for PowerPC are done by very few 
enthusiasts.

--
components: macOS
messages: 416857
nosy: barracuda156, ned.deily, ronaldoussoren
priority: normal
severity: normal
status: open
title: Python 3.x built for ppc+ppc64 errs on: No module named 'msvcrt', 
'_posixsubprocess'
type: compile error
versions: Python 3.10, Python 3.8, Python 3.9




[issue37837] add internal _PyLong_FromUnsignedChar() function

2019-08-13 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

When compiled with default NSMALLPOSINTS, _PyLong_FromUnsignedChar() is 
significantly faster than other PyLong_From*():

$ python -m perf timeit -s "from collections import deque; consume = 
deque(maxlen=0).extend; b = bytes(2**20)" "consume(b)" 
--compare-to=../cpython-master/venv/bin/python
/home/sergey/tmp/cpython-master/venv/bin/python: . 7.10 ms 
+- 0.02 ms
/home/sergey/tmp/cpython-dev/venv/bin/python: . 4.29 ms +- 
0.03 ms

Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 7.10 ms +- 
0.02 ms -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 4.29 ms +- 0.03 ms: 
1.66x faster (-40%)

It's mostly useful for bytes/bytearray, but also can be used in several other 
places.

--
components: Interpreter Core
messages: 349540
nosy: sir-sigurd
priority: normal
severity: normal
status: open
title: add internal _PyLong_FromUnsignedChar() function
type: performance
versions: Python 3.9




[issue37837] add internal _PyLong_FromUnsignedChar() function

2019-08-13 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
keywords: +patch
pull_requests: +14971
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/15251




[issue37840] bytearray_getitem() handles negative index incorrectly

2019-08-13 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

bytearray_getitem() adjusts a negative index, though that's already done by
PySequence_GetItem().
This makes PySequence_GetItem(bytearray(1), -2) return 0 instead of raising
IndexError.

--
components: Interpreter Core
messages: 349545
nosy: sir-sigurd
priority: normal
severity: normal
status: open
title: bytearray_getitem() handles negative index incorrectly
type: behavior




[issue37840] bytearray_getitem() handles negative index incorrectly

2019-08-13 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
keywords: +patch
pull_requests: +14972
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/15250




[issue37842] Initialize Py_buffer variables more efficiently

2019-08-13 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

Argument Clinic generates a `{NULL, NULL}` initializer for Py_buffer
variables. Such an initializer zeroes all Py_buffer members, but as I
understand it, only the `obj` and `buf` members really have to be initialized.
Avoiding the unneeded initialization provides a tiny speed-up:

$ python -m perf timeit -s "replace = b''.replace" "replace(b'', b'')" 
--compare-to=../cpython-master/venv/bin/python --duplicate=1000
/home/sergey/tmp/cpython-master/venv/bin/python: ..... 43.0 ns 
+- 0.5 ns
/home/sergey/tmp/cpython-dev/venv/bin/python: . 41.8 ns +- 
0.4 ns

Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 43.0 ns +- 
0.5 ns -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 41.8 ns +- 0.4 ns: 
1.03x faster (-3%)

--
components: Argument Clinic
messages: 349582
nosy: larry, sir-sigurd
priority: normal
severity: normal
status: open
title: Initialize Py_buffer variables more efficiently
type: performance
versions: Python 3.9




[issue37842] Initialize Py_buffer variables more efficiently

2019-08-13 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
keywords: +patch
pull_requests: +14975
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/15254




[issue37907] speed-up PyLong_As*() for large longs

2019-08-21 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

PyLong_As*() functions compute the result for large longs like this:

size_t x, prev;
x = 0;
while (--i >= 0) {
    prev = x;
    x = (x << PyLong_SHIFT) | v->ob_digit[i];
    if ((x >> PyLong_SHIFT) != prev) {
        *overflow = sign;
        goto exit;
    }
}

It can be rewritten like this:

size_t x = 0;
while (--i >= 0) {
    if (x > (size_t)-1 >> PyLong_SHIFT) {
        goto overflow;
    }
    x = (x << PyLong_SHIFT) | v->ob_digit[i];
}
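
An equivalent Python model of the rewritten check (a sketch for illustration;
30-bit digits and a 64-bit unsigned target are assumptions):

    PyLong_SHIFT = 30
    UMAX = 2**64 - 1                       # (size_t)-1 on a 64-bit build

    def from_digits(digits):               # most significant digit first
        x = 0
        for d in digits:
            if x > UMAX >> PyLong_SHIFT:   # the next shift would overflow
                raise OverflowError
            x = (x << PyLong_SHIFT) | d
        return x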

This provides some speed-up:

PyLong_AsSsize_t()
$ python -m perf timeit -s "from struct import Struct; N = 1000; pack = 
Struct('n'*N).pack; values = (2**30,)*N" "pack(*values)" 
--compare-to=../cpython-master/venv/bin/python
/home/sergey/tmp/cpython-master/venv/bin/python: . 9.69 us 
+- 0.02 us
/home/sergey/tmp/cpython-dev/venv/bin/python: ..... 8.61 us +- 
0.07 us
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 9.69 us +- 
0.02 us -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 8.61 us +- 0.07 us: 
1.12x faster (-11%)

PyLong_AsSize_t()
$ python -m perf timeit -s "from struct import Struct; N = 1000; pack = 
Struct('N'*N).pack; values = (2**30,)*N" "pack(*values)" 
--compare-to=../cpython-master/venv/bin/python
/home/sergey/tmp/cpython-master/venv/bin/python: . 10.5 us 
+- 0.1 us
/home/sergey/tmp/cpython-dev/venv/bin/python: . 8.19 us +- 
0.17 us
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 10.5 us +- 
0.1 us -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 8.19 us +- 0.17 us: 
1.29x faster (-22%)

PyLong_AsLong()
$ python -m perf timeit -s "from struct import Struct; N = 1000; pack = 
Struct('l'*N).pack; values = (2**30,)*N" "pack(*values)" 
--compare-to=../cpython-master/venv/bin/python
/home/sergey/tmp/cpython-master/venv/bin/python: ..... 9.68 us 
+- 0.02 us
/home/sergey/tmp/cpython-dev/venv/bin/python: ..... 8.48 us +- 
0.22 us
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 9.68 us +- 
0.02 us -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 8.48 us +- 0.22 us: 
1.14x faster (-12%)

PyLong_AsUnsignedLong()
$ python -m perf timeit -s "from struct import Struct; N = 1000; pack = 
Struct('L'*N).pack; values = (2**30,)*N" "pack(*values)" 
--compare-to=../cpython-master/venv/bin/python
/home/sergey/tmp/cpython-master/venv/bin/python: . 10.5 us 
+- 0.1 us
/home/sergey/tmp/cpython-dev/venv/bin/python: . 8.41 us +- 
0.26 us
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 10.5 us +- 
0.1 us -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 8.41 us +- 0.26 us: 
1.25x faster (-20%)

The mentioned pattern is also used in PyLong_AsLongLongAndOverflow(), but I 
left it untouched since the proposed change doesn't seem to affect its 
performance.

--
components: Interpreter Core
messages: 350091
nosy: sir-sigurd
priority: normal
severity: normal
status: open
title: speed-up PyLong_As*() for large longs
type: performance
versions: Python 3.9




[issue37907] speed-up PyLong_As*() for large longs

2019-08-21 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
keywords: +patch
pull_requests: +15074
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/15363




[issue27961] remove support for platforms without "long long"

2019-08-22 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
pull_requests: +15094
pull_request: https://github.com/python/cpython/pull/15385




[issue27961] remove support for platforms without "long long"

2019-08-22 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
pull_requests: +15095
pull_request: https://github.com/python/cpython/pull/15386




[issue27961] remove support for platforms without "long long"

2019-08-22 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
pull_requests: +15098
pull_request: https://github.com/python/cpython/pull/15388




[issue37938] refactor PyLong_As*() functions

2019-08-24 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
keywords: +patch
pull_requests: +15152
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/15457




[issue37938] refactor PyLong_As*() functions

2019-08-24 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

PyLong_As*() functions have a lot of duplicated code; I'm creating a draft PR
with an attempt to deduplicate them.

--
components: Interpreter Core
messages: 350367
nosy: sir-sigurd
priority: normal
severity: normal
status: open
title: refactor PyLong_As*() functions




[issue37837] add internal _PyLong_FromUnsignedChar() function

2019-08-24 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

$ gcc -v 2>&1 | grep 'gcc version'
gcc version 8.3.0 (Debian 8.3.0-19)

using ./configure --enable-optimizations --with-lto
$ python -m perf timeit -s "from collections import deque; consume = 
deque(maxlen=0).extend; b = bytes(2**20)" "consume(b)" 
--compare-to=../cpython-master/venv/bin/python
/home/sergey/tmp/cpython-master/venv/bin/python: ..... 6.71 ms 
+- 0.09 ms
/home/sergey/tmp/cpython-dev/venv/bin/python: . 6.71 ms +- 
0.08 ms
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 6.71 ms +- 
0.09 ms -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 6.71 ms +- 0.08 ms: 
1.00x slower (+0%)

using ./configure --enable-optimizations
$ python -m perf timeit -s "from collections import deque; consume = 
deque(maxlen=0).extend; b = bytes(2**20)" "consume(b)" 
--compare-to=../cpython-master/venv/bin/python
/home/sergey/tmp/cpython-master/venv/bin/python: . 6.73 ms 
+- 0.17 ms
/home/sergey/tmp/cpython-dev/venv/bin/python: ..... 4.28 ms +- 
0.01 ms
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 6.73 ms +- 
0.17 ms -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 4.28 ms +- 0.01 ms: 
1.57x faster (-36%)

--




[issue37802] micro-optimization of PyLong_FromSize_t()

2019-08-25 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

The previous benchmark results were obtained with a non-LTO build.

Here are the results for an LTO build:

$ python -m perf timeit -s "from itertools import repeat; _len = repeat(None, 
0).__length_hint__" "_len()" --compare-to=../cpython-master/venv/bin/python 
--duplicate=1000
/home/sergey/tmp/cpython-master/venv/bin/python: . 14.9 ns 
+- 0.2 ns
/home/sergey/tmp/cpython-dev/venv/bin/python: . 13.1 ns +- 
0.5 ns
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 14.9 ns +- 
0.2 ns -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 13.1 ns +- 0.5 ns: 
1.13x faster (-12%)

$ python -m perf timeit -s "from itertools import repeat; _len = repeat(None, 
2**10).__length_hint__" "_len()" --compare-to=../cpython-master/venv/bin/python 
--duplicate=1000
/home/sergey/tmp/cpython-master/venv/bin/python: ..... 22.1 ns 
+- 0.1 ns
/home/sergey/tmp/cpython-dev/venv/bin/python: . 20.9 ns +- 
0.4 ns
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 22.1 ns +- 
0.1 ns -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 20.9 ns +- 0.4 ns: 
1.05x faster (-5%)

$ python -m perf timeit -s "from itertools import repeat; _len = repeat(None, 
2**30).__length_hint__" "_len()" --compare-to=../cpython-master/venv/bin/python 
--duplicate=1000
/home/sergey/tmp/cpython-master/venv/bin/python: . 23.3 ns 
+- 0.0 ns
/home/sergey/tmp/cpython-dev/venv/bin/python: . 21.6 ns +- 
0.1 ns
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 23.3 ns +- 
0.0 ns -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 21.6 ns +- 0.1 ns: 
1.08x faster (-8%)

$ python -m perf timeit -s "from itertools import repeat; _len = repeat(None, 
2**60).__length_hint__" "_len()" --compare-to=../cpython-master/venv/bin/python 
--duplicate=1000
/home/sergey/tmp/cpython-master/venv/bin/python: . 24.4 ns 
+- 0.1 ns
/home/sergey/tmp/cpython-dev/venv/bin/python: . 22.7 ns +- 
0.1 ns
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 24.4 ns +- 
0.1 ns -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 22.7 ns +- 0.1 ns: 
1.08x faster (-7%)

--




[issue37837] add internal _PyLong_FromUnsignedChar() function

2019-08-25 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

These last results are invalid :-)

I thought that I was checking _PyLong_FromUnsignedChar() on top of GH-15192, 
but that wasn't true. So the correct results for LTO build are:

$ python -m perf timeit -s "from collections import deque; consume = 
deque(maxlen=0).extend; b = bytes(2**20)" "consume(b)" 
--compare-to=../cpython-master/venv/bin/python
/home/sergey/tmp/cpython-master/venv/bin/python: . 6.93 ms 
+- 0.04 ms
/home/sergey/tmp/cpython-dev/venv/bin/python: . 3.96 ms +- 
0.01 ms
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 6.93 ms +- 
0.04 ms -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 3.96 ms +- 0.01 ms: 
1.75x faster (-43%)

But the most important thing is that using PyLong_FromUnsignedLong() instead of 
_PyLong_FromUnsignedChar() on top of GH-15192 is producing the same results: 
striter_next() uses small_ints[] directly. However that's not true for 
bytearrayiter_next(): PyLong_FromUnsignedLong() is called there. I think that's 
due to how code is profiled so I'm satisfied with these results more or less.

I'm closing existing PR and probably will close this issue soon after GH-15192 
will be merged.

--




[issue37973] improve docstrings of sys.float_info

2019-08-28 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

In [8]: help(sys.float_info)

...

 |  
 |  dig
 |  DBL_DIG -- digits
 |

This is not very helpful; https://docs.python.org/3/library/sys.html#sys.float_info
is more verbose, so the docstrings should probably be updated from there.

--
assignee: docs@python
components: Documentation
messages: 350703
nosy: docs@python, sir-sigurd
priority: normal
severity: normal
status: open
title: improve docstrings of sys.float_info
type: enhancement




[issue37974] zip() docstring should say 'iterator' instead of 'object with __next__()'

2019-08-28 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

In [3]: help(zip)

class zip(object)
 |  zip(*iterables) --> zip object
 |  
 |  Return a zip object whose .__next__() method returns a tuple where
 |  the i-th element comes from the i-th iterable argument.  The .__next__()
 |  method continues until the shortest iterable in the argument sequence
 |  is exhausted and then it raises StopIteration.

This description is awkward and should use the term 'iterator', as
https://docs.python.org/3/library/functions.html#zip does.

The same applies to chain(), count() and zip_longest() from itertools.

--
assignee: docs@python
components: Documentation
messages: 350704
nosy: docs@python, sir-sigurd
priority: normal
severity: normal
status: open
title: zip() docstring should say 'iterator' instead of 'object with __next__()'




[issue37976] zip() shadows TypeError raised in __iter__() of source iterable

2019-08-29 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

zip() shadows TypeError raised in __iter__() of source iterable:

In [21]: class Iterable:
...: def __init__(self, n):
...: self.n = n
...: def __iter__(self):
...: return iter(range(self.n))
...: 

In [22]: zip(Iterable('one'))
---
TypeError Traceback (most recent call last)
 in ()
> 1 zip(Iterable('one'))

TypeError: zip argument #1 must support iteration

--
messages: 350763
nosy: sir-sigurd
priority: normal
severity: normal
status: open
title: zip() shadows TypeError raised in __iter__() of source iterable




[issue37976] zip() shadows TypeError raised in __iter__() of source iterable

2019-08-29 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
keywords: +patch
pull_requests: +15268
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/15592




[issue37976] zip() shadows TypeError raised in __iter__() of source iterable

2019-08-29 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

Maybe it's not clear from the description, but the traceback only shows the
line with zip(), so it doesn't help localize the source of the exception at
all.

You only see that 'argument #N must support iteration', but that argument has
__iter__(), i.e. it supports iteration.

--




[issue37976] zip() shadows TypeError raised in __iter__() of source iterable

2019-08-29 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

Also using this example class:

In [5]: iter(Iterable('one'))
---
TypeError Traceback (most recent call last)
 in ()
> 1 iter(Iterable('one'))

 in __iter__(self)
  3 self.n = n
  4 def __iter__(self):
> 5 return iter(range(self.n))
  6 

TypeError: range() integer end argument expected, got str.

--




[issue37976] zip() shadows TypeError raised in __iter__() of source iterable

2019-08-29 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

> map() does not have anything special.

Just for reference, in Python 2.7 map() has the same behavior, and it caused
many problems for me and other people working with and developing Django.
--




[issue37986] Improve perfomance of PyLong_FromDouble()

2019-08-30 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

This patch simplifies the fast path for floats that fit into a C long and
moves it from float.__trunc__ to PyLong_FromDouble().

+---------------------+---------------------+------------------------------+
| Benchmark           | long-from-float-ref | long-from-float              |
+=====================+=====================+==============================+
| int(1.)             | 39.5 ns             | 37.3 ns: 1.06x faster (-6%)  |
| int(2.**20)         | 46.4 ns             | 45.6 ns: 1.02x faster (-2%)  |
| int(2.**30)         | 52.5 ns             | 49.0 ns: 1.07x faster (-7%)  |
| int(2.**60)         | 50.0 ns             | 49.2 ns: 1.02x faster (-2%)  |
| int(-2.**63)        | 76.6 ns             | 48.6 ns: 1.58x faster (-37%) |
| int(2.**80)         | 77.1 ns             | 72.5 ns: 1.06x faster (-6%)  |
| int(2.**120)        | 91.5 ns             | 87.7 ns: 1.04x faster (-4%)  |
| math.ceil(1.)       | 57.4 ns             | 32.9 ns: 1.74x faster (-43%) |
| math.ceil(2.**20)   | 60.5 ns             | 41.3 ns: 1.47x faster (-32%) |
| math.ceil(2.**30)   | 64.2 ns             | 43.9 ns: 1.46x faster (-32%) |
| math.ceil(2.**60)   | 66.3 ns             | 42.3 ns: 1.57x faster (-36%) |
| math.ceil(-2.**63)  | 67.7 ns             | 43.1 ns: 1.57x faster (-36%) |
| math.ceil(2.**80)   | 66.6 ns             | 65.6 ns: 1.01x faster (-1%)  |
| math.ceil(2.**120)  | 79.9 ns             | 80.5 ns: 1.01x slower (+1%)  |
| math.floor(1.)      | 58.4 ns             | 31.2 ns: 1.87x faster (-47%) |
| math.floor(2.**20)  | 61.0 ns             | 39.6 ns: 1.54x faster (-35%) |
| math.floor(2.**30)  | 64.2 ns             | 43.9 ns: 1.46x faster (-32%) |
| math.floor(2.**60)  | 62.1 ns             | 40.1 ns: 1.55x faster (-35%) |
| math.floor(-2.**63) | 64.1 ns             | 39.9 ns: 1.61x faster (-38%) |
| math.floor(2.**80)  | 62.2 ns             | 62.7 ns: 1.01x slower (+1%)  |
| math.floor(2.**120) | 77.0 ns             | 77.8 ns: 1.01x slower (+1%)  |
+---------------------+---------------------+------------------------------+

I'm going to speed-up conversion of larger floats in a follow-up PR.
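
The attached bench-long-from-float.py is not shown here; below is a rough
timeit-based sketch (the exact methodology is an assumption) of how numbers
like the ones above can be reproduced:

    import math
    import timeit

    for expr in ("int(1.)", "int(2.**30)", "math.floor(2.**60)"):
        secs = min(timeit.repeat(expr, globals=globals(),
                                 number=10**6, repeat=5))
        print("%-20s %.1f ns" % (expr, secs * 1e3))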

--
components: Interpreter Core
files: bench-long-from-float.py
messages: 350861
nosy: sir-sigurd
priority: normal
pull_requests: 15285
severity: normal
status: open
title: Improve perfomance of PyLong_FromDouble()
type: performance
versions: Python 3.9
Added file: https://bugs.python.org/file48573/bench-long-from-float.py

___
Python tracker 
<https://bugs.python.org/issue37986>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38015] inline function generates slightly inefficient machine code

2019-09-06 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
pull_requests: +15372
pull_request: https://github.com/python/cpython/pull/15718

___
Python tracker 
<https://bugs.python.org/issue38015>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38015] inline function generates slightly inefficient machine code

2019-09-06 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

I added a similar patch that replaces get_small_int() with a macro version, 
since it also induces unnecessary casts and makes the machine code less 
efficient.

Example assembly can be checked at https://godbolt.org/z/1SjG3E.

This change produces tiny, but measurable speed-up for handling small ints:

$ python -m pyperf timeit -s "from collections import deque; consume = 
deque(maxlen=0).extend; r = range(256)" "consume(r)" 
--compare-to=../cpython-master/venv/bin/python --duplicate=1000
/home/sergey/tmp/cpython-master/venv/bin/python: . 1.03 us 
+- 0.08 us
/home/sergey/tmp/cpython-dev/venv/bin/python: . 973 ns +- 
18 ns

Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 1.03 us +- 
0.08 us -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 973 ns +- 18 ns: 
1.05x faster (-5%)

--

___
Python tracker 
<https://bugs.python.org/issue38015>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38015] inline function generates slightly inefficient machine code

2019-09-07 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

I use GCC 9.2.

--

___
Python tracker 
<https://bugs.python.org/issue38015>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38079] _PyObject_VAR_SIZE should avoid arithmetic overflow

2019-09-09 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
keywords: +patch
pull_requests: +15471
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/14838

___
Python tracker 
<https://bugs.python.org/issue38079>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38094] unneeded assignment to wb.len in PyBytes_Concat using buffer protocol

2019-09-10 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
keywords: +patch
pull_requests: +15517
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/15274

___
Python tracker 
<https://bugs.python.org/issue38094>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38147] add macro for __builtin_unreachable

2019-09-12 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

GCC (along with Clang and ICC) has the __builtin_unreachable() builtin and 
MSVC has __assume(); these can be used to optimize out unreachable conditions.

So we could add a macro like this:

#ifdef Py_DEBUG
#   define Py_ASSUME(cond) (assert(cond))
#else
#   if defined(_MSC_VER)
#       define Py_ASSUME(cond) (__assume(cond))
#   elif defined(__GNUC__)
#       define Py_ASSUME(cond) ((cond) ? (void)0 : __builtin_unreachable())
#   else
#       define Py_ASSUME(cond) ((void)0)
#   endif
#endif

Here's a pair of really simple examples showing how it can optimize code: 
https://godbolt.org/z/g9LYXF.

A real-world example: _PyLong_Copy() [1] calls _PyLong_New() [2]. 
_PyLong_New() checks the size so that overflow does not occur. This check is 
redundant when _PyLong_New() is called from _PyLong_Copy(). We could add a 
function that bypasses that check, but in an LTO build PyObject_MALLOC() is 
inlined into _PyLong_New() and it also checks the size. Adding 
Py_ASSUME((size_t)size <= MAX_LONG_DIGITS) makes it possible to bypass both 
checks.

[1] 
https://github.com/python/cpython/blob/3a4f66707e824ef3a8384827590ebaa6ca463dc0/Objects/longobject.c#L287-L309
[2] 
https://github.com/python/cpython/blob/3a4f66707e824ef3a8384827590ebaa6ca463dc0/Objects/longobject.c#L264-L283

--
messages: 352228
nosy: sir-sigurd
priority: normal
severity: normal
status: open
title: add macro for __builtin_unreachable

___
Python tracker 
<https://bugs.python.org/issue38147>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38205] Python no longer compiles without small integer singletons

2019-09-17 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

I believe that the problem is caused by the change in Py_UNREACHABLE() 
(https://github.com/python/cpython/commit/3ab61473ba7f3dca32d779ec2766a4faa0657923).

Before the mentioned commit Py_UNREACHABLE() was an expression; now it's a 
block. The Py_UNREACHABLE() macro is public (see 
https://docs.python.org/3/c-api/intro.html#c.Py_UNREACHABLE), so this change 
can cause similar problems outside of CPython (i.e. that change was breaking).

--
nosy: +sir-sigurd

___
Python tracker 
<https://bugs.python.org/issue38205>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38205] Python no longer compiles without small integer singletons

2019-09-17 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

Also, quoting from the Py_UNREACHABLE() doc:

>  Use this in places where you might be tempted to put an assert(0) or abort() 
> call.

https://github.com/python/cpython/commit/6b519985d23bd0f0bd072b5d5d5f2c60a81a19f2
 does exactly that, it replaces assert(0) with Py_UNREACHABLE().

--

___
Python tracker 
<https://bugs.python.org/issue38205>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38211] clean up type_init()

2019-09-18 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

I wrote a patch that cleans up type_init():

1. Removes conditions already checked by assert()
2. Removes an object_init() call that effectively creates an empty tuple and 
then checks that this tuple is empty

--
messages: 352710
nosy: sir-sigurd
priority: normal
severity: normal
status: open
title: clean up type_init()

___
Python tracker 
<https://bugs.python.org/issue38211>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38211] clean up type_init()

2019-09-18 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
keywords: +patch
pull_requests: +15852
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/16257

___
Python tracker 
<https://bugs.python.org/issue38211>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38205] Py_UNREACHABLE() no longer behaves as a function call

2019-09-19 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

FWIW I proposed to add Py_ASSUME() macro that uses __builtin_unreachable() in 
bpo-38147.

--

___
Python tracker 
<https://bugs.python.org/issue38205>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38147] add macro for __builtin_unreachable

2019-09-19 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

> If you care of _PyLong_Copy() performance, you should somehow manually inline 
> _PyLong_New() inside _PyLong_Copy().

It doesn't solve this:

> We could add a function that bypass that check, but in LTO build 
> PyObject_MALLOC() is inlined into _PyLong_New() and it also checks the size. 
> Adding Py_ASSUME((size_t)size <= MAX_LONG_DIGITS) allows to bypass both 
> checks.

Here's an example: 
https://github.com/sir-sigurd/cpython/commit/c8699d0c614a18d558216ae7d432107147c95c28.

I'm attaching some disassembly from this example compiled with LTO to 
demonstrate how the proposed macro affects the generated code.

--
Added file: https://bugs.python.org/file48614/disasm.txt

___
Python tracker 
<https://bugs.python.org/issue38147>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue35696] remove unnecessary operation in long_compare()

2019-09-20 Thread Sergey Fedoseev

Sergey Fedoseev  added the comment:

These warnings are caused by 
https://github.com/python/cpython/commit/c6734ee7c55add5fdc2c821729ed5f67e237a096.

I'd fix them, but I'm not sure if we are going to restore CHECK_SMALL_INT() 
¯\_(ツ)_/¯

--
nosy: +sir-sigurd

___
Python tracker 
<https://bugs.python.org/issue35696>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38517] functools.cached_property should support partial functions and partialmethod's

2019-10-22 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

issue38524 is related.

--
nosy: +sir-sigurd

___
Python tracker 
<https://bugs.python.org/issue38517>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38524] functools.cached_property is not supported for setattr

2019-10-22 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
nosy: +sir-sigurd

___
Python tracker 
<https://bugs.python.org/issue38524>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue15907] move doctest test-data files into a subdirectory of Lib/test

2021-05-06 Thread Sergey Polischouck


Change by Sergey Polischouck :


--
nosy: +polischouckserg

___
Python tracker 
<https://bugs.python.org/issue15907>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue44071] Syntax error in Python3 documentation

2021-05-07 Thread Sergey Maslyakov


New submission from Sergey Maslyakov :

https://docs.python.org/3/library/subprocess.html#subprocess.check_output

The code sample seems to have a misplaced closing round bracket. It should go 
after "stdout":

```
run(..., check=True, stdout=PIPE).stdout
```

--
assignee: docs@python
components: Documentation
messages: 393222
nosy: docs@python, evolvah
priority: normal
severity: normal
status: open
title: Syntax error in Python3 documentation
versions: Python 3.6

___
Python tracker 
<https://bugs.python.org/issue44071>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue44071] Syntax error in Python3 documentation

2021-05-07 Thread Sergey Maslyakov


Sergey Maslyakov  added the comment:

Thank you, Dennis! I was wrong. Closing the ticket.
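
For the record, a minimal standalone sketch (not taken from the report) of why 
the documented form is right: .stdout is an attribute of the CompletedProcess 
object returned by run(), so the closing bracket belongs before it:

```python
import sys
from subprocess import PIPE, run

# run() returns a CompletedProcess; its .stdout attribute holds the captured output
out = run([sys.executable, "-c", "print('hello')"], check=True, stdout=PIPE).stdout
print(out)  # b'hello\n' (b'hello\r\n' on Windows)
```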

--
resolution:  -> not a bug
stage:  -> resolved
status: open -> closed

___
Python tracker 
<https://bugs.python.org/issue44071>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue39759] os.getenv documentation is misleading

2021-07-02 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
nosy: +sir-sigurd

___
Python tracker 
<https://bugs.python.org/issue39759>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue44022] urllib http client possible infinite loop on a 100 Continue response

2021-07-05 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
nosy: +sir-sigurd
nosy_count: 8.0 -> 9.0
pull_requests: +25593
pull_request: https://github.com/python/cpython/pull/27033

___
Python tracker 
<https://bugs.python.org/issue44022>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue45818] socketserver.BaseRequestHandler inherited class

2021-11-16 Thread Sergey M.


New submission from Sergey M. :

Due to 
```python
try:
    self.handle()
finally:
    self.finish()
```
construct in the `socketserver.BaseRequestHandler.__init__()` method, 
subclasses with an overridden `__init__()` method may suffer from incomplete 
initialization.
For example, in the following snippet
```python
def __init__(self, request, client_address, server):
    super().__init__(request, client_address, server)
    self.foo = 1
```
in some cases the code after the `super()` call will not be executed.

This is an MWE of a server with a partially initialized Handler class:
```python
from socketserver import UnixStreamServer, StreamRequestHandler, ForkingMixIn


class Handler(StreamRequestHandler):
    def __init__(self, request, client_address, server):
        super().__init__(request, client_address, server)
        self.foo = 1

    def handle(self):
        print(self.foo)


class ThreadedUnixStreamServer(ForkingMixIn, UnixStreamServer):
    pass


with ThreadedUnixStreamServer("/tmp/test.socket", Handler) as server:
    server.serve_forever()
```
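
One common workaround (my suggestion, not something from the report) is to set 
the subclass's state before delegating to the base `__init__()`, since 
`BaseRequestHandler.__init__()` is what ends up calling `handle()`:

```python
from socketserver import StreamRequestHandler


class Handler(StreamRequestHandler):
    def __init__(self, request, client_address, server):
        self.foo = 1  # initialize state first ...
        super().__init__(request, client_address, server)  # ... handle() runs in here

    def handle(self):
        print(self.foo)  # self.foo is guaranteed to exist now
```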

--
components: Library (Lib)
messages: 406413
nosy: matsievskiysv
priority: normal
severity: normal
status: open
title: socketserver.BaseRequestHandler inherited class
type: behavior
versions: Python 3.9

___
Python tracker 
<https://bugs.python.org/issue45818>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue37347] Reference-counting problem in sqlite

2019-11-28 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
pull_requests: +16894
pull_request: https://github.com/python/cpython/pull/17413

___
Python tracker 
<https://bugs.python.org/issue37347>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue27961] remove support for platforms without "long long"

2019-12-09 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
pull_requests: +17021
pull_request: https://github.com/python/cpython/pull/17539

___
Python tracker 
<https://bugs.python.org/issue27961>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue40302] Add pycore_byteswap.h internal header file with _Py_bswap32() function

2020-05-10 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
nosy: +sir-sigurd
nosy_count: 3.0 -> 4.0
pull_requests: +19338
pull_request: https://github.com/python/cpython/pull/15659

___
Python tracker 
<https://bugs.python.org/issue40302>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue41141] remove unneeded handling of '.' and '..' from pathlib.Path.iterdir()

2020-06-27 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

Currently pathlib.Path.iterdir() filters out '.' and '..'. 
This is unneeded, since pathlib.Path.iterdir() uses os.listdir() under the 
hood, which returns neither '.' nor '..'.
https://docs.python.org/3/library/os.html#os.listdir
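
A quick check of that claim (a standalone snippet, separate from the patch):

```python
import os
from pathlib import Path

entries = os.listdir(".")
assert "." not in entries and ".." not in entries  # os.listdir() never returns them
assert all(p.name not in (".", "..") for p in Path(".").iterdir())
print("no '.' or '..' entries seen")
```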

--
components: Library (Lib)
messages: 372465
nosy: sir-sigurd
priority: normal
severity: normal
status: open
title: remove unneeded handling of '.' and '..' from pathlib.Path.iterdir()

___
Python tracker 
<https://bugs.python.org/issue41141>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue41141] remove unneeded handling of '.' and '..' from pathlib.Path.iterdir()

2020-06-27 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
keywords: +patch
pull_requests: +20336
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/21179

___
Python tracker 
<https://bugs.python.org/issue41141>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue41415] duplicated signature of dataclass in help()

2020-07-27 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

In [191]: import dataclasses, pydoc

In [192]: @dataclass
     ...: class C:
     ...:     pass
     ...:

In [193]: print(pydoc.render_doc(C))
Python Library Documentation: class C in module __main__

class C(builtins.object)
 |  C() -> None
 |
 |  C()
 |
 |  Methods defined here:
 |
 


It's duplicated because a dataclass's __doc__ defaults to its signature:
 
In [195]: C.__doc__  
Out[195]: 'C()'
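
For comparison, giving the class an explicit docstring avoids the duplication, 
since @dataclass only fills in __doc__ when the class does not already have 
one (a standalone sketch, not part of the session above):

```python
import pydoc
from dataclasses import dataclass


@dataclass
class D:
    """A documented dataclass."""


print(D.__doc__)            # 'A documented dataclass.' -- not the generated 'D()'
print(pydoc.render_doc(D))  # the signature now appears only once in the help text
```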

--
components: Library (Lib)
messages: 374461
nosy: sir-sigurd
priority: normal
severity: normal
status: open
title: duplicated signature of dataclass in help()
type: behavior
versions: Python 3.10

___
Python tracker 
<https://bugs.python.org/issue41415>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue41415] duplicated signature of dataclass in help()

2020-07-27 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
keywords: +patch
pull_requests: +20793
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/21652

___
Python tracker 
<https://bugs.python.org/issue41415>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue26834] Add truncated SHA512/224 and SHA512/256

2020-08-07 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
nosy: +sir-sigurd

___
Python tracker 
<https://bugs.python.org/issue26834>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue35276] Document thread safety

2020-08-24 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
nosy: +sir-sigurd

___
Python tracker 
<https://bugs.python.org/issue35276>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue33239] tempfile module: functions with the 'buffering' option are incorrectly documented

2020-09-13 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
nosy: +sir-sigurd
nosy_count: 4.0 -> 5.0
pull_requests: +21279
pull_request: https://github.com/python/cpython/pull/21763

___
Python tracker 
<https://bugs.python.org/issue33239>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue38905] venv python reports wrong sys.executable in a subprocess on Windows

2020-10-08 Thread Sergey Nudnou


Sergey Nudnou  added the comment:

Hello,

I've just run into a related issue. I have a Python script which starts 
another Python script using subprocess.Popen(). The parent script gets the 
pid of the child and monitors its activity through a database by this pid. 
The child script updates its activity in the database using the pid it got 
from os.getpid().

Both scripts live in a virtual environment.

It worked fine in Python 3.5 and stopped working after migration to Python 3.8.

My base Python location: D:\Python\Python38\pythonw.exe
Virtual Environment: D:\test\venv\Scripts\pythonw.exe

I have realized that when I run the following from a command prompt:
D:\test\venv\Scripts\pythonw.exe test.py

two processes with different PIDs are created:
PID:   97040
Parent PID:12004  (cmd.exe)
Command Line:  D:\test\venv\Scripts\pythonw.exe test.py

PID:   85548
Parent PID:97040  (pythonw.exe)
Command Line:  D:\Python\Python38\pythonw.exe test.py

It is definitely a regression, and will potentially break a lot of applications 
expecting a child Python process to be a direct descendant of its parent.

Also, it is a waste of system resources.

--
nosy: +nsmcan

___
Python tracker 
<https://bugs.python.org/issue38905>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue14313] zipfile does not unpack files from archive (files extracted have zero length)

2012-03-14 Thread Sergey Dorofeev

New submission from Sergey Dorofeev :

unzip does extract the files, but zipfile does not.
Works fine with Python 2.7 but fails with Python 3.2.2.
Tested on Solaris 11 Express and Windows XP.

>>> import zipfile
>>> zipfile.ZipFile("test.zip")

>>> z=_
>>> z.namelist
>
>>> z.namelist()
['19A7B5A4.PKT']
>>> z.read('19A7B5A4.PKT')
b''

--
components: Library (Lib)
files: test.zip
messages: 155854
nosy: fidoman
priority: normal
severity: normal
status: open
title: zipfile does not unpack files from archive (files extracted have zero 
length)
type: behavior
versions: Python 3.2
Added file: http://bugs.python.org/file24857/test.zip

___
Python tracker 
<http://bugs.python.org/issue14313>
___
___
Python-bugs-list mailing list
Unsubscribe: 
http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue31862] Port the standard library to PEP 489 multiphase initialization

2019-05-22 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
pull_requests: +13420

___
Python tracker 
<https://bugs.python.org/issue31862>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue34488] improve performance of BytesIO.writelines() by avoiding creation of unused PyLongs

2019-08-02 Thread Sergey Fedoseev


Sergey Fedoseev  added the comment:

`BytesIO.write()` and `BytesIO.writelines()` are independent of each other.

--

___
Python tracker 
<https://bugs.python.org/issue34488>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue37802] micro-optimization of PyLong_FromSize_t()

2019-08-09 Thread Sergey Fedoseev


New submission from Sergey Fedoseev :

Currently PyLong_FromSize_t() uses PyLong_FromLong() for values < PyLong_BASE. 
That's suboptimal because PyLong_FromLong() needs to handle the sign. Removing 
the PyLong_FromLong() call and handling small ints directly in 
PyLong_FromSize_t() makes it faster:

$ python -m perf timeit -s "from itertools import repeat; _len = repeat(None, 
2).__length_hint__" "_len()" --compare-to=../cpython-master/venv/bin/python 
--duplicate=1
/home/sergey/tmp/cpython-master/venv/bin/python: ..... 18.7 ns 
+- 0.3 ns
/home/sergey/tmp/cpython-dev/venv/bin/python: . 16.7 ns +- 
0.1 ns
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 18.7 ns +- 
0.3 ns -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 16.7 ns +- 0.1 ns: 
1.12x faster (-10%)

$ python -m perf timeit -s "from itertools import repeat; _len = repeat(None, 
2**10).__length_hint__" "_len()" --compare-to=../cpython-master/venv/bin/python 
--duplicate=1
/home/sergey/tmp/cpython-master/venv/bin/python: ..... 26.2 ns 
+- 0.0 ns
/home/sergey/tmp/cpython-dev/venv/bin/python: ..... 25.0 ns +- 
0.7 ns
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 26.2 ns +- 
0.0 ns -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 25.0 ns +- 0.7 ns: 
1.05x faster (-5%)

$ python -m perf timeit -s "from itertools import repeat; _len = repeat(None, 
2**30).__length_hint__" "_len()" --compare-to=../cpython-master/venv/bin/python 
--duplicate=1
/home/sergey/tmp/cpython-master/venv/bin/python: . 25.6 ns 
+- 0.1 ns
/home/sergey/tmp/cpython-dev/venv/bin/python: . 25.6 ns +- 
0.0 ns
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 25.6 ns +- 
0.1 ns -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 25.6 ns +- 0.0 ns: 
1.00x faster (-0%)


This change makes PyLong_FromSize_t() consistently faster than 
PyLong_FromSsize_t(). So it might make sense to replace PyLong_FromSsize_t() 
with PyLong_FromSize_t() in __length_hint__() implementations and other similar 
cases. For example:

$ python -m perf timeit -s "_len = iter(bytes(2)).__length_hint__" "_len()" 
--compare-to=../cpython-master/venv/bin/python --duplicate=1
/home/sergey/tmp/cpython-master/venv/bin/python: . 19.4 ns 
+- 0.3 ns
/home/sergey/tmp/cpython-dev/venv/bin/python: . 17.3 ns +- 
0.1 ns
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 19.4 ns +- 
0.3 ns -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 17.3 ns +- 0.1 ns: 
1.12x faster (-11%)

$ python -m perf timeit -s "_len = iter(bytes(2**10)).__length_hint__" "_len()" 
--compare-to=../cpython-master/venv/bin/python --duplicate=1
/home/sergey/tmp/cpython-master/venv/bin/python: . 26.3 ns 
+- 0.1 ns
/home/sergey/tmp/cpython-dev/venv/bin/python: . 25.3 ns +- 
0.2 ns
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 26.3 ns +- 
0.1 ns -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 25.3 ns +- 0.2 ns: 
1.04x faster (-4%)

$ python -m perf timeit -s "_len = iter(bytes(2**30)).__length_hint__" "_len()" 
--compare-to=../cpython-master/venv/bin/python --duplicate=1
/home/sergey/tmp/cpython-master/venv/bin/python: ..... 27.6 ns 
+- 0.1 ns
/home/sergey/tmp/cpython-dev/venv/bin/python: . 26.0 ns +- 
0.1 ns
Mean +- std dev: [/home/sergey/tmp/cpython-master/venv/bin/python] 27.6 ns +- 
0.1 ns -> [/home/sergey/tmp/cpython-dev/venv/bin/python] 26.0 ns +- 0.1 ns: 
1.06x faster (-6%)
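
The __length_hint__() calls being timed above can also be reproduced as a 
plain snippet (standalone, using the public operator.length_hint() helper as 
well):

```python
import operator
from itertools import repeat

r = repeat(None, 2**10)
print(r.__length_hint__())                       # 1024
print(operator.length_hint(iter(bytes(2**10))))  # 1024 for the bytes iterator too
```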

--
components: Interpreter Core
messages: 349285
nosy: sir-sigurd
priority: normal
severity: normal
status: open
title: micro-optimization of PyLong_FromSize_t()
type: performance
versions: Python 3.9

___
Python tracker 
<https://bugs.python.org/issue37802>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue37802] micro-optimization of PyLong_FromSize_t()

2019-08-09 Thread Sergey Fedoseev


Change by Sergey Fedoseev :


--
keywords: +patch
pull_requests: +14924
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/15192

___
Python tracker 
<https://bugs.python.org/issue37802>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue30856] unittest.TestResult.addSubTest should be called immediately after subtest finishes

2017-07-05 Thread Sergey Fedoseev

New submission from Sergey Fedoseev:

Currently TestResult.addSubTest() is called just before TestResult.stopTest(), 
but docs says that addSubTest is "Called when a subtest finishes". IMO that 
means that it will be called immediately after subtest finishes, but not after 
indefinite time.

Test is attached.
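
A rough sketch of the ordering in question (my own reconstruction; the 
attached test_subtest.py is not reproduced here):

```python
import unittest


class Recording(unittest.TestResult):
    def addSubTest(self, test, subtest, outcome):
        print("addSubTest called")
        super().addSubTest(test, subtest, outcome)


class T(unittest.TestCase):
    def test(self):
        with self.subTest(i=0):
            pass
        print("subtest block finished")


T("test").run(Recording())
# On the versions described above this prints "subtest block finished" first,
# because addSubTest is only invoked at the end of the test run, just before
# stopTest, rather than when the subtest itself finishes.
```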

--
files: test_subtest.py
messages: 297756
nosy: sir-sigurd
priority: normal
severity: normal
status: open
title: unittest.TestResult.addSubTest should be called immediately after 
subtest finishes
type: behavior
Added file: http://bugs.python.org/file46990/test_subtest.py

___
Python tracker 
<http://bugs.python.org/issue30856>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue30856] unittest.TestResult.addSubTest should be called immediately after subtest finishes

2017-07-05 Thread Sergey Fedoseev

Changes by Sergey Fedoseev :


--
components: +Library (Lib)

___
Python tracker 
<http://bugs.python.org/issue30856>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue31059] asyncio.StreamReader.read hangs if n<0

2017-07-27 Thread Sergey Kostyuk

New submission from Sergey Kostyuk:

Good day

Maybe I misunderstood something, but I fail to fetch any data by calling 
asyncio.StreamReader.read if `n` is less than zero (or left at the default). 
It just hangs in the loop forever (see line 614 of asyncio/streams.py: 
https://github.com/python/cpython/blob/3e56ff0/Lib/asyncio/streams.py#L614). 
If `n` is any positive value, the coroutine works as expected and returns if 
there is any data in the socket buffer, even if the available data is less 
than `n` bytes.

Expected behavior: collect all data from the buffer and return
Current behavior: hangs in the loop forever if n < 0

My usage sample: https://git.io/v7nJq
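
The behaviour can be seen without sockets by feeding a bare StreamReader (a 
standalone sketch under my own assumptions, not the git.io sample above): 
read(-1) only returns once EOF has been signalled, while read(n) returns as 
soon as some data is buffered.

```python
import asyncio


async def main():
    reader = asyncio.StreamReader()
    reader.feed_data(b"hello")
    print(await reader.read(5))    # b'hello' -- returns as soon as data is buffered
    reader.feed_data(b"world")
    reader.feed_eof()              # without this, read(-1) below would wait forever
    print(await reader.read(-1))   # b'world' -- read(-1) collects everything up to EOF


asyncio.run(main())
```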

--
components: asyncio
messages: 299332
nosy: Sergey Kostyuk, yselivanov
priority: normal
severity: normal
status: open
title: asyncio.StreamReader.read hangs if n<0
type: behavior
versions: Python 3.5

___
Python tracker 
<http://bugs.python.org/issue31059>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue31108] add __contains__ for list_iterator (and others) for better performance

2017-08-02 Thread Sergey Fedoseev

Changes by Sergey Fedoseev :


--
pull_requests: +3025

___
Python tracker 
<http://bugs.python.org/issue31108>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue31108] add __contains__ for list_iterator (and others) for better performance

2017-08-02 Thread Sergey Fedoseev

New submission from Sergey Fedoseev:

> python -mtimeit -s "l = list(range(10))" "l[-1] in l"
1000 loops, best of 3: 1.34 msec per loop
> python -mtimeit -s "l = list(range(10))" "l[-1] in iter(l)"
1000 loops, best of 3: 1.59 msec per loop
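
What the two timings compare, written out as a plain snippet (standalone, just 
restating the benchmark):

```python
l = list(range(10))
print(l[-1] in l)        # uses list.__contains__ directly
print(l[-1] in iter(l))  # the iterator has no __contains__, so the generic
                         # protocol walks it item by item
```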

--
messages: 299666
nosy: sir-sigurd
priority: normal
severity: normal
status: open
title: add __contains__ for list_iterator (and others) for better performance
type: performance

___
Python tracker 
<http://bugs.python.org/issue31108>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com


