On 11/12/2019 3:56 pm, Guido van Rossum wrote:
On Wed, Dec 11, 2019 at 2:14 AM Mark Shannon <m...@hotpy.org> wrote:
If the status quo were the result of a considered decision, then it would
of course need considerable justification.
If, as is the case here, the status quo is a result of historical
accident and/or implementation details, then I think a weaker
justification is OK.
Whoa. The lack of limits in the status quo (no limits on various things
except indirectly, through available memory) is most definitely the
result of an intentional decision. "No arbitrary limits" was part of
Python's initial design philosophy. We didn't always succeed (parse tree
depth and call recursion depth come to mind) but that was definitely the
philosophy. It was in contrast to other languages that did have
arbitrary limits (e.g. Pascal's 255-char limit on strings, or C's
machine-dependent integer size), and in several cases the implementation
went to great lengths to avoid limits (e.g. we could have avoided a
lot of dynamic memory (re)allocation if we'd limited line lengths or
file sizes).
Sure, there are few *designed-in* limits in Python at the moment. But in
reality Python has lots of limits.
Because CPython is the reference implementation, Python is what CPython
does; and CPython has lots of limits.
Classes
-------
One suggested way in which a million classes might be created was using
namedtuple:
>>> from collections import namedtuple
>>> l = [namedtuple(f"nt{i}", "a,b,c,d,e,f,g,h,i,j")
...      for i in range(1_000_000)]
This takes 6 GB of resident memory on my machine (3.9 alpha).
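For anyone wanting to reproduce the figure, a rough measurement can be made
with the standard resource module (POSIX-only; the exact number will vary by
platform and Python build):

    import resource
    from collections import namedtuple

    # Build one million distinct namedtuple classes, then report peak
    # resident memory. ru_maxrss is reported in kilobytes on Linux.
    classes = [namedtuple(f"nt{i}", "a,b,c,d,e,f,g,h,i,j")
               for i in range(1_000_000)]
    peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print(f"peak resident memory: {peak_kb / (1024 * 1024):.1f} GB")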
Most machines have roughly 4 GB of RAM per core (if you buy a bigger
machine you get more CPUs and more RAM, in roughly that proportion).
So RAM effectively limits the number of classes to less than one million
already (assuming you want to use your cores efficiently).
Imposing a limit on the number of classes, and using it to reduce the memory
footprint of each object, would allow more classes in practice, and would
allow a lot more objects in more sensible programs.
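As a sketch of the general idea (illustrative only, not a description of any
actual CPython change): with classes capped at one million, a class reference
fits in a 20-bit index into a global class table, rather than a full 64-bit
pointer stored in every object header.

    # Hypothetical sketch: bits needed for a class id under a 1,000,000 limit.
    LIMIT = 1_000_000
    bits_needed = (LIMIT - 1).bit_length()   # 20 bits covers ids 0..999,999
    print(bits_needed)        # -> 20
    print(2 ** bits_needed)   # -> 1,048,576 possible class ids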
Instructions per code-object
----------------------------
CPython will crash if the number of instructions is in the 2**31 to 2**32
range, as the compiler treats addresses as unsigned but the interpreter
treats them as signed.
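The mismatch can be shown in isolation (an illustrative sketch of the
signed/unsigned confusion, not the actual compiler code): an offset written
as an unsigned 32-bit value but read back as a signed one goes negative once
it passes 2**31.

    import struct

    # Write an offset as unsigned 32-bit, then read it back as signed 32-bit.
    offset = 2**31 + 10
    packed = struct.pack("<I", offset)        # "compiler" view: unsigned
    reread = struct.unpack("<i", packed)[0]   # "interpreter" view: signed
    print(offset, reread)                     # 2147483658 -2147483638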
Obviously this can be fixed, but it is an example of the sort of lurking bug
that implicit limits can cause. These things are very expensive to test, as
you need a machine with hundreds of gigabytes of memory.
Explicit limits are much easier to test: does code just over the limit fail
in the expected fashion, and does code just under the limit work correctly?
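A test against an explicit, documented limit only needs two small cases.
The names MAX_INSTRUCTIONS and InstructionLimitError below are hypothetical,
used purely for illustration, not an existing CPython API:

    MAX_INSTRUCTIONS = 1_000_000   # hypothetical explicit limit

    class InstructionLimitError(Exception):
        pass

    def check_instruction_count(n):
        # Reject code objects over the limit with a well-defined error.
        if n > MAX_INSTRUCTIONS:
            raise InstructionLimitError(f"{n} > {MAX_INSTRUCTIONS}")

    check_instruction_count(MAX_INSTRUCTIONS)          # at the limit: ok
    try:
        check_instruction_count(MAX_INSTRUCTIONS + 1)  # just over: fails predictably
    except InstructionLimitError as e:
        print("rejected as expected:", e)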
What I want is to allow more efficient use of resources without
inconveniently low or unspecified limits. There will always be some
limits on finite machines. If they aren't specified, they still exist;
we just don't know what they are or how they will manifest themselves.
Cheers,
Mark.
You have an extreme need to justify why we should change now. "An
infinite number of potential optimizations" does not cut it.
--
--Guido van Rossum (python.org/~guido)