On Wed, Dec 11, 2019 at 10:30:15PM -0500, Richard Damon wrote:

> I will say that in my many years of
> programming experience I can't think of any great cases where a language,
> as part of the language definition, limited the 'size' of a program to
> good effect,

Good thing that's not what the PEP proposes then :-)

The PEP talks about limits in the CPython implementation, not hard 
limits on all Python compilers/interpreters.


> and generally such limits relegate a language into being
> seen as a 'toy language'. 

The designers of C recognised that, in practice, compilers will have 
limits. Rather than demanding "NO ARBITRARY LIMITS!!!", they specified 
*minimum* translation limits that compliant compilers must support. 
Often quite low limits.

Actual compilers often impose their own limits:

https://gcc.gnu.org/onlinedocs/cpp/Implementation-limits.html

https://www.cs.auckland.ac.nz/references/unix/digital/AQTLTBTE/DOCU_012.HTM
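
CPython is no different, by the way. A minimal sketch (the limit of 20 
statically nested blocks is a CPython implementation detail, true as of 
3.8, not something the language reference promises):

    # Build a module with 25 nested "for" loops. CPython's compiler
    # caps statically nested blocks at 20, so it rejects this source.
    src = ""
    for depth in range(25):
        src += "    " * depth + "for v%d in range(1):\n" % depth
    src += "    " * 25 + "pass\n"

    try:
        compile(src, "<generated>", "exec")
    except SyntaxError as err:
        print(err)  # "too many statically nested blocks"

Few people ever bump into that one in practice.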

So if Mark's proposal relegates Python to a "toy language", we'll be in 
good company with other "toys" that have implementation, or even 
language, limits:


https://stackoverflow.com/questions/5689798/why-does-java-limit-the-size-of-a-method-to-65535-byte

https://web.archive.org/web/20160304023522/http://programmers.stackexchange.com/questions/108699/why-does-javas-collection-size-return-an-int

https://www.sqlite.org/limits.html


(If you read only one of those five links, please read the last.)
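
(On the Java link: CPython's analogue of Java's 65535-byte method limit 
is easy to measure on any code object. A sketch for the curious; the 
PEP's proposed per-code-object limit of one million bytecode 
instructions is far more generous:)

    def example(a, b):
        # An ordinary function compiles to a handful of bytecode
        # instructions, nowhere near any proposed limit.
        return a * b + a - b

    # co_code is the raw bytecode of the compiled function.
    print("example() is %d bytes of bytecode"
          % len(example.__code__.co_code))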


> The biggest issue is that computers are
> growing more powerful every day, and programs follow in getting bigger,
> so any limit that we think of as more than sufficient soon becomes too
> small (No one will need more than 640k of RAM).

The beauty of this proposal is that, since it's an implementation 
limit, not a law of nature, if and when computers get more powerful and 
machines routinely have multi-zettabyte memories *wink*, we can always 
update the implementation.

I'm not entirely being facetious here; there's a serious point. Unlike 
languages such as C and Java, where changes have to go through a long, 
slow, difficult standardisation process, Python's development process 
is much more agile. If the PEP is accepted, that doesn't mean we're 
locked into that decision for life. Relaxing limits in the future 
doesn't break backwards compatibility.

"What if computers get more powerful? Our limits will be obsolete!!!" 
Naturally. Do you still expect to be using Python 3.9 in ten years' 
time with your fancy new uber-hyper-quantum ten-thousand-gigabyte 
computer?
Probably not. As hardware grows, and our needs grow, so can the 
hypothetical limits.

What would be valuable are *concrete*, actual (not theoretical) 
examples of where Mark's proposed limits are too low, so that we can 
get a realistic view of where the potential tradeoffs lie (a 
test-generation sketch follows the list):

* lose this much functionality (code that did run, or might have run, 
  but that won't run under the PEP)

* in order to gain this much in safety, maintainability, efficiency.
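
Such cases are also cheap to hunt for, because stress tests can be 
generated mechanically. A sketch, assuming the PEP's proposed limit of 
one million lines per module:

    # Generate a module right at the proposed one-million-line limit
    # and check whether today's CPython still compiles it.
    src = "\n".join("x%d = %d" % (i, i) for i in range(1_000_000))

    code = compile(src, "<generated>", "exec")
    print("compiled %d lines fine" % (src.count("\n") + 1))

Real code bases that exceed such limits today (probably 
machine-generated code) would be exactly the evidence this discussion 
needs.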


And before people jump down my throat again, I've already said -- on 
multiple occasions -- that the onus is on Mark to demonstrate the 
plausibility of any such gains.


Thank you for reading :-)



-- 
Steven