Re: RFC: Proposal: Deterministic Object Destruction
On Sunday, 4 March 2018 04:23:07 UTC, Steven D'Aprano wrote:
> On Sat, 03 Mar 2018 18:19:37 -0800, Ooomzay wrote:
>
> >> def function():
> >>     x = open_resource()
> >>     process(x)
> >>     # and we're done with x now, but too lazy to explicitly close it
> >>     sleep(1)  # Simulate some more work. Lots of work.
> >>     return
> >>     # and finally x is closed (2.8 hours after you finished using it)
> >>
> >> The answer in C++ is "well don't do that then". The answer in Python
> >> is, "don't be so lazy, just use a with statement".
> >
> > The answer in C++ would be to say "don't be so lazy, just put the x
> > manipulation in a function or sub-block".
>
> Right -- so you still have to think about resource management. The
> premise of this proposal is that RAII means you don't have to think about
> resource management, it just happens, but that's not really the case.

That is not my main premise, which is that RAII is a more elegant (no
specialised syntax at all) and robust way to manage resources.

Please consider the case of a composite resource: you need to implement
__enter__, __exit__ and track the open/closed state at every level in
your component hierarchy - even if some levels hold no resources
directly.

This is burdensome, breaks encapsulation, breaks invariance and is error
prone ...very unpythonic.

> > The answer with Python + this
> > PEP would be "don't be so lazy, just put the x manipulation in a function
> > or explicitly del x" ...no new syntax.
>
> Sure -- but it doesn't gain you anything we don't already have.

See above.

> It imposes enormous burdens on the maintainers of at least five
> interpreters (CPython, Stackless, Jython, IronPython, PyPy) all of which
> will need to be re-written to have RAII semantics guaranteed;

Yes. This is a substantial issue that will almost certainly see it
rejected by HWCNBN on political, rather than linguistic, grounds.

My PEP is about improving the linguistic integrity of the language and
its fitness for resource management.
> it will
> probably have significant performance costs; and at the end of the day,
> the benefit over using a with statement is minor.
>
> (Actually, I'm not convinced that there is *any* benefit. If anything, I
> think it will be a reliability regression -- see below.)
>
> >> If you want deterministic closing of resources, with statements are the
> >> way to do it.
> >>
> >> def function():
> >>     with open_resource() as x:
> >>         process(x)
> >>     # and x is guaranteed to be closed
> >
> > What a palava!
>
> I don't know that word, and neither do any of my dictionaries. I think
> the word you might mean is "pelaver"?

I don't know that word, and neither do any of my dictionaries. I think
the word you might mean is "palaver"?

Anyway palava/pelaver/palaver/palavra/palabra derives from the word for
"word", but in England it is often used idiomatically to mean a surfeit
of words, or even more generally, a surfeit of effort. I intend it in
both senses: the unnecessary addition of the words "with", "as",
"__enter__" & "__exit__" to the language, and the need to implement the
latter two methods all over the place.

> In any case, you might not like with statements, but I think they're
> infinitely better than:
>
> def meaningless_function_that_exists_only_to_manage_resource():
>     x = open_resource()
>     process(x)
>
> def function():
>     meaningless_function_that_exists_only_to_manage_resource()
>     sleep(1)  # simulate a long-running function

Why would you prefer a new construct? Functions _are_ Python's scoping
context! Giving one a pejorative name does not change that.

> In other words, your solutions are just as much manual resource
> management as the with statement. The only differences are:
>
> - instead of explicitly using a dedicated syntax designed for
>   resource management, you're implicitly using scope behaviour;

Excellent: with the benefit of automatic, exception safe, destruction of
resources, including composite resources.
> - the with block is equivalent to a try...finally, and so it is
>   guaranteed to close the resource even if an exception occurs;
>   your solution isn't.
>
> If process(x) creates a non-local reference to x, and then raises an
> exception, and that exception is caught elsewhere, x will not go out of
> scope and won't be closed.
>
> A regression in the reliability of the code.

This PEP does not affect existing code. Peeps who are familiar with RAII
understand that creating a global reference to an RAII resource is
explicitly saying "I want this kept open at global scope" and that is
the behaviour that they will be guaranteed.

> (I'm moderately confident that this failure of RAII to deliver what it
> promises is possible in C++ too.)

(Well I bet the entire VOIP stack, AAA and accounting modules in a
certain 9 satellite network's base-stations against you)

--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On Sunday, 4 March 2018 04:23:07 UTC, Steven D'Aprano wrote:
> [...]
> [This PEP] imposes enormous burdens on the maintainers of at least five
> interpreters (CPython, Stackless, Jython, IronPython, PyPy) all of which
> will need to be re-written to have RAII semantics guaranteed;

Not so: CPython, the reference interpreter, already implements the
required behaviour, as mentioned in the PEP.

--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On Sunday, 4 March 2018 03:16:31 UTC, Paul Rubin wrote:
> Chris Angelico writes:
> > Yep, cool. Now do that with all of your smart pointers being on the
> > heap too. You are not allowed to use ANY stack objects. ANY. Got it?
>
> That's both overconstraining and not even that big a problem the way you
> phrase it.
>
> 1) Python has both the "with" statement and try/finally. Both of these
> run code at the exit from a syntactically defined scope. So they are
> like stack allocation in C++, where a deallocator can run when the scope
> exits.
>
> 2) Even with no scope-based de-allocation, it's common to put smart
> pointers into containers like lists and vectors. So you could have a
> unique_ptr to a filestream object, and stash the unique_ptr someplace as
> a vector element, where the vector itself could be part of some even
> more deeply nested structure. At some point, the big structure gets
> deleted (maybe through a manually-executed delete statement). When that
> happens, if the nested structures are all standard containers full of
> unique_ptrs, the top-level finalizer will end up traversing the entire
> tree and freeing up the file handles and whatever else might be in
> there.
>
> It occurs to me, maybe #2 above is closer to what the OP is really after
> in Python.

Yep. C++ smart pointers are a good analogue to Python references for
purposes of this PEP.

> I guess it's doable, but refcounts don't seem like the right
> way.

Well refcounts are definitely "doable": this is how the reference Python
implementation, CPython, currently manages to comply with this PEP and
can therefore be used for RAII.

This PEP is an attempt to _guarantee_ this behaviour and make the
elegance of RAII available to all pythonistas that want it. Without this
guarantee Python is not attractive to applications that must manage
non-trivial resources reliably.
Aside: I once read, somewhere that must have seemed authoritative at the
time, that CPython _guarantees_ to continue to behave like this - but now
that the subject is topical again I can find no trace of this guarantee.

--
https://mail.python.org/mailman/listinfo/python-list
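Paul's point (2) has a direct CPython analogue: dropping the last reference to a nested structure finalises everything inside it immediately. A minimal sketch (the `Handle` class is hypothetical, and the immediate teardown relies on CPython's refcounting, which is an implementation detail):

```python
closed = []

class Handle:
    """Stands in for a file handle; records when it is finalised."""
    def __init__(self, name):
        self.name = name

    def __del__(self):
        closed.append(self.name)

# A deeply nested structure of plain containers holding "handles".
tree = {"level1": [Handle("a"), {"level2": [Handle("b"), Handle("c")]}]}

del tree  # in CPython the whole tree is torn down right here
assert sorted(closed) == ["a", "b", "c"]
```

This is the behaviour the OP wants guaranteed; on a non-refcounting implementation the `__del__` calls would instead happen at some later garbage collection.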
Re: RFC: Proposal: Deterministic Object Destruction
On Mon, Mar 5, 2018 at 12:26 AM, Ooomzay wrote:
> Well refcounts are definitely "doable": This is how the reference python
> implementation, CPython, currently manages to comply with this PEP and
> can therefore be used for RAII.
>
> This PEP is an attempt to _guarantee_ this behaviour and make the
> elegance of RAII available to all pythonistas that want it. Without this
> guarantee python is not attractive to applications that must manage
> non-trivial resources reliably.
>
> Aside: I once read somewhere that must have seemed authoritative at the
> time, that CPython _guarantees_ to continue to behave like this - but
> now the subject is topical again I can find no trace of this guarantee.

That's because there is no such guarantee. In fact, you can probably find
places in the docs where it is specifically stated that you cannot depend
on __del__ for this.

You still haven't said how you're going to cope with reference cycles -
or are you expecting every single decref to do a full cycle check?

ChrisA
--
https://mail.python.org/mailman/listinfo/python-list
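The cycle problem Chris raises is easy to demonstrate: even in CPython, an object in a reference cycle is not finalised when its name is deleted, only when the cycle collector runs. A minimal sketch (CPython-specific; `Node` is a made-up class for illustration):

```python
import gc

finalized = []

class Node:
    def __del__(self):
        finalized.append("node")

gc.disable()            # make the demonstration deterministic
a = Node()
a.self_ref = a          # reference cycle: the object keeps itself alive
del a                   # refcount never reaches zero...
assert finalized == []  # ...so __del__ has NOT run here

gc.collect()            # the cycle collector finds the unreachable cycle
assert finalized == ["node"]
gc.enable()
```

Since Python 3.4 (PEP 442) the collector does call `__del__` on cyclic garbage, but the timing is whenever collection happens - not the deterministic point RAII requires.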
Re: RFC: Proposal: Deterministic Object Destruction
On Sun, 04 Mar 2018 04:37:46 -0800, Ooomzay wrote:
> On Sunday, 4 March 2018 04:23:07 UTC, Steven D'Aprano wrote:
>> [...]
>> [This PEP] imposes enormous burdens on the maintainers of at least five
>> interpreters (CPython, Stackless, Jython, IronPython, PyPy) all of
>> which will need to be re-written to have RAII semantics guaranteed;
>
> Not so:- CPython, the reference interpreter, already implements the
> required behaviour, as mentioned in the PEP.

Except that it doesn't. From the docs:

"It is not guaranteed that __del__() methods are called for objects that
still exist when the interpreter exits."

https://docs.python.org/3/reference/datamodel.html#object.__del__

That limitation of CPython is much reduced now compared to older
versions, but there are still circumstances where the __del__ method
cannot be called.

--
Steve
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On Sunday, 4 March 2018 03:00:13 UTC, Chris Angelico wrote:
> This thread is dead. The OP wants to wave a magic wand and say
> "__del__ is now guaranteed to be called immediately",

No "magic" required: just one line change in the language reference will
do it.

> without any explanation

The PEP says it all really: to make the very pythonic RAII idiom
available in Python.

> - and, from the look of things, without any understanding
> - of what that means for the language

What impact on the _language_ (c.f. interpreter) do you think I have not
understood?

It is 100% backwards compatible with the language. It breaks nothing.

It allows people who want to use the RAII idiom to do so.

It allows people who want to use the "with" idiom to continue to do so.

> and the interpreters.

I am well aware of what it will mean for interpreters. For some
interpreters it will have zero impact (e.g. CPython) and for some others
it would be unlikely to be economic to make them comply.

The decision here is: does Python want to be more pythonic, and make
itself attractive for resource management applications, or does it want
to be compromised by some implementations?

> Everyone else is saying "your magic wand is broken". This is not going
> to go anywhere.

Well I see a lot of posts that indicate peeps here are more comfortable
with the "with" idiom than the RAII idiom, but I have not yet seen a
single linguistic problem or breakage.

As it happens I have used RAII extensively with CPython to manage a
debugging environment with complex external resources that need managing
very efficiently. (I would use C++ if starting from scratch because it
_guarantees_ the required deterministic destruction, whereas Python does
not.)

--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On Mon, Mar 5, 2018 at 1:11 AM, Ooomzay wrote:
> On Sunday, 4 March 2018 03:00:13 UTC, Chris Angelico wrote:
>> This thread is dead. The OP wants to wave a magic wand and say
>> "__del__ is now guaranteed to be called immediately",
>
> No "magic" required: Just one line change in the language reference will
> do it.

Go ahead and actually implement it. It's not just one line in the
language reference.

>> without any explanation
>
> The PEP says it all really: To make the very pythonic RAII idiom
> available in python.
>
>> - and, from the look of things, without any understanding
>> - of what that means for the language
>
> What impact on the _language_ (c.f. interpreter) do you think I have not
> understood?
>
> It is 100% backwards compatible with the language. It breaks nothing.
>
> It allows people who want to use the RAII idiom to do so.
>
> It allows people who want to use the "with" idiom to continue do so.
>
>> and the interpreters.
>
> I am well aware of what it will mean for interpreters. For some
> interpreters it will have zero impact (e.g. CPython) and for some others
> it would unlikely be economic to make them comply.

I don't even understand you now. First off, you just acknowledged that
this WILL impact CPython - you're not simply mandating what CPython
already does, you're tightening up the rules significantly.

For others - what do you even mean, "unlikely be economic to make them
comply"? Are you saying that the Jython project should simply die? Or
that it should become non-compliant with the Python spec? Huh?

ChrisA
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On 3/4/18 8:26 AM, Ooomzay wrote:
> On Sunday, 4 March 2018 03:16:31 UTC, Paul Rubin wrote:
>> Chris Angelico writes:
>>> Yep, cool. Now do that with all of your smart pointers being on the
>>> heap too. You are not allowed to use ANY stack objects. ANY. Got it?
>>
>> That's both overconstraining and not even that big a problem the way
>> you phrase it.
>>
>> 1) Python has both the "with" statement and try/finally. Both of these
>> run code at the exit from a syntactically defined scope. So they are
>> like stack allocation in C++, where a deallocator can run when the
>> scope exits.
>>
>> 2) Even with no scope-based de-allocation, it's common to put smart
>> pointers into containers like lists and vectors. So you could have a
>> unique_ptr to a filestream object, and stash the unique_ptr someplace
>> as a vector element, where the vector itself could be part of some even
>> more deeply nested structure. At some point, the big structure gets
>> deleted (maybe through a manually-executed delete statement). When that
>> happens, if the nested structures are all standard containers full of
>> unique_ptrs, the top-level finalizer will end up traversing the entire
>> tree and freeing up the file handles and whatever else might be in
>> there.
>>
>> It occurs to me, maybe #2 above is closer to what the OP is really
>> after in Python.
>
> Yep. C++ smart pointers are a good analogue to python references for
> purposes of this PEP.
>
>> I guess it's doable, but refcounts don't seem like the right
>> way.
>
> Well refcounts are definitely "doable": This is how the reference python
> implementation, CPython, currently manages to comply with this PEP and
> can therefore be used for RAII.

Are you including cyclic references in your assertion that CPython
behaves as you want?

--Ned.
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On 3/4/18 7:37 AM, Ooomzay wrote:
> On Sunday, 4 March 2018 04:23:07 UTC, Steven D'Aprano wrote:
>> [...]
>> [This PEP] imposes enormous burdens on the maintainers of at least five
>> interpreters (CPython, Stackless, Jython, IronPython, PyPy) all of
>> which will need to be re-written to have RAII semantics guaranteed;
>
> Not so:- CPython, the reference interpreter, already implements the
> required behaviour, as mentioned in the PEP.

Except for cycles.

--Ned.
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On Sun, 04 Mar 2018 03:37:38 -0800, Ooomzay wrote:
> Please consider the case of a composite resource: You need to implement
> __enter__, __exit__ and track the open/closed state at every level in
> your component hierarchy - even if some levels hold no resources
> directly.
I'm sorry, that description is too abstract for me to understand. Can you
give a simple example?
> This is burdensome, breaks encapsulation, breaks invariance and is error
> prone ...very unpythonic.
Without a more concrete example, I cannot comment on these claims.
[...]
> My PEP is about improving the linguistic integrity and fitness for
> resource management purpose of the language.
So you claim, but *requiring* reference counting semantics does not
improve the integrity or fitness of the language. And the required
changes to programming styles and practices (no cycles, no globals, put
all resources inside their own scope) demonstrate that this is a step
backwards.
Right now I can write reliable code that uses external resources (such as
a database connection or file) and put it in my application's module
scope, and still easily manage the resource lifetime. I cannot do that by
relying only on RAII. That's a step backwards as far as language fitness.
>> In any case, you might not like with statements, but I think they're
>> infinitely better than:
>>
>> def meaningless_function_that_exists_only_to_manage_resource():
>>     x = open_resource()
>>     process(x)
>
>> def function():
>>     meaningless_function_that_exists_only_to_manage_resource()
>>     sleep(1)  # simulate a long-running function
>
> Why would you prefer a new construct?
I don't prefer a new construct. The "with" statement isn't "new". It goes
back to Python 2.5 (`from __future__ import with_statement`) which is
more than eleven years old now. That's about half the lifetime of the
language!
I prefer the with statement because it is *explicit*, simple to use, and
clear to read. I can read some code and instantly see that when the with
block ends, the resource will be closed, regardless of how many
references to the object still exist.
I don't have to try to predict (guess!) when the last reference will go
out of scope, because that's irrelevant.
RAII conflates the lifetime of the object with the lifetime of the
resource held by the object. They are not the same, and the object can
outlive the resource.
Your position is:
"RAII makes it really elegant to close the file! All you need to do is
make sure that when you want to close the file, you delete all the
references to the file, so that it goes out of scope, and the file will
be closed."
My position is:
"If I want to close the file, I'll just close the file. Why should I care
that there are zero or one or a million references to it?"
>> - the with block is equivalent to a try...finally, and so it is
>> guaranteed to close the resource even if an exception occurs; your
>> solution isn't.
>
>> If process(x) creates a non-local reference to x, and then raises an
>> exception, and that exception is caught elsewhere, x will not go out of
>> scope and won't be closed.
>> A regression in the reliability of the code.
>
> This PEP does not affect existing code. Peeps who are familiar with RAII
> understand that creating a global reference to an RAII resource is
> explicitly saying "I want this kept open at global scope" and that is
> the behaviour that they will be guaranteed.
I'm not talking about global scope. Any persistent reference to the
object will prevent the resource from being closed.
Here's a proof of concept which demonstrates the problem with conflating
object scope and resource lifetime. Notice that there are no globals used.
def main():
    from time import sleep
    values = []

    def process():
        f = open('/tmp/foo', 'w')
        values.append(f)
        f.write("Hello world!")
        f.read()  # oops!
        del values[-1]

    try:
        process()
    except IOError:
        pass

    # The file should be closed now. But it isn't.
    sleep(10)  # simulate doing a lot of work
    g = open('/tmp/foo', 'r')
    assert g.read() == "Hello world!"
The assertion fails, because the file hasn't been closed in a timely
manner. On the other hand:
def main2():
    from time import sleep
    values = []

    def process():
        with open('/tmp/bar', 'w') as f:
            values.append(f)
            f.write("Hello world!")
            f.read()  # oops!
            del values[-1]

    try:
        process()
    except IOError:
        pass

    sleep(10)  # simulate doing a lot of work
    g = open('/tmp/bar', 'r')
    assert g.read() == "Hello world!"
The assertion here passes.
Now, these are fairly contrived examples, but in real code the resource
owner might be passed into an iterator, or bound to a class attribute, or
anything else that holds onto a reference to it. As soon as that happens,
and there's another reference to the object any
Re: RFC: Proposal: Deterministic Object Destruction
On Sun, 04 Mar 2018 09:37:34 -0500, Ned Batchelder wrote:
> On 3/4/18 7:37 AM, Ooomzay wrote:
>> On Sunday, 4 March 2018 04:23:07 UTC, Steven D'Aprano wrote:
>>> [...]
>>> [This PEP] imposes enormous burdens on the maintainers of at least
>>> five interpreters (CPython, Stackless, Jython, IronPython, PyPy) all
>>> of which will need to be re-written to have RAII semantics guaranteed;
>> Not so:- CPython, the reference interpreter, already implements the
>> required behaviour, as mentioned in the PEP.
>
> Except for cycles.

And during interpreter shutdown.

--
Steve
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On 3/4/18 9:11 AM, Ooomzay wrote:
> I am well aware of what it will mean for interpreters. For some
> interpreters it will have zero impact (e.g. CPython) ...

There's no point continuing this if you are just going to insist on
falsehoods like this. CPython doesn't currently do what you want, but
you are choosing to ignore the cases where it doesn't. You have to deal
with cycles.

--Ned.
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On 04/03/2018 14:11, Ooomzay wrote:
> Well I see a lot of posts that indicate peeps here are more comfortable
> with the "with" idiom than the RAII idiom but I have not yet seen a
> single linguistic problem or breakage.
>
> As it happens I have used RAII extensively with CPython to manage a
> debugging environment with complex external resources that need
> managing very efficiently.

I have a bit of a problem with features normally used for the
housekeeping of a language's data structures being roped in to control
external resources that it knows nothing about.

Which also means that if X is a variable with the sole reference to some
external resource, then a mere:

    X = 0

will close, destroy, or discard that resource. If the resource is
non-trivial, then it perhaps deserves a non-trivial piece of code to
deal with it when it's no longer needed.

It further means that when you did want to discard an expensive
resource, then X going out of scope, calling del X or whatever, will not
work if a copy of X still exists somewhere.

--
bartc
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On 3/4/18 8:46 AM, Chris Angelico wrote:
> On Mon, Mar 5, 2018 at 12:26 AM, Ooomzay wrote:
>> Well refcounts are definitely "doable": This is how the reference
>> python implementation, CPython, currently manages to comply with this
>> PEP and can therefore be used for RAII.
>>
>> This PEP is an attempt to _guarantee_ this behaviour and make the
>> elegance of RAII available to all pythonistas that want it. Without
>> this guarantee python is not attractive to applications that must
>> manage non-trivial resources reliably.
>>
>> Aside: I once read somewhere that must have seemed authoritative at the
>> time, that CPython _guarantees_ to continue to behave like this - but
>> now the subject is topical again I can find no trace of this guarantee.
>
> That's because there is no such guarantee. In fact, you can probably
> find places in the docs where it is specifically stated that you cannot
> depend on __del__ for this.
>
> You still haven't said how you're going to cope with reference cycles -
> or are you expecting every single decref to do a full cycle check?
>
> ChrisA

My presumption of the proposal is that it does NOT expect __del__ to be
called just because an object is no longer reachable but is in a cycle
of isolated objects; those would still need to wait for the garbage
collector to get them. The request is that when the direct reference
count goes to 0, __del__ gets called.

I think that is what CPython promises (but not other versions).

I am not positive whether __del__ gets called for sure when the object
is garbage collected (I thought it did, but I am not positive).

I am pretty sure it does NOT get called on objects that are still in
existence when things terminate (which would be the major difference
from a language like C++).

What the language does not promise is that, in general, __del__ be
called 'immediately' on the last reference going away, because CPython's
reference counting is an implementation detail.

My understanding of this proposal is to ask, at least for selected
objects, that all implementations perform this reference counting,
allowing objects that track 'critical resources' to be disposed of in a
timely manner and not wait for garbage collection. And that this is
triggered only by the reference count going to zero, and not if the
object happens to be in an isolated reference cycle. This does limit
what you can do with this sort of object, but that normally isn't a
problem.

--
Richard Damon
--
https://mail.python.org/mailman/listinfo/python-list
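The refcount-zero behaviour Richard describes is easy to check in today's CPython. A minimal sketch (the `Resource` class is made up for illustration, and the immediate finalisation is CPython implementation behaviour, not a language guarantee):

```python
events = []

class Resource:
    """Stands in for an object tracking a 'critical resource'."""
    def __init__(self):
        events.append("acquired")

    def __del__(self):
        events.append("released")

r = Resource()
r2 = r        # a second reference keeps the object alive
del r
assert events == ["acquired"]   # still referenced via r2: no __del__ yet
del r2                          # direct refcount hits zero: CPython finalises now
assert events == ["acquired", "released"]
```

On Jython or PyPy the second `del` would not promptly trigger `__del__`; the object would wait for a later collection - which is exactly the gap the proposal wants closed.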
Re: RFC: Proposal: Deterministic Object Destruction
On Mon, Mar 5, 2018 at 9:09 AM, Richard Damon wrote:
> My presumption of the proposal is the it does NOT expect that __del__ be
> called just because an object is no longer reachable but is in a cycle
> of isolated objects, those would still need to wait for the garbage
> collector to get them. The request is that when the direct reference
> count goes to 0, that __del__ gets called.
>
> I think that is what CPython promises (but not other versions).
>
> I am not positive if __del__ gets called for sure when the object is
> garbage collected (I thought it did, but I am not positive).
>
> I am pretty sure it does NOT get called on object that are still in
> existence when things terminate (which would be the major difference
> from a language like C++)
>
> What the language does not promise is that in general, __del__ be called
> 'immediately' on the last reference going away in the general case,
> because CPythons reference counting is an implementation detail.
>
> My understanding of this proposal is to ask that, I would say at least
> for selected objects, that all implementations perform this reference
> counting, allowing objects that track 'critical resources' to get
> disposed of in a timely manner and not wait for garbage collection. And
> that this is triggered only by the reference count going to zero, and
> not if the object happens to be in an isolated reference cycle. This
> does limit what you can do with this sort of object, but that normally
> isn't a problem.

Okay, that sounds reasonable. Let's help things out by creating a
special syntax for reference-counted object usage, to allow other
implementations to use different forms of garbage collection. When you
acquire these kinds of objects, you "mark" them with this special
syntax, and Python will call a special method on that object to say
"hey, you're now in use". Then when that special syntax is done, Python
calls another special method to say "hey, you're not in use any more".

Something like this:

    using some_object:
        ...
        ...
        ...
    # at the unindent, we're automatically not using it

That would call a special method some_object.__now_using__() at the top
of the block, and some_object.__not_using__() at the bottom. Or, if the
block exits because of an exception, we could call
some_object.__not_using__(exc) with the exception details.

I think this could be a really good feature - it'd allow non-refcounted
Python implementations to have a strong guarantee of immediate disposal
of a managed resource, and it'd also strengthen the guarantee for
refcounted Pythons too. Feel free to bikeshed the exact keywords and
function names, of course.

ChrisA
--
https://mail.python.org/mailman/listinfo/python-list
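Chris's hypothetical `__now_using__`/`__not_using__` protocol can already be driven today with a small context manager - which is, of course, his point: the "new" syntax is the with statement. A sketch (all names here are the hypothetical ones from his post, not a real Python protocol):

```python
from contextlib import contextmanager

@contextmanager
def using(obj):
    """Drive the hypothetical __now_using__/__not_using__ protocol."""
    obj.__now_using__()
    try:
        yield obj
    except Exception as exc:
        obj.__not_using__(exc)   # exceptional exit: pass the details along
        raise
    else:
        obj.__not_using__(None)  # normal exit

class Demo:
    def __init__(self):
        self.events = []

    def __now_using__(self):
        self.events.append("start")

    def __not_using__(self, exc):
        self.events.append(("end", exc))

d = Demo()
with using(d):
    pass
assert d.events == ["start", ("end", None)]
```

Mechanically this is just `__enter__`/`__exit__` with different spelling, which is why the suggestion reads as a gentle joke at the proposal's expense.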
Re: RFC: Proposal: Deterministic Object Destruction
On Sunday, 4 March 2018 14:37:30 UTC, Ned Batchelder wrote:
> Are you including cyclic references in your assertion that CPython
> behaves as you want?

Yes. Because the only behaviour required for RAII is to detect and debug
such cycles in order to eliminate them. It is a design error/resource
leak to create an orphan cycle containing RAII objects.

    def main():
        gc.disable()

--
https://mail.python.org/mailman/listinfo/python-list
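One way to "detect and debug" such cycles during testing, sketched under the assumptions of this post (automatic collection disabled, cycles treated as design errors; `RAIIResource` and `check_no_orphan_cycles` are made-up names):

```python
import gc

class RAIIResource:
    """Hypothetical resource holder; __del__ would release the resource."""
    def __del__(self):
        pass

def check_no_orphan_cycles():
    """Debug aid: report unreachable cycles instead of silently collecting them."""
    gc.set_debug(gc.DEBUG_SAVEALL)   # keep whatever collect() finds in gc.garbage
    found = gc.collect()
    gc.set_debug(0)
    leaked = list(gc.garbage)
    gc.garbage.clear()
    return found, leaked

gc.disable()          # rely on pure refcounting, per the RAII style
a = RAIIResource()
a.cycle = a           # the design error: an orphan cycle holding a resource
del a

found, leaked = check_no_orphan_cycles()
if found:
    print("orphan cycle detected: %d unreachable object(s)" % found)
gc.enable()
```

This only catches the leak after the fact, at a checkpoint of the programmer's choosing - which is the nub of Ned's objection below: cycles still have to be dealt with somehow.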
Re: RFC: Proposal: Deterministic Object Destruction
On Mon, 05 Mar 2018 09:20:24 +1100, Chris Angelico wrote:
> Okay, that sounds reasonable. Let's help things out by creating a
> special syntax for reference-counted object usage
[...]
> Feel free to bikeshed the exact keywords and function names, of course.

I see what you did there.

:-)

--
Steve
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On Mon, Mar 5, 2018 at 10:42 AM, Steven D'Aprano wrote:
> On Mon, 05 Mar 2018 09:20:24 +1100, Chris Angelico wrote:
>
>> Okay, that sounds reasonable. Let's help things out by creating a
>> special syntax for reference-counted object usage
> [...]
>> Feel free to bikeshed the exact keywords and function names, of course.
>
> I see what you did there.
>
> :-)

Yeah, I'm not sure if the OP will see what I did, though...

ChrisA
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On 3/4/18 5:25 PM, Ooomzay wrote:
> On Sunday, 4 March 2018 14:37:30 UTC, Ned Batchelder wrote:
>> Are you including cyclic references in your assertion that CPython
>> behaves as you want?
>
> Yes. Because the only behaviour required for RAII is to detect and debug
> such cycles in order to eliminate them. It is a design error/resource
> leak to create an orphan cycle containing RAII objects.
>
>     def main():
>         gc.disable()

This isn't a reasonable position. Cycles exist, and the gc exists for a
reason. Your proposal isn't going to go anywhere if you just naively
ignore cycles.

--Ned.
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On 3/4/2018 1:37 PM, Ooomzay wrote:
> Not so:- CPython, the reference interpreter, already implements the
> required behaviour, as mentioned in the PEP.

It does most of the time, but it's not guaranteed. See my previous post.

Regards,
Dietmar
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On Sun, Mar 4, 2018 at 10:37 PM, Ooomzay wrote:
> Please consider the case of a composite resource: You need to implement
> __enter__, __exit__ and track the open/closed state at every level in
> your component hierarchy - even if some levels hold no resources
> directly.
>
> This is burdensome, breaks encapsulation, breaks invariance and is
> error prone ...very unpythonic.

Why do you need to? I don't understand your complaint here - can you
give an example of a composite resource that needs this kind of special
management?

ChrisA
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On 04/03/18 02:28, Ooomzay wrote:
> On Friday, 2 March 2018 15:37:25 UTC, Paul Moore wrote:
> [snip]
>> def fn():
>>     for i in range(10000):
>>         with open(f"file{i}.txt", "w") as f:
>>             f.write("Some text")
>>
>> How would you write this in your RAII style - without leaving 10,000
>> file descriptors open until the end of the function?
>
> def fn():
>     for i in range(10000):
>         f = RAIIFile(f"file{i}.txt", "w")
>         f.write("Some text")
Over my dead body. Not that it matters as I can't see this happening in
a month of Sundays.
--
My fellow Pythonistas, ask not what our language can do for you, ask
what you can do for our language.
Mark Lawrence
--
https://mail.python.org/mailman/listinfo/python-list
Re: RFC: Proposal: Deterministic Object Destruction
On Sunday, 4 March 2018 15:24:08 UTC, Steven D'Aprano wrote:
> On Sun, 04 Mar 2018 03:37:38 -0800, Ooomzay wrote:
>
> > Please consider the case of a composite resource: You need to implement
> > __enter__, __exit__ and track the open/closed state at every level in
> > your component hierarchy - even if some levels hold no resources
> > directly.
> > This is burdensome, breaks encapsulation, breaks invariance and is error
> > prone ...very unpythonic.
>
> Without a more concrete example, I cannot comment on these claims.
Here is an example of a composite resource using RAII:-
class RAIIFileAccess():
    def __init__(self, fname):
        self.fname = fname            # keep the name for __del__
        print("%s Opened" % fname)
    def __del__(self):
        print("%s Closed" % self.fname)

class A():
    def __init__(self):
        self.res = RAIIFileAccess("a")

class B():
    def __init__(self):
        self.res = RAIIFileAccess("b")

class C():
    def __init__(self):
        self.a = A()
        self.b = B()

def main():
    c = C()
Under this PEP this is all that is needed to guarantee that the files "a"
and "b" are closed on exit from main or after any exception has been handled.
Also note that if you have a reference to these objects then they are
guaranteed to be in a valid/usable/open state (invariant) - no danger, and
no need to worry about or check enter/exit state.
Now repeat this example with "with".
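For contrast, here is a sketch of what the same composite might look like with context managers, using hand-rolled plumbing (hypothetical names; contextlib.ExitStack could shorten it, but the per-level __enter__/__exit__ obligation remains):

```python
class FileAccess:
    """Stands in for a real resource; prints instead of opening files."""
    def __init__(self, fname):
        self.fname = fname
    def __enter__(self):
        print("%s Opened" % self.fname)
        return self
    def __exit__(self, *exc):
        print("%s Closed" % self.fname)
        return False

class A:
    def __enter__(self):
        self.res = FileAccess("a").__enter__()
        return self
    def __exit__(self, *exc):
        return self.res.__exit__(*exc)

class B:
    def __enter__(self):
        self.res = FileAccess("b").__enter__()
        return self
    def __exit__(self, *exc):
        return self.res.__exit__(*exc)

class C:
    def __enter__(self):
        self.a = A().__enter__()   # if B's __enter__ below raises, "a" leaks
        self.b = B().__enter__()
        return self
    def __exit__(self, *exc):
        self.b.__exit__(*exc)      # must unwind in reverse order by hand
        self.a.__exit__(*exc)
        return False

def main():
    with C():
        pass
```

The comment in C.__enter__ marks exactly the kind of error-proneness being complained about: every composite level must propagate enter/exit and handle partial construction itself.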
> [...]
> > My PEP is about improving the linguistic integrity and fitness for
> > resource management purpose of the language.
>
> So you claim, but *requiring* reference counting semantics does not
> improve the integrity or fitness of the language.
We will just have to disagree on that for now.
> And the required
> changes to programming styles and practices (no cycles,
No change required. But if you _choose_ to benefit from RAII you had better
not create orphan cycles with RAII objects in them, as that
is clearly a resource leak.
> no globals,
No change required. But if you _choose_ to benefit from RAII you had better
take care to delete any RAII resources you choose to hold at global scope in
a robust way. (These are exceptional in my experience).
> put all resources inside their own scope)
No change required. But if you _choose_ to benefit from RAII you can make use
of python's existing scopes (functions) or del to restrict resource lifetimes.
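A minimal sketch of the del option under CPython (illustrative names only): dropping the last reference releases the resource at that exact point, so the rest of the function can run without holding it.

```python
class Resource:
    """Stands in for an object holding a real resource."""
    def __init__(self, name):
        self.name = name
        print(f"{self.name} acquired")
    def __del__(self):
        print(f"{self.name} released")

def work():
    r = Resource("r")
    # ... use r ...
    del r               # last reference dropped: __del__ runs here in CPython
    print("after del")  # further long-running work no longer holds the resource

work()
```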
> >> In any case, you might not like with statements, but I think they're
> >> infinitely better than:
> >>
> >> def meaningless_function_that_exists_only_to_manage_resource():
> >> x = open_resource()
> >> process(x)
> >
> >> def function():
> >> meaningless_function_that_exists_only_to_manage_resource()
> >> sleep(1) # simulate a long-running function
> >
> > Why would you prefer a new construct?
>
> I don't prefer a new construct. The "with" statement isn't "new". It goes
> back to Python 2.5 (`from __future__ import with_statement`) which is
> more than eleven years old now. That's about half the lifetime of the
> language!
>
> I prefer the with statement because it is *explicit*, simple to use, and
> clear to read. I can read some code and instantly see that when the with
> block ends, the resource will be closed, regardless of how many
> references to the object still exist.
>
> I don't have to try to predict (guess!) when the last reference will go
> out of scope, because that's irrelevant.
If you don't care about what the other references might be then
RAII is not for you. Fine.
> RAII conflates the lifetime of the object with the lifetime of the
> resource held by the object.
This "conflation" is called "invariance" and is usually considered a
"very good thing" as you cant have references floating around to
half-baked resources.
> They are not the same, and the object can
> outlive the resource.
Not with RAII it can't. Simple. Good.
> Your position is:
>
> "RAII makes it really elegant to close the file! All you need to do is
> make sure that when you want to close the file, you delete all the
> references to the file, so that it goes out of scope, and the file will
> be closed."
>
> My position is:
>
> "If I want to close the file, I'll just close the file. Why should I care
> that there are zero or one or a million references to it?"
Because if you have no idea what references there are you cannot assume it
is OK to close the file! That would be a truly terrible program design.
> >> - the with block is equivalent to a try...finally, and so it is
> >> guaranteed to close the resource even if an exception occurs; your
> >> solution isn't.
It is: RAII will release all resources held, transitively. Try the example
above using CPython and put an exception in one of the constructors.
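A sketch of the exception case under CPython (hypothetical names): the held object stays alive while the traceback references the frame, and is released as soon as the handler finishes and the traceback is dropped.

```python
class Res:
    """Stands in for a held resource."""
    def __init__(self, name):
        self.name = name
        print(f"{self.name} opened")
    def __del__(self):
        print(f"{self.name} closed")

def fails():
    r = Res("r")
    raise RuntimeError("boom")   # r stays referenced via the traceback...

try:
    fails()
except RuntimeError:
    pass   # ...until the handler finishes and the traceback is dropped
print("exception handled")
```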
> >> If process(x) creates a non-local reference to x, and then raises an
> >> exception, and that exception is caught elsewhere, x will not go out of
> >> scope and won't be closed.
> >> A regression in the reliability of the code
Re: RFC: Proposal: Deterministic Object Destruction
On Sunday, 4 March 2018 23:57:24 UTC, Mark Lawrence wrote:
> On 04/03/18 02:28, Ooomzay wrote:
> > On Friday, 2 March 2018 15:37:25 UTC, Paul Moore wrote:
> > [snip]
> >> def fn():
> >>     for i in range(10000):
> >>         with open(f"file{i}.txt", "w") as f:
> >>             f.write("Some text")
> >>
> >> How would you write this in your RAII style - without leaving 10,000
> >> file descriptors open until the end of the function?
> >
> > def fn():
> >     for i in range(10000):
> >         f = RAIIFile(f"file{i}.txt", "w")
> >         f.write("Some text")
> >
> Over my dead body.
Care to expand on that?
Re: RFC: Proposal: Deterministic Object Destruction
On Sunday, 4 March 2018 23:55:33 UTC, Ned Batchelder wrote:
> On 3/4/18 5:25 PM, Ooomzay wrote:
>> On Sunday, 4 March 2018 14:37:30 UTC, Ned Batchelder wrote:
>>> Are you including cyclic references in your assertion that CPython
>>> behaves as you want?
>> Yes. Because the only behaviour required for RAII is to detect and debug
>> such cycles in order to eliminate them. It is a design error/resource
>> leak to create an orphan cycle containing RAII objects.
>
> This isn't a reasonable position. Cycles exist, and the gc exists for a
> reason. Your proposal isn't going to go anywhere if you just naively
> ignore cycles.

I am not naively ignoring them. But _if_ you want to use RAII then do not
leak them in cycles. Put anything else in there you like and gc them as
before.
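To make the point of contention concrete, a small sketch of why a cycle defeats prompt destruction under CPython: refcounts in an orphan cycle never reach zero, so finalization waits for the cycle collector (which, since PEP 442 in Python 3.4, can handle cycles containing __del__).

```python
import gc

class RAIIRes:
    """Stands in for an object holding a real resource."""
    def __del__(self):
        print("released")

def leak_cycle():
    a = RAIIRes()
    b = RAIIRes()
    a.other = b
    b.other = a    # reference cycle: refcounts never reach zero

leak_cycle()
print("returned")  # note: nothing has been released yet
gc.collect()       # only the cycle collector frees them, at some later time
```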
Re: RFC: Proposal: Deterministic Object Destruction
On 3/4/18 6:55 PM, Ned Batchelder wrote:
> On 3/4/18 5:25 PM, Ooomzay wrote:
>> On Sunday, 4 March 2018 14:37:30 UTC, Ned Batchelder wrote:
>>> Are you including cyclic references in your assertion that CPython
>>> behaves as you want?
>> Yes. Because the only behaviour required for RAII is to detect and
>> debug such cycles in order to eliminate them. It is a design
>> error/resource leak to create an orphan cycle containing RAII objects.
>>
>> def main():
>>     gc.disable()
>
> This isn't a reasonable position. Cycles exist, and the gc exists for a
> reason. Your proposal isn't going to go anywhere if you just naively
> ignore cycles.
>
> --Ned.

While Ooomzay seems to want to say that all cycles are bad, I think it is
fair to say that in general in Python they are unavoidable, in part because
I can't see any good way to create the equivalent of a 'weak reference'
(names aren't objects so don't have properties). The best I can think of is
to create some sort of magical object that can refer to another object, but
with a reference that 'doesn't count'. This seems very unPythonic.

What I think can be said is that it should be possible (enforced by the
programmer, not the language) to use these RAII objects in ways that don't
create cycles (or maybe the program knows of the cycles and makes the
effort to break them when it is important). So perhaps it can be said that
cycles involving major-resource RAII objects should not exist.

--
Richard Damon
Re: RFC: Proposal: Deterministic Object Destruction
On Saturday, 3 March 2018 17:10:53 UTC, Dietmar Schwertberger wrote:
> CPython does *not* guarantee destruction when the object reference goes
> out of scope, even if there are no other references.
> I would very much appreciate such a deterministic behaviour, at least
> with CPython.
>
> I recently had to debug an issue in the matplotlib wx backend (*). Under
> certain conditions, the wx device context was not destroyed when the
> reference went out of scope. Adding a del to the end of the method or
> calling the Destroy method of the context did fix the issue. (There was
> also a hidden reference, but avoiding this was not sufficient. The del
> was still required.)

You say the reference was out of scope but that a del was still required.
What were you delling if the reference was out of scope? Could you sketch
the code?
Re: RFC: Proposal: Deterministic Object Destruction
Richard Damon writes:
> […] I can't see any good way to create the equivalent of a 'weak
> reference' (names aren't objects so don't have properties). The best
> I can think of is to create some sort of magical object that can refer
> to another object, but that reference 'doesn't count'. This seems very
> unPythonic.

Do you mean something like the standard library 'weakref' module
<https://docs.python.org/3/library/weakref.html>?

    The weakref module allows the Python programmer to create weak
    references to objects. […] A weak reference to an object is not
    enough to keep the object alive: when the only remaining references
    to a referent are weak references, garbage collection is free to
    destroy the referent and reuse its memory for something else.
    However, until the object is actually destroyed the weak reference
    may return the object even if there are no strong references to it.

--
 "The cost of education is trivial compared to the cost of ignorance."
 —Thomas Jefferson
Ben Finney
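A short sketch of the weakref module breaking the classic parent/child cycle (hypothetical Node class): the weak back-reference does not keep the parent alive, so under CPython the parent is freed the moment its last strong reference goes.

```python
import weakref

class Node:
    """Tree node whose back-reference to its parent is weak, so no cycle."""
    def __init__(self, name):
        self.name = name
        self.parent = lambda: None    # placeholder until attached
        self.children = []
    def add_child(self, child):
        self.children.append(child)
        child.parent = weakref.ref(self)  # does not keep the parent alive

root = Node("root")
leaf = Node("leaf")
root.add_child(leaf)
assert leaf.parent() is root   # calling the weak ref yields the referent
del root                       # last strong reference gone
assert leaf.parent() is None   # CPython freed root immediately
```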
Re: RFC: Proposal: Deterministic Object Destruction
On Monday, 5 March 2018 01:11:43 UTC, Richard Damon wrote:
> On 3/4/18 6:55 PM, Ned Batchelder wrote:
>> On 3/4/18 5:25 PM, Ooomzay wrote:
>>> On Sunday, 4 March 2018 14:37:30 UTC, Ned Batchelder wrote:
>>>> Are you including cyclic references in your assertion that CPython
>>>> behaves as you want?
>>> Yes. Because the only behaviour required for RAII is to detect and
>>> debug such cycles in order to eliminate them. It is a design
>>> error/resource leak to create an orphan cycle containing RAII objects.
>>>
>>> def main():
>>>     gc.disable()
>>
>> This isn't a reasonable position. Cycles exist, and the gc exists for
>> a reason. Your proposal isn't going to go anywhere if you just
>> naively ignore cycles.
>>
>> --Ned.
>
> While Ooomzay seems to want to say that all cycles are bad,

I only want to say that orphan cycles with RAII objects in them are bad.
Re: RFC: Proposal: Deterministic Object Destruction
On 05/03/18 01:01, Ooomzay wrote:
> On Sunday, 4 March 2018 23:57:24 UTC, Mark Lawrence wrote:
>> On 04/03/18 02:28, Ooomzay wrote:
>>> On Friday, 2 March 2018 15:37:25 UTC, Paul Moore wrote:
>>> [snip]
>>>> def fn():
>>>>     for i in range(10000):
>>>>         with open(f"file{i}.txt", "w") as f:
>>>>             f.write("Some text")
>>>>
>>>> How would you write this in your RAII style - without leaving 10,000
>>>> file descriptors open until the end of the function?
>>>
>>> def fn():
>>>     for i in range(10000):
>>>         f = RAIIFile(f"file{i}.txt", "w")
>>>         f.write("Some text")
>>
>> Over my dead body.
>
> Care to expand on that?

Sure, when you state what you intend doing about reference cycles, which
you've been asked about countless times.
--
My fellow Pythonistas, ask not what our language can do for you, ask
what you can do for our language.
Mark Lawrence
