On 08/30/2013 05:06 AM, Paolo Carlini wrote:
> I should have explained some of that in better detail. The main issue I
> had yesterday is that the pattern matching can easily become very
> difficult, if not impossible: if you look at the second half of
> expand_default_init, in some cases we wrap the ret[...]
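
For readers following the thread, a hedged illustration of the kind of C++ input behind that concern (class names invented, not taken from the thread): when the class has a virtual base, the constructor call produced by the front end is reportedly wrapped in extra tree structure, so matching a bare call is no longer enough.

  // Illustrative only.  The delegation bug is the same as in the plain case,
  // but with a virtual base in play the generated initialization is not a
  // bare constructor call, which is what makes the pattern matching harder.
  struct V { };

  struct D : virtual V
  {
    D () : D () { }   // delegates to itself: ill-formed, no diagnostic required
  };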

Hi again,

On 08/30/2013 11:06 AM, Paolo Carlini wrote:
> I could, for example, pass down a separate bit, instead of playing
> again with the LOOKUP_* bits. At some point yesterday I even had that
> version tested ;)

In practice, something like the attached.

By the way, as regards this comment in cp-[...]
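
As an aside on the "separate bit" idea quoted above, here is a generic sketch of the trade-off being weighed; the names are invented and are not the actual GCC interfaces.

  // Generic illustration only, not GCC code.
  enum init_flags : unsigned
  {
    IF_NONE       = 0,
    IF_DELEGATING = 1u << 0   // option (a): one more bit in the shared flags word
  };

  // (a) the fact rides along in `flags`; every consumer can see the bit and
  //     new bits must not collide with the existing values
  void build_ctor_call (unsigned flags);

  // (b) the fact travels as its own parameter, explicit in the signature and
  //     invisible to code that does not care about it
  void build_ctor_call (unsigned flags, bool delegating_p);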

Hi,

On 08/29/2013 09:40 PM, Jason Merrill wrote:
> On 08/29/2013 11:24 AM, Paolo Carlini wrote:
>> [snip: same patch hunk as quoted in full in the message below]

On 08/29/2013 11:24 AM, Paolo Carlini wrote:
> +  if ((complain & tf_error)
> +      && (flags & LOOKUP_DELEGATING_CONS)
> +      && name == complete_ctor_identifier
> +      && TREE_CODE (ret) == CALL_EXPR
> +      && (DECL_ABSTRACT_ORIGIN (TREE_OPERAND (CALL_EXPR_FN (ret), 0))
> +          == current_function_decl [...]
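
In plain terms (a reading of the hunk, hedged): the error path is taken only when diagnostics are enabled, a delegating-constructor call to the complete-object constructor has been built as a plain CALL_EXPR, and the callee, traced through DECL_ABSTRACT_ORIGIN, is the constructor currently being compiled. A minimal, hypothetical test case such a check would reject:

  // Direct self-delegation: ill-formed, no diagnostic required (C++11
  // [class.base.init]), so any front-end error here is a pure QoI win;
  // without one, this compiles and recurses until the stack overflows.
  struct A
  {
    A () : A () { }   // the mem-initializer delegates to A's own default ctor
  };

  int main () { A a; return 0; }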

Hi,

Thus I have this simple patch which at least catches pure
self-delegation (no cycles). Better than nothing, I would say, given its
simplicity ;)

At first I thought I would put the check in expand_default_init, but then
I noticed that in the case of, e.g., virtual bases, the simple pattern
matching [...]
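
To make the "pure self-delegation, no cycles" limitation concrete, a hedged example with invented names: the simple check catches a constructor that delegates directly to itself, but not two constructors that delegate to each other.

  // Both classes are ill-formed, no diagnostic required; only the first is
  // the direct self-delegation the simple pattern match can see.
  struct Direct
  {
    Direct () : Direct () { }      // delegates straight to itself: caught
  };

  struct Cycle
  {
    Cycle () : Cycle (0) { }       // Cycle()    -> Cycle(int)
    Cycle (int) : Cycle () { }     // Cycle(int) -> Cycle(): a cycle, not caught
  };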