[Bug c++/71885] Incorrect code generated with -O1, memset() function call is missing

2016-07-15 Thread kern at sibbald dot com
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71885

Kern Sibbald  changed:

   What|Removed |Added

 CC||kern at sibbald dot com

--- Comment #5 from Kern Sibbald  ---
I would like to see a precise explanation of what "undefined behavior" means
here.  Bacula does zero the memory returned by its replacement new.  This
behavior is desirable and very well defined.  I may be wrong, but it seems that
you are assuming that the memory is already zero, which is not a good
assumption.  In the case of Bacula, memory allocated from the heap is
explicitly made non-zero.  Consequently, this seems to me to be an overly
aggressive optimization that should be enabled at a much higher optimization
level than -O2, which is generally used by Bacula and many other programs.

At some point g++'s tinkering with valid user programs should stop.  Yes, we
can probably work around the g++-generated bug by explicitly testing for the
g++ compiler version and adding another command line option, but this should be
unnecessary.

Changing g++ in this manner (rather cavalier, in my opinion) can create lots of
unnecessary problems for many programs.  As you might imagine, I call this new
g++ feature "unreasonable" and "dangerous".  Please remove it, or at least make
it a new option that the g++ user must explicitly add.

[Bug c++/71892] New: Recent optimization changes introduce bugs

2016-07-15 Thread kern at sibbald dot com
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71892

Bug ID: 71892
   Summary: Recent optimization changes introduce bugs
   Product: gcc
   Version: 6.0
Status: UNCONFIRMED
  Severity: normal
  Priority: P3
 Component: c++
  Assignee: unassigned at gcc dot gnu.org
  Reporter: kern at sibbald dot com
  Target Milestone: ---

The optimizations that you made to g++ 6.0 are not reasonable, since they
create programming bugs.  This applies to two optimizations you have made: one
disallows testing this against NULL, and the second eliminates what you
determine to be unnecessary memory clearing with memset().

In both cases, the assumptions you make are not valid, and thus these
optimizations should not be made at low levels such as -O2 and below.

First the case of testing this == NULL.  You state: "When optimizing, GCC now
assumes the this pointer can never be null, which is guaranteed by the language
rules."

This seems incorrect to me.  "this" is nothing but a pointer to the class
structure, and at least in the case of non-virtual methods it can be zero; it
is often useful for it to be zero, and it is even more useful for a class
implementer to be able to know whether his method is being called on an
uninitialized class.  For example, if one has an optional list implemented as a
class (as is the case in Bacula), rather than testing everywhere in the code
whether or not the list pointer is NULL, it is more efficient to do so in the
class.  In our case, we return zero from the size() method, and thus there is
no need in perhaps hundreds of places to test for a NULL pointer, because it is
done in the class.  Another, more important use of testing for this == NULL is
for a class author to be able to know whether the class has been initialized.
If so, the method can continue; if not, the class writer can take appropriate
action and avoid a seg fault.  With your optimization, you explicitly remove
this feature from the C++ language.

The second case is that you do not generate code in an overridden new operator
when memset() is used to zero the memory.  You are assuming that the memory has
already been zeroed, but we are overriding new specifically to use our own
memory allocator, which guarantees that the memory is non-zero.  While your
optimization may be appropriate at higher levels of optimization, it is not
appropriate at -O2, which is used by many programs.

I maintain that it is irresponsible to implement the above two unsafe
optimizations at levels -O2 and below.  I am fully aware that we could test for
the compiler version and add new command line options, but that is a lot of
unnecessary work, and in the meantime g++ is not generating the code that
corresponds to what we wrote (i.e. it is generating incorrect code and
introducing seg faults in many programs).  Please implement these
"optimizations", if you must, at higher optimization levels, and leave levels
-O2 and -O1 with only optimizations that are safe and do not modify the
behavior of the program as written.

[Bug c++/71885] Incorrect code generated with -O1, memset() function call is missing

2016-07-15 Thread kern at sibbald dot com
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71885

--- Comment #7 from Kern Sibbald  ---
Just to be clear:

- This change might be appropriate for higher levels of optimization, but it is
not appropriate as a default for -O2 and -O1.

- There is no undefined behavior here.  We override the new operator explicitly
to be able to allocate and manage the memory the way we want.  The g++ compiler
can make assumptions about what a new operator returns, but it should not make
any assumptions about what we are doing to initialize memory inside an
overridden new operator, particularly at low optimization levels.  The g++
behavior is incorrect in this case.  Please correct it.

[Bug c++/71892] Recent optimization changes introduce bugs

2016-07-15 Thread kern at sibbald dot com
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71892

--- Comment #2 from Kern Sibbald  ---
Yes, we are aware of the option and how to fix the problem.  The issue is that
this optimization at the low levels of -O1 and -O2 is not reasonable, and it is
unreasonable and irresponsible to make such changes.  Just search the Internet
to see what kind of pain this creates -- all for nothing.

Some years ago, we would just have accepted these kinds of crippling and costly
changes.  This incident has caused bad versions of Bacula that seg fault
because of g++ to be distributed by at least two popular distros.  We are not
the only project to be affected.  For me this comes down to the philosophy of
how the gcc project treats its users: do you force risky changes on your users,
or do you try to protect them?  The gcc project has its view, but this result
is not acceptable to me.

Some years ago there was no alternative to g++, but faced with these kinds of
problems, which take months to fix because of "unstable" and "incompatible" new
compiler releases, I for one will now take a careful look at the alternatives.

I suggest that you (the gcc project) carefully reconsider whether making such
assumptions leading to risky optimizations is best practice.

[Bug c++/71885] Incorrect code generated with -O1, memset() function call is missing

2016-07-15 Thread kern at sibbald dot com
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71885

--- Comment #10 from Kern Sibbald  ---
Thanks for your definition of "undefined" behavior.  The problem here is that
at the time the compiler applied its optimization we were just in the process
of creating the object, as is permitted by the C++ language, and in that
process the g++ compiler made what I consider to be a very risky assumption
about the state of the object while it was being created.  I maintain that our
code is valid and should not be optimized away by the compiler.  This, in my
opinion, is a case of the compiler writer not realizing that his assumption was
not valid in our particular case (an object being created in an overridden
new).  The problem would be corrected in general by ensuring that no "unsafe"
optimizations are made at low levels of optimization.  At higher levels the
developer knows he risks problems.

[Bug c++/71885] Incorrect code generated with -O1, memset() function call is missing

2016-07-15 Thread kern at sibbald dot com
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71885

--- Comment #12 from Kern Sibbald  ---
Yes, I clearly understand your point.  My responses were meant for the project
and were not directed at you.  Hopefully someone will consider taking your
advice of not making this the default.  It may be difficult to backpedal, but
it is better than breaking tens and possibly hundreds of projects.

[Bug c++/71892] Recent optimization changes introduce bugs

2016-07-16 Thread kern at sibbald dot com
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71892

--- Comment #6 from Kern Sibbald  ---
As you say, everything has been said, and in any case it is clear that you are
going to stick with the current compiler behavior.  What you have failed to
understand is that I do very well understand that certain things are
"undefined".  My complaint is that once you have decided that something is
undefined, the compiler should at least print a warning message and then leave
the code alone.  This is what any reasonable compiler would do (IMO).  Instead
the compiler disregards what the programmer has written and elides code.  I
(personally) classify this as bad decision making (not conservative), because
it made the "undefined behavior" worse by introducing bugs.  Clearly this is my
opinion and not yours, so let's let it drop there, because it is your project
and your call.

[Bug c++/71885] Incorrect code generated with -O1, memset() function call is missing

2016-08-14 Thread kern at sibbald dot com
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71885

--- Comment #17 from Kern Sibbald  ---
It is pretty difficult to argue with the developers, because they know the
"rules" better than most programmers.  However, here, in my opinion, they used
very poor judgement by implementing a change that they were fully aware would
break many programs (they documented it as such).

It is one thing to "force" C++ programmers to do what the developers think is
correct, but it is a very bad thing to modify a compiler knowing that it will
break a lot of code.  There are probably thousands of users out there who are
now experiencing seg faults due to changes such as this.

Many programmers such as myself rely on C++, but we prefer not to run on
bleeding edge systems, and we do not have the means to test our software on all
new systems and new compilers.  In this case, various bleeding edge distros
switched to gcc 6.0 and compiled and released programs that seg fault out of
the box.  These distros do not have the time or personnel to test every single
program.  The result is that bad code is released.  This is and was
unnecessary.  Yes, we the programmers have options and can fix it.  What was
unnecessary was to release a new compiler that the developers knew would
produce code that fails.

The g++ developers could have realized that, especially in "undefined"
territory where they knew they would break code, the conservative way to
proceed without creating chaos is to add a new strict warning message for a
period of time (one to two years) before making their changes the default.

[Bug c++/71885] Incorrect code generated with -O1, memset() function call is missing

2016-08-15 Thread kern at sibbald dot com
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71885

--- Comment #23 from Kern Sibbald  ---
In response to the last post of Jonathan Wakely.

Thanks, that is the first time someone on the gcc side has said something that
makes sense.  Of course, possibly I missed or misunderstood previous arguments.

That said, I still think the project made a bad decision on this one, and spent
time implementing something that has broken a lot of code and caused users (not
just developers) a lot of grief.

[Bug c++/71892] Recent optimization changes introduce bugs

2016-08-27 Thread kern at sibbald dot com
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71892

--- Comment #9 from Kern Sibbald  ---
(In reply to Manuel López-Ibáñez from comment #7)
Your whippersnapper comments, which are personally insulting, are not at all
helpful.

[Bug c++/71892] Recent optimization changes introduce bugs

2016-08-27 Thread kern at sibbald dot com
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71892

--- Comment #10 from Kern Sibbald  ---
(In reply to anton from comment #8)
It is not productive, or conducive to good relations, for me to tell packagers
how to do their job.  The fact is that very few of them ever test the packages
they release, not because they are bad packagers but because they do not have
the time and perhaps the knowledge.

The kind of changes the gcc project is making these days are what gives open
source (free source if you like) a bad reputation.

[Bug c++/71892] Recent optimization changes introduce bugs

2016-08-27 Thread kern at sibbald dot com
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71892

--- Comment #11 from Kern Sibbald  ---
I recently discussed both of these "optimizations" with Bjarne Stroustrup and
his comment about deleting the memset() when overriding the new() functions
was:

   Looks like a bug to me

His comment about deleting the test for a NULL class pointer was:

  Yes.  According to the standard no object can exist at address 0, so the
optimization seems valid.


So, I repeat: please remove your "bug" that deletes memset() code and causes
program failures.

Despite what he says about removing the NULL pointer test, when you do so you
make it impossible in standard C++ to prevent a seg fault if someone calls a
class function (by error or on purpose) with a NULL pointer.  Please remove
this useless optimization.

Neither of these optimizations makes sense; they give no practical improvement
to the compiler and only serve to introduce failures into programs that would
otherwise run correctly.