On Saturday, 2022-01-15 at 16:38 -0500, Paul Koning wrote:
> > On Jan 15, 2022, at 4:28 PM, Martin Sebor <mse...@gmail.com> wrote:
> > 
> > On 1/14/22 07:58, Paul Koning via Gcc wrote:
> > > > On Jan 14, 2022, at 9:15 AM, Michael Matz via Gcc <gcc@gcc.gnu.org> wrote:
> > > > 
> > > > > ...
> > > > But right now that's equivalent to making it observable,
> > > > because we don't have any other terms than observable or
> > > > undefined.  As alluded to later you would have to
> > > > introduce a new concept, something pseudo-observable,
> > > > which you then started doing.  So, see below.
> > > I find it really hard to view this notion of doing work for UB
> > > with any favor.  The way I see it is that a program having UB is
> > > synonymous with "defective program" and for the compiler to do
> > > extra work for these doesn't make much sense to me.
> > 
> > This is also the official position of the C committee on record,
> > but it's one that's now being challenged.

"nonportable or erroneous" is the official position.

> > > If the issue is specifically the handling of overflow traps,
> > > perhaps a better answer would be to argue for language changes
> > > that manage such events explicitly rather than having them be
> > > undefined behavior.  Another (similar) option might be to choose
> > > a language in which this is done.  (Is Ada such a language?  I
> > > don't remember.)
> > 
> > A change to the language standard is only feasible if it doesn't
> > overly constrain existing implementations. 
> 
> I was thinking that if a new feature is involved, rather than a new
> definition of behavior for existing code, it wouldn't be a constraint
> on existing implementations (in the sense of "what the compiler does
> for existing code written to the current rules").  In other words,
> suppose there was a concept of "trapping operations" that could be
> enabled by some new mechanism in the program text.  If you use that,
> then you're asking the compiler to do more work and your code may get
> slower or bigger.  But if you don't, the existing rules apply and
> nothing bad happens (other than that the compiler is somewhat larger
> and more complex due to the support for both cases).

There are also different proposals for doing something like this,
e.g. making certain undefined behaviour defined as trapping
operations, either as a language variant or by default.
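
For illustration, the "trapping by default" variant can already be
approximated with existing GCC options; -ftrapv and
-fsanitize=signed-integer-overflow are real flags, the test program
itself is only a sketch:

  #include <limits.h>
  #include <stdio.h>

  /* Plain ISO C addition; whether overflow is UB, aborts, or is
     diagnosed depends only on how the file is compiled:
       gcc overflow.c                                 -> overflow is UB
       gcc -ftrapv overflow.c                         -> aborts at run time
       gcc -fsanitize=signed-integer-overflow overflow.c
                                                      -> run-time diagnostic */
  int main(int argc, char **argv)
  {
      (void)argv;
      int a = INT_MAX;
      printf("%d\n", a + argc);   /* argc is 1 on a normal invocation,
                                     so this addition overflows */
      return 0;
  }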

But this is not my idea here; I want to limit the impact of UB
on defective programs - accepting the reality that in the real
world programs often have defects, and that any serious field of
engineering needs to deal with this in a better way than to
say "the ISO standard says no requirements - so you lose".

Imagine an aerospace, biomedical, mechanical, or civil engineer
saying: "It makes no sense to plan for the case where one part
fails; that is then just a defective airplane/CT scanner/car/bridge.
It is not worth spending extra resources on it, and a non-defective
airplane might potentially be a little bit slower if we were to give
you some guarantees in this failure case.  First you need to show
that this has no performance impact at all for anybody anywhere;
then maybe we will consider it."  (While, at the same time, there
is quite substantial damage being caused by defective C programs.)

I thought that limiting the impact of UB on defined I/O that
precedes it would be a rather modest step towards more reliable
software, considering that this is already the case for most I/O
and it seems only some volatile accesses would need fixing (where
I still do not see how this could affect performance anywhere it
actually matters).
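
To make the volatile case concrete, a minimal sketch (the names are
made up): the question is simply whether the store below is still
guaranteed to happen when y turns out to be zero.

  volatile int device_reg;        /* stands in for memory-mapped I/O */

  int f(int x, int y)
  {
      device_reg = 1;             /* observable side effect ...       */
      return x / y;               /* ... sequenced before possible UB
                                     (division by zero if y == 0)     */
  }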


Martin



