https://gcc.gnu.org/bugzilla/show_bug.cgi?id=109861
anlauf at gcc dot gnu.org changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
   Last reconfirmed|                            |2023-05-15
             Status|UNCONFIRMED                 |WAITING
     Ever confirmed|0                           |1
                 CC|                            |anlauf at gcc dot gnu.org
--- Comment #1 from anlauf at gcc dot gnu.org ---
I believe the sample code is misleading, and the behavior is expected:
SUBROUTINE h5aread_async_f(buf)
TYPE(C_PTR), INTENT(OUT) :: buf
You can see what happens if you specify the flag -fdump-tree-original as
part of F90FLAGS.
Now compare the tree dumps for INTENT(INOUT) vs. INTENT(OUT):
--- fcode.F90.005t.original.inout 2023-05-15 20:03:07.292148948 +0200
+++ fcode.F90.005t.original.out 2023-05-15 20:03:36.292208016 +0200
@@ -19,6 +19,7 @@
D.4223 = (void *) &attr_rdata0;
f_ptr = D.4223;
}
+ f_ptr = {CLOBBER};
h5aread_async_f (&f_ptr);
{
struct __st_parameter_dt dt_parm.0;
When the dummy argument buf is declared with INTENT(OUT), we mark the
actual argument in the caller with a CLOBBER, which tells the optimizer
that the variable's previous value does not matter at the call, so earlier
calculations and assignments to it may be discarded.
If you add a line
print '(Z16.16)', buf
into that subroutine, you'll see that the clobber annotation takes effect
once optimization is enabled: the value assigned in the caller before the
call is no longer guaranteed to be seen.
You might want to cross-check your testcase with the NAG compiler, or some
other compiler that provides a means of initializing INTENT(OUT) arguments,
in order to detect such code.