https://gcc.gnu.org/bugzilla/show_bug.cgi?id=98577
--- Comment #13 from Chinoune <mehdi.chinoune at hotmail dot com> ---
$ gfortran bug_gcc_tic.f90 -o test.x
count(int_32):
count_rate(int32) = 1000
count_rate(int64) = 1000
count_rate(real32) = 1000.00000
count_rate(real64) = 1000.0000000000000
count(int_64):
count_rate(int32) = 1000
count_rate(int64) = 1000000000
count_rate(real32) = 1000.00000
count_rate(real64) = 1000000000.0000000
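The source of bug_gcc_tic.f90 is not quoted in this comment; a minimal reconstruction (all variable names assumed) that calls system_clock the same way and would produce output of this shape might look like:

```fortran
program bug_gcc_tic
  use iso_fortran_env, only: int32, int64, real32, real64
  implicit none
  integer(int32) :: c32, rate_i32
  integer(int64) :: c64, rate_i64
  real(real32)   :: rate_r32
  real(real64)   :: rate_r64

  ! COUNT argument is int32; vary the kind of COUNT_RATE
  print '(a)', 'count(int_32):'
  call system_clock(c32, rate_i32)
  print *, 'count_rate(int32) =', rate_i32
  call system_clock(c32, rate_i64)
  print *, 'count_rate(int64) =', rate_i64
  call system_clock(c32, rate_r32)
  print *, 'count_rate(real32) =', rate_r32
  call system_clock(c32, rate_r64)
  print *, 'count_rate(real64) =', rate_r64

  ! COUNT argument is int64; vary the kind of COUNT_RATE
  print '(a)', 'count(int_64):'
  call system_clock(c64, rate_i32)
  print *, 'count_rate(int32) =', rate_i32
  call system_clock(c64, rate_i64)
  print *, 'count_rate(int64) =', rate_i64
  call system_clock(c64, rate_r32)
  print *, 'count_rate(real32) =', rate_r32
  call system_clock(c64, rate_r64)
  print *, 'count_rate(real64) =', rate_r64
end program bug_gcc_tic
```

With a sketch like this, the quoted output would mean that gfortran picks the clock resolution from the kind of the COUNT argument (millisecond resolution for int32, nanosecond for int64) rather than from the kind of COUNT_RATE.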
The odd thing with gfortran is that it reports the same count_rate value (1000) for all four kinds (int32, int64, real32, real64) when the first argument is int32, but it reports different values when the first argument is int64.

The question: what does the count_rate value depend on: its own precision, the precision of the first argument, or is this just a mixup?
