https://gcc.gnu.org/bugzilla/show_bug.cgi?id=95348

--- Comment #6 from Martin Liška <marxin at gcc dot gnu.org> ---
(In reply to qinzhao from comment #5)
> (In reply to Martin Liška from comment #4)
> > Can you please share some statistics: how big are the files and how many
> > runs do you merge?
> 
>   There were on the order of 10,000 processes. Source code coverage 
>   approximately 20%. Size of the profiling data gathered in the vicinity of
> 1TB.

Which means one run takes 100MB in size, right? And as you mentioned, with 1000
.gcda files per run, each file is about 0.1MB?

> 
> > Would it be possible to share 'gcov-dump -l' for all your .gcda files?
> 
> It is impossible: there are too many .gcda files. Each process has one
> directory, there are over 10,000 directories, and under each directory there
> are over a thousand .gcda files.
> 
> The situation is similar to the small test case I added in the first
> comment: functions and modules that never execute still have records in the
> .gcda file, just with zero counts.

Can you please provide a dump of one directory, or at least of a portion of the
.gcda files?
How common is it that an entire module is empty?
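A sketch of how such a sample could be collected, assuming the layout described above (one directory per process; the directory name `proc-00001` and file names here are made up for illustration):

```shell
# Stand-in directory and files mimicking one process's .gcda output;
# in the real setup these would already exist.
mkdir -p proc-00001
touch proc-00001/a.gcda proc-00001/b.gcda

# Dump only a handful of files from one directory instead of the full 1TB.
for f in proc-00001/*.gcda; do
    echo "== $f =="
    # gcov-dump -l "$f"   # uncomment where a GCC installation provides gcov-dump
done
```

The commented-out `gcov-dump -l` line is where the real record listing would be produced; redirecting the loop's output to a file gives a single attachable dump.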
