Hi Sudhir,

According to my calculations, based on the total number of file system
objects (144,385,717), the expected heap usage comes out close to 132 GB
of heap memory. I think you are doing fine.

Thanks,

Ramdas

On Tue, Jan 29, 2019 at 5:09 PM Sudhir Babu Pothineni <[email protected]>
wrote:

>
> On one of the Hadoop clusters I am working on:
>
> 85,985,789 files and directories, 58,399,919 blocks = 144,385,717 total
> file system objects
>
> Heap memory used 132.0 GB of 256 GB Heap Memory.
>
> I feel it's odd that files outnumber blocks by such a wide ratio, which
> points to a small files problem.
>
> But the cluster is working fine. Am I worrying unnecessarily? We are
> using Hadoop 2.6.0.
>
> Thanks
> Sudhir
