On one of the Hadoop clusters I am working with:

85,985,789 files and directories, 58,399,919 blocks = 144,385,717 total file system objects

Heap memory used: 132.0 GB of 256 GB.

It seems odd that the number of files is so much higher than the number of blocks; that ratio usually points to a small-files problem.

But the cluster is working fine. Am I worrying unnecessarily? We are using Hadoop 2.6.0.

Thanks
Sudhir
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]