Hi,

I suggest you use the shell commands for accessing cluster info instead of the curl 
command.

For the HDFS shell commands you can refer to:

https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/HDFSCommands.html

For the YARN shell commands you can refer to:

https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/YarnCommands.html#application
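For example, a few commonly used commands (a minimal sketch; the output depends on your cluster state):

```shell
# Show an HDFS cluster summary: capacity, live and dead datanodes
hdfs dfsadmin -report

# List files at the HDFS root
hdfs dfs -ls /

# List running YARN applications
yarn application -list

# Show the status of YARN node managers
yarn node -list
```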


Regards,
Surendra
From: Atul Rajan
To: surendra lilhore
Cc: [email protected]
Date: 2017-08-24 21:43:30
Subject: Re: Data streamer java exception

Hello Team,

I resolved this issue by allowing, in the firewall/IP routing table, the specific 
ports used by the namenode as well as the datanodes. Now HDFS is running and the 
cluster is up.
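For reference, on RHEL 7 this kind of port opening is typically done with firewalld. A minimal sketch, assuming the Hadoop 3.x default ports (adjust to whatever your hdfs-site.xml actually configures):

```shell
# Open the NameNode RPC port (8020 is a common default; check fs.defaultFS)
sudo firewall-cmd --permanent --add-port=8020/tcp

# Open the NameNode web UI port (9870 is the Hadoop 3.x default)
sudo firewall-cmd --permanent --add-port=9870/tcp

# Open the DataNode data-transfer port (9866 is the Hadoop 3.x default)
sudo firewall-cmd --permanent --add-port=9866/tcp

# Apply the new rules
sudo firewall-cmd --reload
```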

Thanks a lot for the suggestion. Now I have another question about the interface: 
since I am running a console-only view of RHEL, is there any way I can connect to 
the web interface by URL so that the interface and job details are visible?
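The daemon web UIs are plain HTTP, so they can be opened in a browser from any machine that can reach the cluster nodes. A sketch assuming the Hadoop 3.x default ports (yours may differ if overridden in hdfs-site.xml or yarn-site.xml; the hostnames are placeholders):

```shell
# NameNode web UI: filesystem and datanode status (Hadoop 3.x default port 9870)
# Open in a browser:  http://<namenode-host>:9870/
curl -s http://<namenode-host>:9870/ | head

# ResourceManager web UI: running jobs and application details (default port 8088)
# Open in a browser:  http://<resourcemanager-host>:8088/
curl -s http://<resourcemanager-host>:8088/ | head
```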

On 24 August 2017 at 19:18, surendra lilhore 
<[email protected]> wrote:
Hi Atul,

Could you please share the datanode exception logs? Also check whether the namenode 
and datanode hostname mappings in /etc/hosts are correct.
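A minimal sketch of consistent /etc/hosts entries (hostnames and addresses here are placeholders; the same entries should appear on every node, and no cluster hostname should resolve to 127.0.0.1):

```shell
# /etc/hosts on every node in the cluster
192.168.1.10   namenode1
192.168.1.11   datanode1
192.168.1.12   datanode2
192.168.1.13   datanode3
```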

The put operation is failing because the datanodes are not connected to the namenode.

-Surendra

From: Atul Rajan [mailto:[email protected]]
Sent: 24 August 2017 09:32
To: [email protected]
Subject: Data streamer java exception

Hello Team,

I am setting up a Hadoop 3.0 alpha cluster of 4 nodes on RHEL 7.2. Everything is 
set up (namenode, datanode, resource manager, node manager), but the datanode is 
not able to connect to the namenode; I am getting retry logs in the datanode.
Also, when copying files from local to HDFS, data streamer Java exceptions are 
being thrown.

Can you please help me out here?
Thanks and Regards
Atul Rajan

-Sent from my iPhone



--
Best Regards
Atul Rajan
