Hi all,
My environment: CentOS 7.2, Hadoop 2.7.3, JDK 1.8, kerberos 1.13.2-12.el7_2. My Hadoop user is hadoop. I can start HDFS with sbin/start-dfs.sh and I can see the DataNode on the web UI, but when I run hdfs dfs -ls / I get this error:
[hadoop@dmp1 ~]$ hdfs dfs -ls /
16/09/20 15:00:44 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "dmp1.example.com/192.168.249.129"; destination host is: "dmp1.example.com":9000;

[hadoop@dmp1 ~]$ klist
Ticket cache: KEYRING:persistent:1004:1004
Default principal: [email protected]

Valid starting       Expires              Service principal
09/20/2016 14:57:34  09/21/2016 14:57:31  krbtgt/[email protected]
        renew until 09/27/2016 14:57:31
[hadoop@dmp1 ~]$

Is it because my JDK is 1.8?

2016-09-21
lk_hadoop
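One thing I noticed in the klist output above is the cache type: KEYRING:persistent, which is the CentOS 7 default. As far as I know, the JDK's built-in Kerberos code can only read FILE-based ticket caches, so this is a sketch of how I plan to test that guess (the path /tmp/krb5cc_hadoop is just a hypothetical example, not something from my config):

```shell
# Guess to be verified: JDK 8 reads only FILE: ticket caches, not the
# KEYRING:persistent cache shown by klist above. Point KRB5CCNAME at a
# plain file cache, re-kinit, and retry the hdfs command.
export KRB5CCNAME=FILE:/tmp/krb5cc_hadoop   # hypothetical cache path
echo "$KRB5CCNAME"
# then re-obtain a ticket into the file cache and retry:
#   kinit hadoop
#   hdfs dfs -ls /
# A permanent change would be setting default_ccache_name in
# /etc/krb5.conf to a FILE: path instead of KEYRING:persistent:%{uid}.
```

If the command works with the file cache, the problem is the cache type rather than JDK 1.8 itself.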
