Unable to access the Hadoop CLI after enabling Kerberos


I followed the tutorial below; the NameNode and DataNodes start correctly, and I can see all DataNodes listed in the WebUI (0.0.0.0:50070). However, I cannot access the Hadoop CLI. I also worked through this tutorial, but I still cannot use the Hadoop CLI:

[root@local9 hduser]# hadoop fs -ls /
20/11/03 12:24:32 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
20/11/03 12:24:32 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
20/11/03 12:24:32 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "local9/192.168.2.9"; destination host is: "local9":8020;
[root@local9 hduser]# klist
Ticket cache: KEYRING:persistent:0:krb_ccache_hVEAjWz
Default principal: hdfs/local9@FBSPL.COM

Valid starting       Expires              Service principal
11/03/2020 12:22:42  11/04/2020 12:22:42  krbtgt/FBSPL.COM@FBSPL.COM
        renew until 11/10/2020 12:22:12
[root@local9 hduser]# kinit -R
[root@local9 hduser]# klist
Ticket cache: KEYRING:persistent:0:krb_ccache_hVEAjWz
Default principal: hdfs/local9@FBSPL.COM

Valid starting       Expires              Service principal
11/03/2020 12:24:50  11/04/2020 12:24:50  krbtgt/FBSPL.COM@FBSPL.COM
        renew until 11/10/2020 12:22:12
[root@local9 hduser]# hadoop fs -ls /
20/11/03 12:25:04 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
20/11/03 12:25:04 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
20/11/03 12:25:04 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "local9/192.168.2.9"; destination host is: "local9":8020;
Any help would be greatly appreciated.

I solved this problem. It is a cached-credentials bug in Red Hat. I then found the following documentation on Kerberos on Cloudera's site:

In the end, the solution was to comment out the following line in /etc/krb5.conf:

default_ccache_name = KEYRING:persistent:%{uid}

After commenting out that line, I was able to access the Hadoop CLI.
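For reference, the change amounts to commenting out the default credential-cache setting in the `[libdefaults]` section of /etc/krb5.conf, so that libkrb5 falls back to a FILE-based cache that Java's GSS-API code can read. A sketch only (the realm is taken from the klist output above; the rest of the file is omitted and will differ per installation):

```ini
[libdefaults]
    default_realm = FBSPL.COM
    # Java's Kerberos/GSS implementation cannot read kernel-keyring
    # credential caches, so disable the KEYRING cache and let kinit
    # fall back to a FILE cache (e.g. /tmp/krb5cc_0):
    # default_ccache_name = KEYRING:persistent:%{uid}
```

After this change, re-run kinit so the new ticket lands in a FILE cache; klist should then report a cache of type FILE rather than KEYRING.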

Your first link says "for Hadoop, the principal should be of the format username/**fully.qualified.domain.name**@Your-REALM.COM". local9 is not a fully qualified domain name.

local9 is the fully qualified domain name; hostname -f prints local9.

No, it isn't.

I set the hostname to local9 with hostnamectl (both hostname and /etc/hostname are set to local9), so my FQDN is local9.
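The disagreement in the comments above comes down to whether local9 has a domain part at all: a bare label with no dot is a short hostname, not an FQDN, regardless of what hostname -f echoes back. A minimal sketch of that check (is_fqdn is a throwaway helper written for illustration, not a standard command):

```shell
#!/bin/sh
# is_fqdn: succeed only if the name contains a domain part (at least one dot).
# A bare label like "local9" is a short hostname, not an FQDN.
is_fqdn() {
    case "$1" in
        *.*) return 0 ;;
        *)   return 1 ;;
    esac
}

for name in "local9" "local9.fbspl.com"; do
    if is_fqdn "$name"; then
        echo "$name: fully qualified"
    else
        echo "$name: NOT fully qualified"
    fi
done
```

Compare against `hostname -f` on the node itself: if it prints a dotless name, a principal such as hdfs/local9@FBSPL.COM does not follow the username/fully.qualified.domain.name@REALM convention the tutorial describes.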