Hive javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided]

I get this error when I try to connect to the Hive metastore using a Spark SQL HiveContext.

I run it on a standalone cluster with a spark-submit command issued from my desktop, not from the Hadoop cluster.

Is this a security-related issue? Do I have to add something to hive-site.xml? Do any of the entries below need to be updated?

  <property>
    <name>hive.metastore.sasl.enabled</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.server2.authentication</name>
    <value>kerberos</value>
  </property>

The Spark version is 1.4.0, and hive-site.xml is placed under the conf folder.
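
For reference, the driver presumably looks something like the minimal sketch below. This is a hypothetical reconstruction from the stack trace (which names com.cap1.ct.SparkSQLHive), not the actual source; the app name and query are illustrative. It shows where the failure fires: the kerberized metastore connection is attempted while the HiveContext is being constructed.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.hive.HiveContext;

// Hypothetical sketch, not the original code.
public class SparkSQLHive {
    public static void main(String[] args) {
        JavaSparkContext sc =
                new JavaSparkContext(new SparkConf().setAppName("SparkSQLHive"));

        // hive-site.xml under Spark's conf folder is expected to be on the
        // driver classpath, so the HiveContext talks to the kerberized metastore.
        HiveContext hiveContext = new HiveContext(sc.sc());

        hiveContext.sql("SHOW TABLES").show(); // illustrative query

        sc.stop();
    }
}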

Below is the error log:

15/08/25 18:27:15 INFO HiveContext: Initializing execution hive, version 0.13.1
15/08/25 18:27:16 INFO metastore: Trying to connect to metastore with URI thrift://metastore.com:9083
15/08/25 18:27:16 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:336)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:214)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
        at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:105)
        at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
        at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
        at com.cap1.ct.SparkSQLHive.main(SparkSQLHive.java:17)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
        ... 35 more
15/08/25 18:27:16 WARN metastore: Failed to connect to the MetaStore Server...
15/08/25 18:27:16 INFO metastore: Waiting 1 seconds before next connection attempt.
15/08/25 18:27:17 INFO metastore: Trying to connect to metastore with URI thrift://metastore.com:9083
15/08/25 18:27:17 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:336)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:214)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
        at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:105)
        at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
        at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
        at com.cap1.ct.SparkSQLHive.main(SparkSQLHive.java:17)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
        ... 35 more

GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)

Cause: this occurs when no valid Kerberos credentials can be obtained. In particular, it happens if you want the underlying mechanism to obtain credentials itself (for example, from the Kerberos ticket cache) but forgot to indicate this by setting the javax.security.auth.useSubjectCredsOnly system property to false (for example via -Djavax.security.auth.useSubjectCredsOnly=false on the command that launches the JVM).
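
A minimal sketch of that fix, assuming a valid Kerberos TGT already exists in the driver machine's ticket cache (obtained with kinit); the class skeleton is illustrative, not a confirmed fix for this exact setup. The property has to be set before the HiveContext is constructed, because the stack trace shows the metastore login happening inside that constructor.

// Illustrative sketch; the rest of the driver stays as in the sketch shown with the question.
public class SparkSQLHive {
    public static void main(String[] args) {
        // Let JGSS obtain Kerberos credentials from the local ticket cache
        // instead of requiring them on the JAAS Subject.
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");

        // ... build the SparkConf / JavaSparkContext / HiveContext as before ...
    }
}

The same property can also be passed at submit time, without code changes, via spark-submit --driver-java-options "-Djavax.security.auth.useSubjectCredsOnly=false" (or --conf spark.driver.extraJavaOptions=...). Either way, kinit still has to be run on the desktop that launches the driver, since it sits outside the Hadoop cluster and otherwise has no TGT for the GSS handshake to find.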