Java Cloudera Oozie Spark JDBC access to Hive: GSS initiate failed

Tags: java, apache-spark, hive, cloudera, oozie

I am trying to connect to Hive from Spark over a JDBC connection. When I launch the job with spark-submit from a terminal, it completes successfully, but when the same job runs inside an Oozie workflow it fails with the error org.apache.hive.jdbc.Utils - Unable to read HiveServer2 configs from ZooKeeper.
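
For reference, the call that fails is DataLoadMain.scala:52 in the stack trace below: a plain java.sql.DriverManager.getConnection against a HiveServer2 JDBC URL that uses ZooKeeper service discovery. The snippet itself is not in the post, so the following is only a sketch; the object name is hypothetical, and the ZooKeeper client port (2181), target database, znode namespace, and HiveServer2 principal are assumptions, while the quorum hosts, driver class, and discovery mode come from the error log.

// Minimal sketch of the failing JDBC call (DataLoadMain.scala:52 in the
// stack trace below). Port 2181, the "default" database, the "hiveserver2"
// namespace, and the principal are assumptions; the hosts, the driver
// class, and the DriverManager call are taken from the logs.
import java.sql.{Connection, DriverManager}

object DataLoadMainSketch extends App {
  // HiveServer2 discovery through ZooKeeper, which is the code path behind
  // "Unable to read HiveServer2 configs from ZooKeeper".
  val url = "jdbc:hive2://master1.cloudera.com:2181,master2.cloudera.com:2181," +
    "master3.cloudera.com:2181/default;serviceDiscoveryMode=zooKeeper;" +
    "zooKeeperNamespace=hiveserver2;principal=hive/_HOST@CLOUDERA.COM"

  Class.forName("org.apache.hive.jdbc.HiveDriver")
  val conn: Connection = DriverManager.getConnection(url)
  try {
    val rs = conn.createStatement().executeQuery("SHOW DATABASES")
    while (rs.next()) println(rs.getString(1))
  } finally conn.close()
}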

Here are my code snippets.

Spark-submit command
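
The command itself did not survive in this copy of the post. The version below is reconstructed from the <spark-opts>, <class>, and <jar> values in the workflow definition that follows, so treat it as a best-effort guess rather than the exact invocation:

spark-submit \
  --master yarn \
  --deploy-mode client \
  --keytab /var/tmp/saikat.keytab \
  --principal saikat@CLOUDERA.COM \
  --conf spark.yarn.appMasterEnv.HADOOP_JAAS_DEBUG=true \
  --conf spark.yarn.appMasterEnv.SPARK_HOME=/opt/cloudera/parcels/CDH/lib/spark/ \
  --py-files hive-jdbc-3.1.3000.7.1.3.0-100.jar \
  --class com.example.DataLoadMain \
  spark-hive-load_2.11-0.1.jar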

Oozie workflow XML

<workflow-app name="Spark Workflow" xmlns="uri:oozie:workflow:0.5">
    <start to="spark-cd37"/>
    <kill name="Kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="spark-b291">
        <spark xmlns="uri:oozie:spark-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <master>yarn</master>
            <mode>client</mode>
            <name></name>
            <class>com.example.DataLoadMain</class>
            <jar>spark-hive-load_2.11-0.1.jar</jar>
            <spark-opts>--master yarn --deploy-mode client --keytab /var/tmp/saikat.keytab --principal saikat@CLOUDERA.COM --conf spark.yarn.appMasterEnv.HADOOP_JAAS_DEBUG=true --conf spark.yarn.appMasterEnv.SPARK_HOME=/opt/cloudera/parcels/CDH/lib/spark/ --py-files hive-jdbc-3.1.3000.7.1.3.0-100.jar</spark-opts>
            <file>/user/saikat/spark-hive-load_2.11-0.1.jar#spark-hive-load_2.11-0.1.jar</file>
        </spark>
        <ok to="End"/>
        <error to="Kill"/>
    </action>
    <end name="End"/>
</workflow-app>
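
One detail worth flagging (an editorial assumption, not something the post confirms): the root cause in the log below is "Failed to find any Kerberos tgt", and an Oozie launcher container normally carries only Hadoop delegation tokens rather than a Kerberos TGT, so a Hive JDBC connection that authenticates with GSSAPI has nothing to negotiate with. A common mitigation is to log in explicitly from the keytab the workflow already references before opening the connection; a minimal sketch follows, where the object and method names are hypothetical, the principal and keytab path mirror the <spark-opts> above, and UserGroupInformation is the standard hadoop-common API:

// Sketch: explicit Kerberos login from the keytab before the JDBC call,
// so the launcher JVM holds a TGT instead of only delegation tokens.
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation

object KerberosLoginSketch {
  def loginFromKeytab(): Unit = {
    val conf = new Configuration()
    conf.set("hadoop.security.authentication", "kerberos")
    UserGroupInformation.setConfiguration(conf)
    UserGroupInformation.loginUserFromKeytab(
      "saikat@CLOUDERA.COM",   // principal from the workflow definition
      "/var/tmp/saikat.keytab" // keytab path from the workflow definition
    )
    // ...then open the JDBC connection as usual.
  }
}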

Oozie Spark action error log

org.apache.hive.jdbc.HiveDriver
17:49:58.240 [main-EventThread] ERROR org.apache.curator.framework.imps.EnsembleTracker - Invalid config event received: {server.1=master3.cloudera.com:3181:4181:participant, version=0, server.3=master1.cloudera.com:3181:4181:participant, server.2=master2.cloudera.com:3181:4181:participant}
17:49:58.251 [main-EventThread] ERROR org.apache.curator.framework.imps.EnsembleTracker - Invalid config event received: {server.1=master3.cloudera.com:3181:4181:participant, version=0, server.3=master1.cloudera.com:3181:4181:participant, server.2=master2.cloudera.com:3181:4181:participant}
17:49:58.401 [main] ERROR org.apache.thrift.transport.TSaslTransport - SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[?:1.8.0_251]
    at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[hive-exec-3.1.3000.7.1.3.0-100.jar:3.1.3000.7.1.3.0-100]
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [hive-exec-3.1.3000.7.1.3.0-100.jar:3.1.3000.7.1.3.0-100]
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [hive-exec-3.1.3000.7.1.3.0-100.jar:3.1.3000.7.1.3.0-100]
    at org.apache.hadoop.hive.metastore.security.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:51) [hive-standalone-metastore-3.1.3000.7.1.3.0-100.jar:3.1.3000.7.1.3.0-100]
    at org.apache.hadoop.hive.metastore.security.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:48) [hive-standalone-metastore-3.1.3000.7.1.3.0-100.jar:3.1.3000.7.1.3.0-100]
    at java.security.AccessController.doPrivileged(Native Method) [?:1.8.0_251]
    at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_251]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1876) [hadoop-common.jar:?]
    at org.apache.hadoop.hive.metastore.security.TUGIAssumingTransport.open(TUGIAssumingTransport.java:48) [hive-standalone-metastore-3.1.3000.7.1.3.0-100.jar:3.1.3000.7.1.3.0-100]
    at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:441) [hive-jdbc-3.1.3000.7.1.3.0-100.jar:3.1.3000.7.1.3.0-100]
    at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:317) [hive-jdbc-3.1.3000.7.1.3.0-100.jar:3.1.3000.7.1.3.0-100]
    at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [hive-jdbc-3.1.3000.7.1.3.0-100.jar:3.1.3000.7.1.3.0-100]
    at java.sql.DriverManager.getConnection(DriverManager.java:664) [?:1.8.0_251]
    at java.sql.DriverManager.getConnection(DriverManager.java:247) [?:1.8.0_251]
    at com.example.DataLoadMain$.delayedEndpoint$com$example$DataLoadMain$1(DataLoadMain.scala:52) [spark-hive-load_2.11-0.1.jar:0.1]
    at com.example.DataLoadMain$delayedInit$body.apply(DataLoadMain.scala:11) [spark-hive-load_2.11-0.1.jar:0.1]
    at scala.Function0$class.apply$mcV$sp(Function0.scala:34) [scala-library-2.11.12.jar:?]
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12) [scala-library-2.11.12.jar:?]
    at scala.App$$anonfun$main$1.apply(App.scala:76) [scala-library-2.11.12.jar:?]
    at scala.App$$anonfun$main$1.apply(App.scala:76) [scala-library-2.11.12.jar:?]
    at scala.collection.immutable.List.foreach(List.scala:392) [scala-library-2.11.12.jar:?]
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35) [scala-library-2.11.12.jar:?]
    at scala.App$class.main(App.scala:76) [scala-library-2.11.12.jar:?]
    at com.example.DataLoadMain$.main(DataLoadMain.scala:11) [spark-hive-load_2.11-0.1.jar:0.1]
    at com.example.DataLoadMain.main(DataLoadMain.scala) [spark-hive-load_2.11-0.1.jar:0.1]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_251]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_251]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_251]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52) [spark-core_2.11-2.4.0.7.1.3.0-100.jar:2.4.0.7.1.3.0-100]
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:847) [spark-core_2.11-2.4.0.7.1.3.0-100.jar:2.4.0.7.1.3.0-100]
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161) [spark-core_2.11-2.4.0.7.1.3.0-100.jar:2.4.0.7.1.3.0-100]
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184) [spark-core_2.11-2.4.0.7.1.3.0-100.jar:2.4.0.7.1.3.0-100]
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86) [spark-core_2.11-2.4.0.7.1.3.0-100.jar:2.4.0.7.1.3.0-100]
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:922) [spark-core_2.11-2.4.0.7.1.3.0-100.jar:2.4.0.7.1.3.0-100]
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:931) [spark-core_2.11-2.4.0.7.1.3.0-100.jar:2.4.0.7.1.3.0-100]
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) [spark-core_2.11-2.4.0.7.1.3.0-100.jar:2.4.0.7.1.3.0-100]
    at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:187) [oozie-sharelib-spark-5.1.0.7.1.3.0-100.jar:?]
    at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:94) [oozie-sharelib-spark-5.1.0.7.1.3.0-100.jar:?]
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:107) [oozie-sharelib-oozie-5.1.0.7.1.3.0-100.jar:?]
    at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:61) [oozie-sharelib-spark-5.1.0.7.1.3.0-100.jar:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_251]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_251]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_251]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
    at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:413) [oozie-sharelib-oozie-5.1.0.7.1.3.0-100.jar:?]
    at org.apache.oozie.action.hadoop.LauncherAM.access$400(LauncherAM.java:55) [oozie-sharelib-oozie-5.1.0.7.1.3.0-100.jar:?]
    at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:226) [oozie-sharelib-oozie-5.1.0.7.1.3.0-100.jar:?]
    at java.security.AccessController.doPrivileged(Native Method) [?:1.8.0_251]
    at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_251]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1876) [hadoop-common.jar:?]
    at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:220) [oozie-sharelib-oozie-5.1.0.7.1.3.0-100.jar:?]
    at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:156) [oozie-sharelib-oozie-5.1.0.7.1.3.0-100.jar:?]
    at java.security.AccessController.doPrivileged(Native Method) [?:1.8.0_251]
    at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_251]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1876) [hadoop-common.jar:?]
    at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:144) [oozie-sharelib-oozie-5.1.0.7.1.3.0-100.jar:?]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:148) ~[?:1.8.0_251]
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) ~[?:1.8.0_251]
    at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:189) ~[?:1.8.0_251]
    at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) ~[?:1.8.0_251]
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[?:1.8.0_251]
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_251]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_251]
    ... 57 more
17:49:58.418 [main-EventThread] ERROR org.apache.curator.framework.imps.EnsembleTracker - Invalid config event received: {server.1=master3.cloudera.com:3181:4181:participant, version=0, server.3=master1.cloudera.com:3181:4181:participant, server.2=master2.cloudera.com:3181:4181:participant}
17:49:58.418 [main-EventThread] ERROR org.apache.curator.framework.imps.EnsembleTracker - Invalid config event received: {server.1=master3.cloudera.com:3181:4181:participant, version=0, server.3=master1.cloudera.com:3181:4181:participant, server.2=master2.cloudera.com:3181:4181:participant}
17:49:58.521 [main] ERROR org.apache.hive.jdbc.Utils - Unable to read HiveServer2 configs from ZooKeeper