Oozie Spark job fails on insert overwrite with dynamic partitions into a Hive partitioned table


I am trying to insert dynamically into a partitioned Hive table using Spark. I used the following code:

// datasetIFlag() returns the Dataset<Row> to write; status_flag is the dynamic partition column.
String query = "insert overwrite table %s.%s partition(status_flag) select * from %s";
datasetIFlag().createOrReplaceTempView(metadata.getTargetStoreTableName() + "_tmp");
spark.sql(String.format(query, metadata.getTargetStoreDBName(), metadata.getTargetStoreTableName(), metadata.getTargetStoreTableName() + "_tmp"));
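
For example, if getTargetStoreDBName() returned a hypothetical database name mydb and getTargetStoreTableName() returned test_table (the table named in the error below), the formatted statement passed to spark.sql would be:

insert overwrite table mydb.test_table partition(status_flag) select * from test_table_tmp
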
I also made sure that status_flag is the last column of the dataset. I am using Spark 2.2.0 and Hive 1.1 with CDH 5.11, and I enabled non-strict dynamic partitioning when creating the Spark session (a sketch of that setup is shown below). When Spark tries to run the query, it fails with the exception shown after the sketch; any thoughts on what is causing it?
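
The session is created roughly like this sketch, not the exact code from the job: the application name is a placeholder, and the two hive.exec.dynamic.partition properties are the standard Hive settings for non-strict dynamic partitioning.

import org.apache.spark.sql.SparkSession;

SparkSession spark = SparkSession.builder()
        .appName("SparkDataPersistPipeline")                     // placeholder application name
        .enableHiveSupport()                                     // use the Hive metastore as the catalog
        .config("hive.exec.dynamic.partition", "true")
        .config("hive.exec.dynamic.partition.mode", "nonstrict") // every partition column may be dynamic
        .getOrCreate();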

    18/06/11 00:49:18 INFO Hive: New loading path = hdfs://***/***/***/complete/hive/**/**/**/**/1528642150383/.hive-staging_hive_2018-06-11_00-49-16_284_2447412172536723108-1/-ext-10000/status_flag=I with partSpec {status_flag=I}
    18/06/11 00:49:19 INFO FileUtils: Creating directory if it doesn't exist: hdfs://***/***/***/complete/hive/**/**/**/**/1528642150383/status_flag=I
    18/06/11 00:49:19 ERROR ApplicationMaster: User class threw exception: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Exception when loading 1 in table test_table with loadPath=hdfs://***/***/***/complete/hive/**/**/**/**/1528642150383/.hive-staging_hive_2018-06-11_00-49-16_284_2447412172536723108-1/-ext-10000;
    org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Exception when loading 1 in table test_table with loadPath=hdfs://***/***/***/complete/hive/**/**/**/**/1528642150383/.hive-staging_hive_2018-06-11_00-49-16_284_2447412172536723108-1/-ext-10000;
            at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:108)
            at org.apache.spark.sql.hive.HiveExternalCatalog.loadDynamicPartitions(HiveExternalCatalog.scala:891)
            at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.run(InsertIntoHiveTable.scala:331)
            at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
            at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
            at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:67)
            at org.apache.spark.sql.Dataset.<init>(Dataset.scala:182)
            at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:67)
            at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:623)
            at test.common.utils.HDFSUtility.writeIDRecords(HDFSUtility.java:353)
            at test.persist.impl.DataPersistServiceImpl.execute(DataPersistServiceImpl.java:140)
            at test.pipeline.SparkDataPersistPipeline.execute(SparkDataPersistPipeline.java:131)
            at test.pipeline.SparkDataPersistPipeline.main(SparkDataPersistPipeline.java:67)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:686)
    Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Exception when loading 1 in table test_table with loadPath=hdfs://***/***/***/complete/hive/**/**/**/**/1528642150383/.hive-staging_hive_2018-06-11_00-49-16_284_2447412172536723108-1/-ext-10000
            at org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions(Hive.java:1714)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.spark.sql.hive.client.Shim_v0_14.loadDynamicPartitions(HiveShim.scala:772)
            at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$loadDynamicPartitions$1.apply$mcV$sp(HiveClientImpl.scala:698)
            at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$loadDynamicPartitions$1.apply(HiveClientImpl.scala:696)
            at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$loadDynamicPartitions$1.apply(HiveClientImpl.scala:696)
            at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:275)
            at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:216)
            at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:215)
            at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:258)
            at org.apache.spark.sql.hive.client.HiveClientImpl.loadDynamicPartitions(HiveClientImpl.scala:696)
            at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$loadDynamicPartitions$1.apply$mcV$sp(HiveExternalCatalog.scala:903)
            at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$loadDynamicPartitions$1.apply(HiveExternalCatalog.scala:891)
            at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$loadDynamicPartitions$1.apply(HiveExternalCatalog.scala:891)
            at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:99)
            ... 17 more
    Caused by: java.util.concurrent.ExecutionException: java.lang.NoSuchMethodError: org.apache.hadoop.hdfs.client.HdfsAdmin.getKeyProvider()Lorg/apache/hadoop/crypto/key/KeyProvider;
            at java.util.concurrent.FutureTask.report(FutureTask.java:122)
            at java.util.concurrent.FutureTask.get(FutureTask.java:192)
            at org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions(Hive.java:1706)
            ... 34 more
    Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hdfs.client.HdfsAdmin.getKeyProvider()Lorg/apache/hadoop/crypto/key/KeyProvider;
            at org.apache.hadoop.hive.shims.Hadoop23Shims$HdfsEncryptionShim.<init>(Hadoop23Shims.java:1265)
            at org.apache.hadoop.hive.shims.Hadoop23Shims.createHdfsEncryptionShim(Hadoop23Shims.java:1407)
            at org.apache.hadoop.hive.ql.session.SessionState.getHdfsEncryptionShim(SessionState.java:464)
            at org.apache.hadoop.hive.ql.metadata.Hive.needToCopy(Hive.java:2973)
            at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:2874)
            at org.apache.hadoop.hive.ql.metadata.Hive.replaceFiles(Hive.java:3199)
            at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1465)
            at org.apache.hadoop.hive.ql.metadata.Hive$2.call(Hive.java:1685)
            at org.apache.hadoop.hive.ql.metadata.Hive$2.call(Hive.java:1676)
            at java.util.concurrent.FutureTask.run(FutureTask.java:266)
            at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
            at java.lang.Thread.run(Thread.java:745)