How to fix java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.1.1


My custom NiFi processor fails with the error "Failed to process session due to Unrecognized Hadoop major version number: 3.1.1; Processor Administratively Yielded for 1 sec: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.1.1".

I developed a CustomSelectHiveQL processor in my project with the dependencies below. In the environment, the Hadoop version is 3.1.1.3.1.5.0-152. I tried using that exact version, but its jars could not be imported.

Here are the dependencies:

```xml
  <properties>
      <nifi.version>1.11.4</nifi.version>
      <hadoop.version>3.1.1</hadoop.version>
      <hive.version>3.1.0</hive.version>
  </properties>
```

 <artifactId>nifi-mddof-processors</artifactId>
  <packaging>jar</packaging>

  <dependencies>
      <dependency>
          <groupId>org.apache.nifi</groupId>
          <artifactId>nifi-api</artifactId>
          <version>${nifi.version}</version>
      </dependency>
      <dependency>
          <groupId>org.apache.nifi</groupId>
          <artifactId>nifi-utils</artifactId>
          <version>${nifi.version}</version>
          <scope>provided</scope>
      </dependency>
      <dependency>
          <groupId>org.apache.nifi</groupId>
          <artifactId>nifi-mock</artifactId>
          <version>${nifi.version}</version>
          <scope>test</scope>
      </dependency>
      <dependency>
          <groupId>org.slf4j</groupId>
          <artifactId>slf4j-simple</artifactId>
          <scope>test</scope>
      </dependency>
      <dependency>
          <groupId>junit</groupId>
          <artifactId>junit</artifactId>
          <scope>test</scope>
      </dependency>
      <dependency>
          <groupId>org.apache.nifi</groupId>
          <artifactId>nifi-record</artifactId>
          <scope>compile</scope>
      </dependency>

      <dependency>
          <groupId>org.apache.nifi</groupId>
          <artifactId>nifi-record-serialization-service-api</artifactId>
          <version>${nifi.version}</version>
      </dependency>
      <dependency>
          <groupId>org.apache.nifi</groupId>
          <artifactId>nifi-hadoop-utils</artifactId>
          <exclusions>
              <exclusion>
                  <groupId>org.apache.hadoop</groupId>
                  <artifactId>hadoop-common</artifactId>
              </exclusion>
          </exclusions>
          <version>${nifi.version}</version>
      </dependency>
      <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-common</artifactId>
          <version>${hadoop.version}</version>
          <scope>provided</scope>
      </dependency>
  </dependencies>
```


**Here are the error logs:**


```
2020-09-03 06:24:36,398 WARN [Timer-Driven Process Thread-21] o.a.n.controller.tasks.ConnectableTask Administratively Yielding CustomSelectHiveQL[id=4ef4abad-0174-1000-ffff-ffff913dd5aa] due to uncaught Exception: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.1.1
java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.1.1
      at org.apache.hadoop.hive.shims.ShimLoader.getMajorVersion(ShimLoader.java:174)
      at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:139)
      at org.apache.hadoop.hive.shims.ShimLoader.getHadoopThriftAuthBridge(ShimLoader.java:125)
      at org.apache.hive.service.auth.KerberosSaslHelper.getKerberosTransport(KerberosSaslHelper.java:54)
      at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:445)
      at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:201)
      at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
      at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
      at java.sql.DriverManager.getConnection(DriverManager.java:664)
      at java.sql.DriverManager.getConnection(DriverManager.java:270)
      at com.o2.edh.mddof.processors.customHive.CustomSelectHiveQL.getConnection(CustomSelectHiveQL.java:799)
      at com.o2.edh.mddof.processors.customHive.CustomSelectHiveQL.onTrigger(CustomSelectHiveQL.java:355)
      at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
      at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1176)
      at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:213)
      at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
      at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
      at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
      at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
      at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      at java.lang.Thread.run(Thread.java:748)
```
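The stack trace points at Hive's `ShimLoader.getMajorVersion`, which matches the Hadoop major version against a fixed table of shim implementations. The sketch below is a simplified illustration of that mechanism, not the actual Hive source (`ShimLoaderSketch` and `HADOOP_SHIM_CLASSES` are hypothetical names): a Hive client built before Hadoop 3 support only registers an entry for major version "2", so any "3.x.x" version string triggers exactly this exception.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified sketch of the shim-lookup logic that produces the error above.
public class ShimLoaderSketch {

    // Hypothetical shim table: Hive clients predating Hadoop 3 support
    // only register a shim class for major version "2".
    private static final Map<String, String> HADOOP_SHIM_CLASSES = new HashMap<>();
    static {
        HADOOP_SHIM_CLASSES.put("2", "org.apache.hadoop.hive.shims.Hadoop23Shims");
    }

    // Extracts the major version ("3.1.1" -> "3") and rejects versions
    // with no registered shim, mirroring the message seen in the NiFi log.
    public static String getMajorVersion(String version) {
        String major = version.split("\\.")[0];
        if (!HADOOP_SHIM_CLASSES.containsKey(major)) {
            throw new IllegalArgumentException(
                    "Unrecognized Hadoop major version number: " + version);
        }
        return major;
    }

    public static void main(String[] args) {
        System.out.println(getMajorVersion("2.6.2"));
        try {
            getMajorVersion("3.1.1");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

In other words, excluding `hadoop-common` from the build does not change which Hive shim table is on the classpath; the check lives inside the Hive client jar itself.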

You need to check which Hadoop version NiFi 1.11.4 was compiled against. It is probably older than 3.1.1.

Thanks for your reply. It was compiled against Hadoop 2.6.2, but I have already excluded the hadoop-common dependency from nifi-hadoop-utils; you can see that in the dependencies above.
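Since the version check happens inside the Hive client jar rather than in hadoop-common, the exclusion alone cannot fix it: the JDBC driver on the processor's classpath must itself be a Hadoop-3-aware build. One possible direction (a sketch, not a verified fix; the exact artifacts your NAR pulls in may differ) is to pin a Hive 3.x JDBC client explicitly:

```xml
<!-- Sketch: use a Hive 3.x JDBC client whose ShimLoader recognizes Hadoop 3.
     Check what actually ends up on the classpath with:
     mvn dependency:tree -Dincludes=org.apache.hadoop,org.apache.hive -->
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>${hive.version}</version>
    <exclusions>
        <!-- keep the cluster-provided Hadoop jars instead of Maven's copies -->
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>*</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

Alternatively, building the processor against the NiFi Hive 3 bundle (the `nifi-hive3` components, which NiFi ships for exactly this Hadoop 3 / Hive 3 combination) avoids mixing Hive client generations by hand.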