Java: Avro looking for SnappyCodec is throwing NoClassDefFoundError
I've scoured the web and found that this is a very common error, but none of the solutions helped me. I'm reading from a Kafka topic. I've had no problem doing this until now, but when running on a Flink cluster in an AWS environment (rather than in my IDE, IntelliJ), I get this error:
NoClassDefFoundError: org/xerial/snappy/Snappy
at org.apache.avro.file.SnappyCodec.decompress(SnappyCodec.java:58)
at org.apache.avro.file.DataFileStream$DataBlock.decompressUsing(DataFileStream.java:352)
at org.apache.avro.file.DataFileStream.hasNext(DataFileStream.java:199)
at flink.streaming.mtsas.functions.AvroDeserializationSchema.deserialize(AvroDeserializationSchema.java:37)
at org.apache.flink.streaming.util.serialization.KeyedDeserializationSchemaWrapper.deserialize(KeyedDeserializationSchemaWrapper.java:39)
at org.apache.flink.streaming.connectors.kafka.internal.Kafka09Fetcher.runFetchLoop(Kafka09Fetcher.java:145)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:255)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:87)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:55)
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:95)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:262)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:702)
at java.lang.Thread.run(Thread.java:745)
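A NoClassDefFoundError like this means org.xerial.snappy.Snappy was resolvable when the job was compiled but is absent from the classpath the cluster actually runs with. A minimal sketch for checking that (the class and message strings are mine, not from the job; run it with the same classpath/fat jar the task managers use):

```java
// Hypothetical probe: reports whether snappy-java's entry class is loadable
// from the current runtime classpath.
public class SnappyCheck {

    // True if org.xerial.snappy.Snappy can be loaded from this JVM's classpath.
    static boolean snappyOnClasspath() {
        try {
            Class.forName("org.xerial.snappy.Snappy");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(snappyOnClasspath()
            ? "snappy-java found on the classpath"
            : "snappy-java missing: Avro's SnappyCodec will fail at runtime");
    }
}
```

If this prints "missing" when launched with the shaded job jar, the problem is packaging/classpath rather than the code itself.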
From what I found online, simply put, the usual cause is that the version compiled against differs from the one present at runtime. But I'm just at a loss. Here is the pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>com.group.version1</groupId>
    <artifactId>parent</artifactId>
    <version>1.0-SNAPSHOT</version>
  </parent>
  <artifactId>this.artifact</artifactId>
  <properties>
    <flink.version>1.3.0</flink.version>
    <avro.version>1.8.1</avro.version>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>
  <dependencies>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-java</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-nifi_2.11</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-java_2.11</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>2.8.8</version>
    </dependency>
    <dependency>
      <groupId>com.google.code.gson</groupId>
      <artifactId>gson</artifactId>
      <version>2.8.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-core</artifactId>
      <version>2.8.1</version>
    </dependency>
    <dependency>
      <groupId>org.codehaus.groovy</groupId>
      <artifactId>groovy-all</artifactId>
      <version>2.4.5</version>
    </dependency>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>21.0</version>
    </dependency>
    <dependency>
      <groupId>com.googlecode.json-simple</groupId>
      <artifactId>json-simple</artifactId>
      <version>1.1.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro</artifactId>
      <version>${avro.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro-maven-plugin</artifactId>
      <version>${avro.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-kafka-0.10_2.11</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-cassandra_2.11</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-filesystem_2.10</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-avro</artifactId>
      <version>0.10.2</version>
    </dependency>
    <dependency>
      <groupId>com.datastax.cassandra</groupId>
      <artifactId>cassandra-driver-core</artifactId>
      <version>3.2.0</version>
    </dependency>
    <dependency>
      <groupId>com.blah</groupId>
      <artifactId>custom-resources</artifactId>
      <version>1.0</version>
    </dependency>
    <dependency>
      <groupId>com.blah</groupId>
      <artifactId>custom-executor</artifactId>
      <version>1.0</version>
    </dependency>
    <dependency>
      <groupId>net.sf.dozer</groupId>
      <artifactId>dozer</artifactId>
      <version>5.4.0</version>
    </dependency>
    <dependency>
      <groupId>com.group.version1</groupId>
      <artifactId>test-utils</artifactId>
      <scope>test</scope>
      <exclusions>
        <exclusion>
          <groupId>org.slf4j</groupId>
          <artifactId>slf4j-simple</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-statebackend-rocksdb_2.11</artifactId>
      <version>1.2.1</version>
    </dependency>
    <dependency>
      <groupId>com.datastax.cassandra</groupId>
      <artifactId>cassandra-driver-dse</artifactId>
      <version>3.0.0-rc1</version>
    </dependency>
    <dependency>
      <groupId>com.datastax.cassandra</groupId>
      <artifactId>dse-driver</artifactId>
      <version>1.1.2</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.7.2</version>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.avro</groupId>
        <artifactId>avro-maven-plugin</artifactId>
        <version>${avro.version}</version>
        <executions>
          <execution>
            <phase>generate-sources</phase>
            <goals>
              <goal>schema</goal>
            </goals>
            <configuration>
              <sourceDirectory>${project.basedir}/src/main/customavro/</sourceDirectory>
              <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <transformers>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                  <manifestEntries>
                    <Main-Class>flink.streaming.custom.CustomProcessor</Main-Class>
                  </manifestEntries>
                </transformer>
              </transformers>
              <filters>
                <filter>
                  <artifact>*:*</artifact>
                  <excludes>
                    <exclude>META-INF/*.SF</exclude>
                    <exclude>META-INF/*.DSA</exclude>
                    <exclude>META-INF/*.RSA</exclude>
                  </excludes>
                </filter>
              </filters>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-install-plugin</artifactId>
        <version>2.5</version>
        <executions>
          <execution>
            <id>inst_1</id>
            <phase>clean</phase>
            <goals>
              <goal>install-file</goal>
            </goals>
            <configuration>
              <groupId>com.blah</groupId>
              <artifactId>custom-resources</artifactId>
              <version>1.0</version>
              <packaging>jar</packaging>
              <file>${basedir}/lib/custom_resources-1.0.jar</file>
            </configuration>
          </execution>
          <execution>
            <id>inst_2</id>
            <phase>clean</phase>
            <goals>
              <goal>install-file</goal>
            </goals>
            <configuration>
              <groupId>com.blah</groupId>
              <artifactId>custom-executor</artifactId>
              <version>1.0</version>
              <packaging>jar</packaging>
              <file>${basedir}/lib/custom-executor-1.0.jar</file>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
I'm sure this isn't a very hard problem, but I've truly reached the point where trying one thing after another ends up doing more harm than good.
Thanks in advance for your help.
Edit:
I should also add that I can find Snappy.java at that path (org.xerial.snappy.Snappy) in my IDE. After talking to a contact at data Artisans (the creators of Flink), it turns out there is a bug in 1.3 that is the cause. Here is the link to the bug report.
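Until a Flink-side fix lands, one workaround that is often suggested for this class of packaging problem (an assumption on my part, not something confirmed by data Artisans) is to declare the snappy-java artifact, which actually contains org.xerial.snappy.Snappy, as an explicit dependency so the shade plugin bundles it into the fat jar:

```xml
<!-- snappy-java provides org.xerial.snappy.Snappy; 1.1.1.3 is the version
     Avro 1.8.1 builds against, but verify against your dependency tree -->
<dependency>
  <groupId>org.xerial.snappy</groupId>
  <artifactId>snappy-java</artifactId>
  <version>1.1.1.3</version>
</dependency>
```

After `mvn package`, running `jar tf` on the shaded jar and grepping for `org/xerial/snappy` should confirm whether the classes made it into the artifact that gets deployed to the cluster.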