Flink reading a file from S3 causes a Jackson dependency problem

Tags: java, aws-sdk, apache-flink

I am reading a configuration YAML file in my Flink application. I want to keep this configuration file on an S3 file system, but when I add the AWS SDK to my pom and try to read the file, I get the error below. I know it is caused by conflicting Jackson dependencies, but I have not been able to resolve it. Please help me fix this.

java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.enable([Lcom/fasterxml/jackson/core/JsonParser$Feature;)Lcom/fasterxml/jackson/databind/ObjectMapper;
    at com.amazonaws.partitions.PartitionsLoader.<init>(PartitionsLoader.java:54)
    at com.amazonaws.regions.RegionMetadataFactory.create(RegionMetadataFactory.java:30)
    at com.amazonaws.regions.RegionUtils.initialize(RegionUtils.java:65)
    at com.amazonaws.regions.RegionUtils.getRegionMetadata(RegionUtils.java:53)
    at com.amazonaws.regions.RegionUtils.getRegion(RegionUtils.java:107)
    at com.amazonaws.services.s3.AmazonS3Client.createSigner(AmazonS3Client.java:4016)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4913)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4872)
    at com.amazonaws.services.s3.AmazonS3Client.getObject(AmazonS3Client.java:1472)
    at com.bounce.processor.EventProcessor.main(EventProcessor.java:71)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:576)

Here is the code I use to read the file:

        AmazonS3 amazonS3Client = new AmazonS3Client(credentials);

        S3Object object = amazonS3Client.getObject(new GetObjectRequest(S3_PROD_BUCKET, para.get("topology")));
        InputStream awsinputStream = object.getObjectContent();
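As a side note, the stream returned by getObjectContent() holds the underlying HTTP connection open until it is fully read and closed, so it is worth draining it in a try-with-resources block. The helper below is a sketch with illustrative names (S3ConfigReader and readAll are not part of the AWS SDK); the question's awsinputStream would be passed to it:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// Illustrative helper (not part of the AWS SDK): drains an InputStream such as
// the one returned by object.getObjectContent() into a String, closing it when
// done so the underlying HTTP connection is released.
public class S3ConfigReader {
    public static String readAll(InputStream in) throws IOException {
        try (InputStream s = in;
             ByteArrayOutputStream buf = new ByteArrayOutputStream()) {
            byte[] chunk = new byte[8192];
            for (int n; (n = s.read(chunk)) != -1; ) {
                buf.write(chunk, 0, n);
            }
            return new String(buf.toByteArray(), StandardCharsets.UTF_8);
        }
    }
}
```

The resulting String can then be handed to a YAML parser such as Jackson's jackson-dataformat-yaml, which the POM already declares.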
Here is my pom.xml:

        <!-- Flink dependencies -->

        <dependency>
            <groupId>io.confluent</groupId>
            <artifactId>kafka-avro-serializer</artifactId>
            <version>5.3.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>commons-httpclient</groupId>
                    <artifactId>commons-httpclient</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.httpcomponents</groupId>
                    <artifactId>httpclient</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.httpcomponents</groupId>
                    <artifactId>httpcore</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>${flink.format.parquet.version}</version>
        </dependency>

        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.7</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-parquet_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>joda-time</groupId>
            <artifactId>joda-time</artifactId>
            <version>2.10.5</version>
        </dependency>

        <dependency>
            <groupId>com.fasterxml.jackson.dataformat</groupId>
            <artifactId>jackson-dataformat-yaml</artifactId>
            <version>2.10.2</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.esotericsoftware.yamlbeans/yamlbeans -->
        <dependency>
            <groupId>com.esotericsoftware.yamlbeans</groupId>
            <artifactId>yamlbeans</artifactId>
            <version>1.13</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.uber/h3 -->
        <dependency>
            <groupId>com.uber</groupId>
            <artifactId>h3</artifactId>
            <version>3.6.3</version>
        </dependency>

        <dependency>
            <groupId>com.github.davidmoten</groupId>
            <artifactId>geo</artifactId>
            <version>0.7.7</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-connector-elasticsearch6 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-elasticsearch7_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>tech.allegro.schema.json2avro</groupId>
            <artifactId>converter</artifactId>
            <version>0.2.9</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-statebackend-rocksdb_2.12</artifactId>
            <version>1.10.0</version>
            <scope>provided</scope>
        </dependency>

        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.13</version>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro-confluent-registry</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-elasticsearch7_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>


        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-bundle</artifactId>
            <version>1.11.756</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind -->
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.10.0</version>
        </dependency>

Working out the conflict purely from the POM is unlikely to succeed. Instead, consult the Maven dependency plugin by invoking:

mvn dependency:tree

This prints all dependencies along with the dependencies of those dependencies. That way you will be able to find which of the libraries you import have a transitive dependency on Jackson, and you can mark those as exclusions. You can also narrow the output to Jackson with mvn dependency:tree -Dincludes=com.fasterxml.jackson.core.


Note: what you really need to hunt for in this dependency tree are Jackson dependencies with differing versions, so there is no need to exclude all of them.
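Applied to this POM, that means adding an exclusions block to whichever dependency the tree flags, in the same way the pom.xml above already excludes the HTTP client artifacts from hadoop-common. The coordinates below are placeholders for the offending library, not a module taken from the POM:

```xml
<dependency>
    <!-- placeholder coordinates: substitute the library dependency:tree flagged -->
    <groupId>com.example</groupId>
    <artifactId>some-library</artifactId>
    <version>1.0.0</version>
    <exclusions>
        <exclusion>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```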

Comments:

- (asker) I tried this in my code: System.out.println(ObjectMapper.class.getProtectionDomain().getCodeSource()); and it prints: /usr/lib/flink/lib/flink-dist_2.11-1.9.1.jar
- Yeah, but that does not mean there is no conflict. You should really use the dependency tree.
- (asker) I have already tried mvn dependency:tree and tried excluding all of the Jackson dependencies, but it still does not resolve the problem. Please suggest a solution.
- Which Flink version are you using? Since Flink 1.4, Flink shades Jackson to avoid exactly this kind of conflict.
- (asker) I am using Flink 1.9.1. @RobertMetzger I think there is no problem in my jar; it is picking up the Jackson jar from the Flink lib dir /usr/l
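The diagnostic used in the comments generalizes to any class. The sketch below (a hypothetical helper, not code from the question) reports where a class was loaded from; JDK bootstrap classes carry no CodeSource, so those are reported with a marker. On the cluster, passing com.fasterxml.jackson.databind.ObjectMapper.class is what revealed flink-dist as the source of the conflicting Jackson:

```java
import java.security.CodeSource;

public class WhichJar {
    // Returns the jar or directory a class was loaded from, or a marker
    // for JDK bootstrap classes, which have no CodeSource.
    public static String locationOf(Class<?> c) {
        CodeSource cs = c.getProtectionDomain().getCodeSource();
        return cs == null ? "<bootstrap/JDK>" : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        // On the cluster one would pass ObjectMapper.class here instead.
        System.out.println(locationOf(WhichJar.class));
        System.out.println(locationOf(String.class)); // JDK class: no CodeSource
    }
}
```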