Java: NoSuchFieldException when creating a Spark session with the builder


I am trying to create a Spark session like this:

sparkSession = SparkSession.builder().appName(appName).master("local")
                .config("hive.metastore.uris", thriftURL).enableHiveSupport().getOrCreate();
But it fails with a NoSuchFieldException, as shown below:

2020-10-27 20:51:26.963  WARN 11206 --- [  restartedMain] org.apache.hadoop.util.NativeCodeLoader  : Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-10-27 20:51:27.053  INFO 11206 --- [  restartedMain] org.apache.spark.SparkContext            : Submitted application: HdfsPoc
2020-10-27 20:51:27.328 ERROR 11206 --- [  restartedMain] org.apache.spark.SparkContext            : Error initializing SparkContext.

java.lang.RuntimeException: java.lang.NoSuchFieldException: DEFAULT_TINY_CACHE_SIZE
    at org.apache.spark.network.util.NettyUtils.getPrivateStaticField(NettyUtils.java:131) ~[spark-network-common_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:118) ~[spark-network-common_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.network.server.TransportServer.init(TransportServer.java:95) ~[spark-network-common_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.network.server.TransportServer.<init>(TransportServer.java:74) ~[spark-network-common_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.network.TransportContext.createServer(TransportContext.java:114) ~[spark-network-common_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.rpc.netty.NettyRpcEnv.startServer(NettyRpcEnv.scala:118) ~[spark-core_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:457) ~[spark-core_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:456) ~[spark-core_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2231) ~[spark-core_2.11-2.2.1.jar:2.2.1]
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160) ~[scala-library-2.11.8.jar:na]
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2223) ~[spark-core_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461) ~[spark-core_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:56) ~[spark-core_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:246) ~[spark-core_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175) ~[spark-core_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257) ~[spark-core_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:432) ~[spark-core_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2516) [spark-core_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:918) [spark-sql_2.11-2.2.1.jar:2.2.1]
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:910) [spark-sql_2.11-2.2.1.jar:2.2.1]
    at scala.Option.getOrElse(Option.scala:121) [scala-library-2.11.8.jar:na]
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:910) [spark-sql_2.11-2.2.1.jar:2.2.1]
    at com.csg.ipro.util.SparkCommonUtility.createSparkSessionWithHive(SparkCommonUtility.java:54) [classes/:na]
    at com.csg.ipro.util.SparkCommonUtility.getSparkSessionWithHive(SparkCommonUtility.java:28) [classes/:na]
    at com.csg.ipro.model.FileWatcher.createSession(FileWatcher.java:84) [classes/:na]
    at com.csg.ipro.service.WatcherService.startMonitoring(WatcherService.java:43) [classes/:na]
    at com.csg.ipro.IproStreamApplication.main(IproStreamApplication.java:21) [classes/:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_231]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_231]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_231]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_231]
    at org.springframework.boot.devtools.restart.RestartLauncher.run(RestartLauncher.java:49) [spring-boot-devtools-2.3.4.RELEASE.jar:2.3.4.RELEASE]
Caused by: java.lang.NoSuchFieldException: DEFAULT_TINY_CACHE_SIZE
    at java.lang.Class.getDeclaredField(Class.java:2070) ~[na:1.8.0_231]
    at org.apache.spark.network.util.NettyUtils.getPrivateStaticField(NettyUtils.java:127) ~[spark-network-common_2.11-2.2.1.jar:2.2.1]
    ... 31 common frames omitted
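For context, the NettyUtils.getPrivateStaticField frame in the trace is Spark 2.2.1 reading, via reflection, a static field named DEFAULT_TINY_CACHE_SIZE, apparently from Netty's PooledByteBufAllocator (judging by the createPooledByteBufAllocator frame above). A minimal standalone sketch of that lookup, written here purely for illustration (the class name NettyFieldCheck is mine, not Spark's), reproduces the same exception whenever the Netty version actually resolved on the classpath no longer declares that field:

import io.netty.buffer.PooledByteBufAllocator;
import java.lang.reflect.Field;

public class NettyFieldCheck {
    public static void main(String[] args) throws Exception {
        // Same kind of reflective lookup the Spark 2.2.1 trace shows:
        // throws NoSuchFieldException if the Netty build on the classpath
        // does not declare DEFAULT_TINY_CACHE_SIZE.
        Field f = PooledByteBufAllocator.class.getDeclaredField("DEFAULT_TINY_CACHE_SIZE");
        f.setAccessible(true);
        System.out.println("DEFAULT_TINY_CACHE_SIZE = " + f.get(null));
    }
}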
Here are the Spark and Hadoop dependencies added in the pom.xml file:

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.2.1</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.2.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.11</artifactId>
            <version>2.2.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.10.0</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-api</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs-client</artifactId>
            <version>2.10.0</version>
            <scope>provided</scope>
        </dependency>

Everything I have read while searching for a solution points to mismatched dependency versions, but I have tried many combinations and none of them worked for me. How can I fix this error?

This is caused by the wrong version of Netty being pulled in. The following fixed it for me:

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.3.0</version>
            <exclusions>
                <exclusion>
                    <groupId>io.netty</groupId>
                    <artifactId>netty-all</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
            <version>4.1.17.Final</version>
        </dependency>
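After changing the versions, it may help to confirm which Netty artifacts are actually resolved, for example with the standard Maven command mvn dependency:tree -Dincludes=io.netty; ideally only a single netty-all version compatible with the Spark release in use should remain on the classpath.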
