Java SparkSession initialization throws ExceptionInInitializerError

Tags: java, maven, apache-spark, spark-streaming, spark-structured-streaming

I am trying to run a simple Spark Structured Streaming job, but I get an error when calling getOrCreate() on the SparkSession.

I create the SparkSession as follows:

    SparkSession spark = SparkSession
            .builder()
            .appName("CountryCount")
            .master("local[*]")
            .getOrCreate();
using this pom.xml:

    <artifactId>spark-streaming</artifactId>
    <version>1</version>
    <packaging>jar</packaging>

    <properties>
        <maven.compiler.source>11</maven.compiler.source>
        <maven.compiler.target>11</maven.compiler.target>
        <spark.version>3.0.0</spark.version>
        <mvn-shade.version>3.2.4</mvn-shade.version>
        <slf4j.version>1.7.30</slf4j.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>${slf4j.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.12</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka-0-10_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>${mvn-shade.version}</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
However, I get the following exception:

Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:93)
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:370)
    at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:311)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:359)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:442)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2555)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:930)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
    at JobCountryCount.createJob(JobCountryCount.java:43)
    at JobCountryCount.<init>(JobCountryCount.java:27)
    at JobCountryCount.main(JobCountryCount.java:21)
Caused by: java.lang.NullPointerException
    at org.apache.commons.lang3.SystemUtils.isJavaVersionAtLeast(SystemUtils.java:1654)
    at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
    at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
    ... 14 more
Thanks in advance.

The version of the Apache Commons Lang library on your classpath is lower than 3.8, and versions before 3.8 do not support JDK 11.

Since Apache Spark 3.0.0 is in use, my hunch is that your environment may also carry an older Spark (or Hadoop) distribution that brings its own commons-lang3. You can print classOf[org.apache.commons.lang3.SystemUtils].getResource("SystemUtils.class") in your code; it will tell you where the class is loaded from.
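The snippet above is Scala syntax; since the question is Java, here is a sketch of the same diagnostic as a plain Java program. The class and method names (ClassLocator, locate) are mine for illustration; it is demonstrated with a JDK class because commons-lang3 may not be on the compile classpath — in the real job you would pass org.apache.commons.lang3.SystemUtils.class instead.

```java
public class ClassLocator {

    // Returns the URL (JAR or directory) a class was loaded from.
    // Class.getResource resolves the name relative to the class's own
    // package, so "SimpleName.class" finds the class file itself.
    static String locate(Class<?> cls) {
        String resource = cls.getSimpleName() + ".class";
        java.net.URL url = cls.getResource(resource);
        return url == null ? "(not found)" : url.toString();
    }

    public static void main(String[] args) {
        // For the Spark question, use:
        //   locate(org.apache.commons.lang3.SystemUtils.class)
        // Demonstrated here with a JDK class:
        System.out.println(locate(String.class));
    }
}
```

If the printed URL points at an old spark-assembly or Hadoop lib directory rather than your shaded JAR, that is the stale copy shadowing the newer commons-lang3.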

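If the stale copy turns out to be pulled in transitively through Maven rather than from the runtime environment, one possible fix is to pin commons-lang3 explicitly in the pom so Maven's nearest-wins mediation picks it. A minimal sketch, assuming Spark 3.0.0's own commons-lang3 version (3.9, which supports JDK 11) — verify the exact version against your Spark build:

    <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-lang3</artifactId>
        <version>3.9</version>
    </dependency>

Running mvn dependency:tree afterwards shows which transitive copies were displaced.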