
Apache: NullPointerException on getOrCreate when creating a SparkSession


I am trying to create a Spark session in the REPL (on Linux x86_64) and I get the error below.

Any suggestions?

>echo $PATH
/opt/installations/spark-2.2.0-bin-hadoop2.7/bin:/opt/local/installations/scala-2.11.8/bin:/opt/env/java/latest/bin:/opt/env/java/latest/jre/bin:/opt/env/oracle/latest/bin:/opt/env/oracle/latest/network/admin:/opt/jre/1.6.0_81l64/bin:/bin:/usr/bin:/sbin:/usr/sbin:/usr/local/bin:/usr/X11R6/bin:/usr/local/sbin:.
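As a side check, several Java and Scala installations appear on this $PATH, so it can be worth confirming which bin directory a lookup would hit first. The sketch below is illustrative only: it simulates the ordering with a hard-coded copy of the directories from the question rather than probing the real filesystem.

```shell
# Illustrative only: simulate the questioner's PATH ordering and report
# which directory a `spark-shell` lookup would hit first.
PATH_SIM="/opt/installations/spark-2.2.0-bin-hadoop2.7/bin:/opt/local/installations/scala-2.11.8/bin:/usr/bin:/bin"

first_hit=""
old_ifs=$IFS; IFS=:
for dir in $PATH_SIM; do
  # A real check would test for an executable file instead:
  # [ -x "$dir/spark-shell" ] && { first_hit=$dir; break; }
  case $dir in
    */spark-2.2.0-bin-hadoop2.7/bin) first_hit=$dir; break ;;
  esac
done
IFS=$old_ifs

echo "spark-shell would resolve from: $first_hit"
```

On the question's PATH, the Spark bin directory comes first, so `spark-shell` (and any other launcher it contains) shadows later entries.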
Everything works until I call the getOrCreate function:

scala> SparkSession
res0: org.apache.spark.sql.SparkSession.type = org.apache.spark.sql.SparkSession$@47af7f3d

scala> SparkSession.builder
res1: org.apache.spark.sql.SparkSession.Builder = org.apache.spark.sql.SparkSession$Builder@6aeb35e6

scala> SparkSession.builder.appName("abcd")
res2: org.apache.spark.sql.SparkSession.Builder = org.apache.spark.sql.SparkSession$Builder@40a4337a

scala> SparkSession.builder.appName("abcd").master("local")
res3: org.apache.spark.sql.SparkSession.Builder = org.apache.spark.sql.SparkSession$Builder@7d20d0b

scala> SparkSession.builder.appName("abcd").master("local").getOrCreate()
java.lang.NullPointerException
  at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:121)
  at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:102)
  at org.apache.spark.SparkContext$.initializeLogIfNecessary(SparkContext.scala:2431)
  at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
  at org.apache.spark.SparkContext$.log(SparkContext.scala:2431)
  at org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
  at org.apache.spark.SparkContext$.logWarning(SparkContext.scala:2431)
  at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$4.apply(SparkContext.scala:2489)
  at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$4.apply(SparkContext.scala:2480)
  at scala.Option.foreach(Option.scala:257)
  at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2480)
  at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2557)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:85)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
  ... 32 elided

Are you doing this from spark-shell? From the stack trace we can see that Spark tried to log a warning because multiple Spark contexts were running at the same time. The NullPointerException seems to come from a call to Log4j.

Hi @AlexandreDupriez, I enter the REPL with the scala command inside $SCALA_HOME/bin. Do you have a suggestion for the log4j issue?

Maybe you could use spark-shell instead? The spark-shell script should be in the /opt/installations/spark-2.2.0-bin-hadoop2.7/bin directory, which is on your $PATH. The shell will create a new Spark context for you when it starts.

I was able to run it from spark-shell. Any idea what was wrong with the previous approach? All the Spark jars, including log4j, are in the SCALA_HOME/lib folder, and I still get the NPE.

Unfortunately I can't tell from the stack trace what is going wrong in the logger initialization - I would need a small reproducible case to investigate.
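Following the suggestion above, here is a hedged sketch of the recommended launch command. The SPARK_HOME path is taken from the question's $PATH and `--master local` mirrors the builder call in the transcript; the script only prints the command rather than executing it, since it assumes a local Spark 2.2.0 install.

```shell
# Sketch: launch the Spark REPL through the bundled spark-shell script
# instead of a bare `scala` REPL; spark-shell sets up the classpath and
# creates the SparkContext/SparkSession (bound as `sc` and `spark`) itself.
SPARK_HOME=/opt/installations/spark-2.2.0-bin-hadoop2.7   # path from the question
CMD="$SPARK_HOME/bin/spark-shell --master local"
echo "$CMD"   # run this instead of `scala` to avoid the logger-init NPE
```

Because spark-shell constructs the context before handing you the prompt, there is no chance of a second, partially constructed SparkContext like the one the stack trace complains about.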