Why do IntelliJ, Maven, and Scala tests throw a ScalaTest dispatcher NPE?


When I try to run a Spark test locally, I get the following error:

Exception in thread "ScalaTest-dispatcher" java.lang.NullPointerException
    at org.apache.spark.sql.internal.SQLConf$$anonfun$14.apply(SQLConf.scala:133)
    at org.apache.spark.sql.internal.SQLConf$$anonfun$14.apply(SQLConf.scala:133)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.sql.internal.SQLConf$.get(SQLConf.scala:133)
    at org.apache.spark.sql.types.DataType.sameType(DataType.scala:88)
    at org.apache.spark.sql.catalyst.analysis.TypeCoercion$$anonfun$haveSameType$1.apply(TypeCoercion.scala:288)
    at org.apache.spark.sql.catalyst.analysis.TypeCoercion$$anonfun$haveSameType$1.apply(TypeCoercion.scala:288)
    at scala.collection.LinearSeqOptimized$class.forall(LinearSeqOptimized.scala:83)
    at scala.collection.immutable.List.forall(List.scala:84)
    at org.apache.spark.sql.catalyst.analysis.TypeCoercion$.haveSameType(TypeCoercion.scala:288)
    at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression$class.dataTypeCheck(Expression.scala:717)
    at org.apache.spark.sql.catalyst.expressions.CaseWhen.dataTypeCheck(conditionalExpressions.scala:121)
    at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression$class.dataType(Expression.scala:723)
    at org.apache.spark.sql.catalyst.expressions.CaseWhen.dataType(conditionalExpressions.scala:121)
    at org.apache.spark.sql.catalyst.expressions.Alias.toAttribute(namedExpressions.scala:176)
    at org.apache.spark.sql.catalyst.plans.logical.Project$$anonfun$output$1.apply(basicLogicalOperators.scala:52)
    at org.apache.spark.sql.catalyst.plans.logical.Project$$anonfun$output$1.apply(basicLogicalOperators.scala:52)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    at scala.collection.AbstractTraversable.map(Traversable.scala:104)
    at org.apache.spark.sql.catalyst.plans.logical.Project.output(basicLogicalOperators.scala:52)
    at org.apache.spark.sql.catalyst.plans.logical.Join.output(basicLogicalOperators.scala:311)
    at org.apache.spark.sql.catalyst.plans.logical.Join.output(basicLogicalOperators.scala:311)
    at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$inputSet$1.apply(QueryPlan.scala:51)
    at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$inputSet$1.apply(QueryPlan.scala:51)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.immutable.List.foreach(List.scala:392)
    at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
    at scala.collection.immutable.List.flatMap(List.scala:355)
    at org.apache.spark.sql.catalyst.plans.QueryPlan.inputSet(QueryPlan.scala:51)
    at org.apache.spark.sql.catalyst.plans.QueryPlan.missingInput(QueryPlan.scala:63)
    at org.apache.spark.sql.catalyst.plans.QueryPlan.statePrefix(QueryPlan.scala:173)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.statePrefix(LogicalPlan.scala:65)
    at org.apache.spark.sql.catalyst.plans.QueryPlan.simpleString(QueryPlan.scala:175)
    at org.apache.spark.sql.catalyst.plans.QueryPlan.verboseString(QueryPlan.scala:177)
    at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:551)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$generateTreeString$3.apply(TreeNode.scala:569)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$generateTreeString$3.apply(TreeNode.scala:569)
    at scala.collection.immutable.List.foreach(List.scala:392)
    at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:569)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$generateTreeString$3.apply(TreeNode.scala:569)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$generateTreeString$3.apply(TreeNode.scala:569)
    at scala.collection.immutable.List.foreach(List.scala:392)
    at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:569)
    at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:571)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$generateTreeString$3.apply(TreeNode.scala:569)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$generateTreeString$3.apply(TreeNode.scala:569)
    at scala.collection.immutable.List.foreach(List.scala:392)
    at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:569)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$generateTreeString$3.apply(TreeNode.scala:569)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$generateTreeString$3.apply(TreeNode.scala:569)
    at scala.collection.immutable.List.foreach(List.scala:392)
    at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:569)
    at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:571)
    at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:571)
    at org.apache.spark.sql.catalyst.trees.TreeNode.treeString(TreeNode.scala:475)
    at org.apache.spark.sql.catalyst.trees.TreeNode.treeString(TreeNode.scala:472)
    at org.apache.spark.sql.catalyst.trees.TreeNode.toString(TreeNode.scala:469)
    at java.lang.String.valueOf(String.java:2994)
    at java.lang.StringBuilder.append(StringBuilder.java:131)
    at scala.StringContext.standardInterpolator(StringContext.scala:125)
    at scala.StringContext.s(StringContext.scala:95)
    at org.apache.spark.sql.AnalysisException$$anonfun$1.apply(AnalysisException.scala:46)
    at org.apache.spark.sql.AnalysisException$$anonfun$1.apply(AnalysisException.scala:46)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.sql.AnalysisException.getMessage(AnalysisException.scala:46)
    at java.lang.Throwable.getLocalizedMessage(Throwable.java:392)
    at java.lang.Throwable.toString(Throwable.java:481)
    at java.lang.String.valueOf(String.java:2994)
    at scala.collection.mutable.StringBuilder.append(StringBuilder.scala:200)
    at scala.collection.TraversableOnce$$anonfun$addString$1.apply(TraversableOnce.scala:359)
    at scala.collection.Iterator$class.foreach(Iterator.scala:891)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
    at scala.collection.TraversableOnce$class.addString(TraversableOnce.scala:357)
    at scala.collection.AbstractIterator.addString(Iterator.scala:1334)
    at scala.collection.TraversableOnce$class.mkString(TraversableOnce.scala:323)
    at scala.collection.AbstractIterator.mkString(Iterator.scala:1334)
    at scala.runtime.ScalaRunTime$._toString(ScalaRunTime.scala:166)
    at scala.Some.toString(Option.scala:333)
    at java.lang.String.valueOf(String.java:2994)
    at scala.collection.mutable.StringBuilder.append(StringBuilder.scala:200)
    at scala.collection.TraversableOnce$$anonfun$addString$1.apply(TraversableOnce.scala:364)
    at scala.collection.Iterator$class.foreach(Iterator.scala:891)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
    at scala.collection.TraversableOnce$class.addString(TraversableOnce.scala:357)
    at scala.collection.AbstractIterator.addString(Iterator.scala:1334)
    at scala.collection.TraversableOnce$class.mkString(TraversableOnce.scala:323)
    at scala.collection.AbstractIterator.mkString(Iterator.scala:1334)
    at scala.runtime.ScalaRunTime$._toString(ScalaRunTime.scala:166)
    at org.scalatest.events.TestFailed.toString(Event.scala:472)
    at java.text.MessageFormat.subformat(MessageFormat.java:1280)
    at java.text.MessageFormat.format(MessageFormat.java:865)
    at java.text.Format.format(Format.java:157)
    at org.scalatest.Resources$.makeString(Resources.scala:35)
    at org.scalatest.Resources$.apply(Resources.scala:38)
    at org.scalatest.DispatchReporter$Propagator.run(DispatchReporter.scala:244)
    at java.lang.Thread.run(Thread.java:748)
My Maven dependencies:

    <dependency>
        <groupId>org.scalatest</groupId>
        <artifactId>scalatest_2.11</artifactId>
        <version>2.1.3</version>
        <scope>test</scope>
    </dependency>

    <dependency>
        <groupId>com.holdenkarau</groupId>
        <artifactId>spark-testing-base_2.11</artifactId>
        <version>2.4.4_0.14.0</version>
        <scope>test</scope>
    </dependency>
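One thing worth double-checking with this setup: `scalatest_2.11` 2.1.3 is quite old, while `spark-testing-base` releases for Spark 2.4.x are built against a newer ScalaTest major version. A mismatched ScalaTest on the test classpath can produce confusing failures inside the reporter thread. A possible alignment is sketched below; the exact version is an assumption here, so verify what your `spark-testing-base` release actually pulls in with `mvn dependency:tree`:

```xml
<!-- Hypothetical version alignment: confirm the ScalaTest version that
     your spark-testing-base release depends on before adopting this. -->
<dependency>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest_2.11</artifactId>
    <version>3.0.5</version>
    <scope>test</scope>
</dependency>
```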

I also tried checking the "Delegate IDE build/run actions to Maven" checkbox in IntelliJ.

The root cause turned out to be a Spark error: I was trying to union two DataFrames that could not be unioned, but the NPE in the ScalaTest dispatcher masked the real error message.

If you hit the same problem, try running your tests with a local Spark session instead: remove
DataFrameSuiteBase
from your test class and create a local SparkSession yourself:

Before:

class A extends FunSuite with BeforeAndAfterAll with DataFrameSuiteBase {

  test("MethodB") {
    val df = spark.read...
  }
}
After:

class A extends FunSuite with BeforeAndAfterAll {

  val spark = SparkSession.builder().master("local[*]").appName("test").getOrCreate()

  test("MethodB") {
    val df = spark.read...
  }
}
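Since the underlying failure was an invalid union of two DataFrames, it can also help to compare schemas explicitly before calling `union`, so the real `AnalysisException` surfaces in your own test code instead of an NPE raised while ScalaTest formats the failure report. Below is a minimal sketch (the class name, column names, and data are made up for illustration; it assumes the Spark and ScalaTest dependencies above are on the test classpath):

```scala
import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FunSuite}

class UnionSchemaSuite extends FunSuite with BeforeAndAfterAll {

  // A plain local session instead of DataFrameSuiteBase.
  lazy val spark: SparkSession = SparkSession.builder()
    .master("local[*]")
    .appName("test")
    .getOrCreate()

  override def afterAll(): Unit = {
    spark.stop() // release the local session when the suite finishes
    super.afterAll()
  }

  test("schemas must match before union") {
    import spark.implicits._
    val df1 = Seq((1, "a")).toDF("id", "name")
    val df2 = Seq((2, "b")).toDF("id", "name")

    // Failing fast on a schema mismatch gives a readable message
    // instead of an opaque error from the reporter thread.
    assert(df1.schema == df2.schema, s"${df1.schema} vs ${df2.schema}")
    assert(df1.union(df2).count() == 2)
  }
}
```

Stopping the session in `afterAll` keeps suites from leaking a half-torn-down session into later tests, which is one way plan stringification can blow up on the dispatcher thread.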