Scala: Spark unit tests fail due to stage failure

Tags: Scala, Apache Spark, Sbt, ScalaTest

I recently upgraded an application from Spark 1.4.1 to 1.6.0, and the unit tests in my application (written with ScalaTest 3.0) suddenly started failing. This is not due to an API or behavior change in Spark.

Strangely, every time I run sbt test a different test fails, and it always fails with the following message:

[info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 87 in stage 206.0 failed 1 times, most recent failure: Lost task 87.0 in stage 206.0 (TID 4228, localhost): ExecutorLostFailure (executor driver exited caused by one of the running tasks) Reason: Executor heartbeat timed out after 148400 ms
[info] Driver stacktrace:
[info]   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
[info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
[info]   at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
[info]   ...
I have the following set in build.sbt:

javaOptions in test += "-Xmx2G"     // heap for the forked test JVM
fork in test := true                // run tests in a separate JVM
parallelExecution in test := false  // run test suites sequentially

So the unit tests themselves are fine, but something is going on here that I can't pin down. Does anyone have any ideas?
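Since the reported failure is an executor heartbeat timeout and the tests run with a local master (driver and executor share one JVM), long pauses in the test JVM, often GC, are a plausible trigger. One hedged workaround is to relax the relevant timeouts in the test SparkConf. A minimal sketch, assuming a plain SparkContext created in the test setup; the property names are Spark 1.6's, the values are illustrative, not tuned recommendations:

import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: relax the timeouts behind "Executor heartbeat timed out".
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("unit-tests")
  .set("spark.executor.heartbeatInterval", "30s") // default 10s in Spark 1.6
  .set("spark.network.timeout", "600s")           // default 120s in Spark 1.6
val sc = new SparkContext(conf)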

Since this code used to work, I suspect the default memory settings (executor, driver, or overhead) may have changed with the upgrade.
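One concrete candidate for such a change: Spark 1.6 replaced the legacy memory manager with the unified one. A minimal sketch, assuming a test-local SparkConf, that either reverts to the 1.4-style behaviour or pins the new knobs explicitly so a change in defaults can be ruled in or out (property names are Spark 1.6's; the commented values are the 1.6 defaults):

// Sketch: test whether the 1.6 unified memory manager is the cause.
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("memory-check")
  .set("spark.memory.useLegacyMode", "true") // restore 1.4.x-style memory management
// or keep the unified manager but pin its settings (1.6 defaults shown):
//  .set("spark.memory.fraction", "0.75")
//  .set("spark.memory.storageFraction", "0.5")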

Please post the YARN logs for your application ID; they will contain more detail about the error.


Also, take a look at this link for a similar error.

Since the tests run locally on my machine, there are no YARN logs.
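Without YARN, comparable detail can still be captured locally by enabling Spark's event log in the test configuration. A sketch under that assumption; the log directory is a hypothetical path for illustration and must exist before the test run:

// Sketch: enable Spark's event log for local-mode test runs.
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("local-diagnostics")
  .set("spark.eventLog.enabled", "true")
  .set("spark.eventLog.dir", "/tmp/spark-events") // hypothetical path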