
Scala: Caused by java.lang.ArrayIndexOutOfBoundsException in Spark flatMap


The relevant code is as follows:

val cateList = featureData.map{
  case (psid: String, label: String, cate_features: ParArray[String], media_features: String) =>
      val pair_feature = cate_features.zipWithIndex.map(x => (x._2, x._1))
      pair_feature
}.flatMap(_.toList)
The flatMap is failing. What is wrong, and where is the problem?

The full error message:
It mainly throws java.lang.ArrayIndexOutOfBoundsException. I suspect something is wrong somewhere, but I am not familiar with Scala. Any help is welcome. Thanks.

17/01/23 12:23:08 INFO scheduler.TaskSetManager: Lost task 29.3 in stage 0.0 (TID 53) on executor 10.39.2.232: java.lang.ArrayIndexOutOfBoundsException (3) [duplicate 9]
17/01/23 12:23:08 ERROR scheduler.TaskSetManager: Task 29 in stage 0.0 failed 4 times; aborting job 
17/01/23 12:23:08 INFO cluster.YarnScheduler: Cancelling stage 0
17/01/23 12:23:08 INFO cluster.YarnScheduler: Stage 0 was cancelled
17/01/23 12:23:08 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (flatMap at ETL.scala:132) failed in 27.635 s
17/01/23 12:23:08 INFO scheduler.DAGScheduler: Job 0 failed: reduce at ETL.scala:205, took 27.763709 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 29 in stage 0.0 failed 4 times, most recent failure: Lost task 29.3 in stage 0.0 (TID 53, 10.39.2.232): java.lang.ArrayIndexOutOfBoundsException: 3    at com.sina.adalgo.feature.ETL$$anonfun$11$$anonfun$13.apply(ETL.scala:111)
at com.sina.adalgo.feature.ETL$$anonfun$11$$anonfun$13.apply(ETL.scala:111)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:283)
at org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:171)
at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Spark UI error information (screenshot)

Here is my answer:

Without knowing more details of the code, I can see two possible culprits.

The first is the pattern matching on
featureData
. In the map you try to match the pattern
(psid: String, label: String, cate_features: ParArray[String], media_features: String)
, but you never handle records that fail to match this pattern.
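To illustrate this first point with a small, self-contained sketch (plain local collections rather than Spark RDDs, a plain Array instead of ParArray, and made-up data): a partial function passed to map throws scala.MatchError on any element that does not fit the pattern, whereas collect simply skips non-matching elements.

```scala
// Hypothetical rows: the first fits the expected shape, the second is "dirty".
val rows: List[Any] = List(
  ("p1", "1", Array("a", "b"), "m1"),
  ("p2", "0")
)

// rows.map { case (psid: String, _, _, _) => psid }
// ...would throw scala.MatchError on the dirty two-field row.

// collect takes a partial function and silently drops elements that do not match:
val clean = rows.collect {
  case (psid: String, label: String, feats: Array[String], media: String) => psid
}
// clean == List("p1")
```

Note that the exception reported in the logs is an ArrayIndexOutOfBoundsException rather than a MatchError, so an unmatched pattern is only one candidate explanation; the other is malformed data being indexed upstream.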


The second is the
toList
method. I assume it is a method from the API. Could you try this instead:
flatMap(x => List(x._1, x._2))
— I wrote this off the top of my head and have not tested it.
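As a quick local check of what the original transformation produces (plain Scala collections, no Spark; a plain Array stands in for ParArray, since parallel collections live in a separate module in newer Scala versions):

```scala
// Sample feature values (made up for illustration).
val cate_features = Array("sports", "news", "music")

// Pair each feature with its index, then swap to (index, feature):
val pair_feature = cate_features.zipWithIndex.map(x => (x._2, x._1))
// Array((0,"sports"), (1,"news"), (2,"music"))

// The original flatMap(_.toList) flattens a collection of such arrays
// into one flat list of (index, feature) pairs:
val nested = List(pair_feature, Array((0, "weather")))
val flat = nested.flatMap(_.toList)
// List((0,"sports"), (1,"news"), (2,"music"), (0,"weather"))
```

This also shows why the suggested `flatMap(x => List(x._1, x._2))` would not compile as-is in the original chain: there `x` is the whole array of pairs, not a single tuple, so it has no `_1`/`_2` members.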

Please post the complete error output. Where exactly does the error occur? Why do you think this code causes it? I don't see anything here that could cause it.

The Spark UI information has been posted; it shows some errors. Thanks.

Could you try running the test without
ParArray
and see whether it still fails?

It was the first cause: the pattern match failed because there was dirty data. Thanks.
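For context on that resolution ("dirty data"): an ArrayIndexOutOfBoundsException like the one at ETL.scala:111 most often comes from indexing the result of a split on a malformed input line somewhere upstream of featureData. The sketch below is purely a hypothetical reconstruction of that failure mode (the actual parsing code was never posted), but a length guard of this shape is the usual fix:

```scala
// Hypothetical raw input: one well-formed tab-separated line, one dirty line.
val lines = List("a\tb\tc\td", "dirty-line")

// Unsafe: f(3) throws ArrayIndexOutOfBoundsException on "dirty-line",
// whose split produces only one field.
// val parsed = lines.map(_.split("\t")).map(f => (f(0), f(1), f(2), f(3)))

// Safer: drop rows with too few fields before indexing into them.
val parsed = lines
  .map(_.split("\t"))
  .filter(_.length >= 4)
  .map(f => (f(0), f(1), f(2), f(3)))
// parsed == List(("a", "b", "c", "d"))
```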