
Spark Streaming: Exception thrown while writing record: BatchAllocationEvent


I shut down my Spark StreamingContext with the following code.

Essentially, a thread monitors a boolean switch and then calls StreamingContext.stop(true, true).

Everything seems to be processed, and all of my data appears to have been collected. However, I get the following exception on shutdown.

Can I ignore it? It looks like there is potential for data loss.

18/03/07 11:46:40 WARN ReceivedBlockTracker: Exception thrown while writing record: BatchAllocationEvent(1520452000000 ms,AllocatedBlocks(Map(0 -> ArrayBuffer()))) to the WriteAheadLog.
java.lang.IllegalStateException: close() was called on BatchedWriteAheadLog before write request with time 1520452000001 could be fulfilled.
        at org.apache.spark.streaming.util.BatchedWriteAheadLog.write(BatchedWriteAheadLog.scala:86)
        at org.apache.spark.streaming.scheduler.ReceivedBlockTracker.writeToLog(ReceivedBlockTracker.scala:234)
        at org.apache.spark.streaming.scheduler.ReceivedBlockTracker.allocateBlocksToBatch(ReceivedBlockTracker.scala:118)
        at org.apache.spark.streaming.scheduler.ReceiverTracker.allocateBlocksToBatch(ReceiverTracker.scala:213)
        at org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$3.apply(JobGenerator.scala:248)

I had the same problem, and fixed it by calling close() instead of stop.
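For concreteness, a minimal sketch of that change dropped into the question's monitor thread. stopScc and getSparkStreamingContext(fieldVariables) are the question's own names, and whether close() is available depends on the context type in use (the Java API's JavaStreamingContext implements java.io.Closeable, for example), so treat this as the shape of the fix rather than a guaranteed drop-in:

// Sketch of the suggested fix, reusing the question's stopScc flag and
// getSparkStreamingContext(fieldVariables) helper; close() availability
// depends on the streaming-context type, as noted above.
if (stopScc) {
  // replaces getSparkStreamingContext(fieldVariables).stop(true, true)
  getSparkStreamingContext(fieldVariables).close()
  logger.info("Called close() on Streaming Context")
  continueRun = false
}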

import org.apache.spark.storage.StorageLevel

// Volatile: the flag is written by the streaming job and read by the monitor thread
@volatile var stopScc = false

private def stopSccThread(): Unit = {
  val thread = new Thread {
    override def run(): Unit = {
      var continueRun = true
      while (continueRun) {
        logger.debug("Checking status")
        if (stopScc) {
          // stop(stopSparkContext = true, stopGracefully = true):
          // wait for received data to be processed before shutting down
          getSparkStreamingContext(fieldVariables).stop(true, true)
          logger.info("Called Stop on Streaming Context")
          continueRun = false
        }
        Thread.sleep(50)
      }
    }
  }
  thread.start()
}
@throws(classOf[IKodaMLException])
def startStream(ip: String, port: Int): Unit = {
  try {
    val ssc = getSparkStreamingContext(fieldVariables)
    ssc.checkpoint("./ikoda/cp")

    // Read text lines from a socket; serialized storage that spills to disk
    val lines = ssc.socketTextStream(ip, port, StorageLevel.MEMORY_AND_DISK_SER)
    lines.print()

    // Flip the shutdown switch when the end-of-stream marker arrives;
    // each line passes through unchanged
    val lmap = lines.map { l =>
      if (l.contains("IKODA_END_STREAM")) {
        stopScc = true
      }
      l
    }

    lmap.foreachRDD { r =>
      if (r.count() > 0) {
        logger.info(s"RECEIVED: ${r.toString()} first: ${r.first().toString}")
        r.saveAsTextFile("./ikoda/test/test")
      }
      else {
        logger.info("Empty RDD. No data received")
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
  catch {
    case e: Exception =>
      logger.error(e.getMessage, e)
      throw new IKodaMLException(e.getMessage, e)
  }
}
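As an alternative to the hand-rolled monitor thread, Spark Streaming can also stop the context gracefully from its own JVM shutdown hook when spark.streaming.stopGracefullyOnShutdown is enabled, so no user thread has to call stop() while batches are still being allocated. A minimal sketch, where the app name and batch interval are assumed placeholders:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Sketch: let Spark's shutdown hook perform the graceful stop.
// "ikoda-stream" and Seconds(1) are illustrative placeholders.
val conf = new SparkConf()
  .setAppName("ikoda-stream")
  .set("spark.streaming.stopGracefullyOnShutdown", "true")
val ssc = new StreamingContext(conf, Seconds(1))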
}