Apache Spark trigger stream not able to write to HDFS path
I am using spark-sql 2.4.1 with Kafka 0.10.x and Java 1.8. My structured streaming query reads from a Kafka topic and is supposed to write Parquet files to an HDFS path:
Dataset<Row> dataSet = sparkSession
    .readStream()
    .format("kafka")
    .option("kafka.bootstrap.servers", KAFKA_BROKERS) // required by the Kafka source; assumed defined elsewhere
    .option("subscribe", INFO_TOPIC)
    .option("startingOffsets", "latest")
    .option("enable.auto.commit", false)   // ignored: the Kafka source never commits offsets
    .option("maxOffsetsPerTrigger", 1000)
    .option("auto.offset.reset", "latest") // ignored: use "startingOffsets" instead
    .option("failOnDataLoss", false)
    .load();
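As a side note, the Kafka source exposes fixed columns (key and value as binary, plus topic, partition, offset, and timestamp), so the Parquet files will contain raw bytes unless you project first. A minimal sketch of such a projection, assuming the payload is a UTF-8 string (this step is not in the original code):

// Hypothetical projection: cast the binary Kafka columns to strings
// so the resulting Parquet files are human-readable.
Dataset<Row> parsed = dataSet.selectExpr(
        "CAST(key AS STRING) AS key",
        "CAST(value AS STRING) AS value",
        "timestamp");

You would then write parsed instead of dataSet in the sink below.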
StreamingQuery query = dataSet.writeStream()
    .format(PARQUET_FORMAT)                        // "parquet"
    .option("path", parquetFileName)               // target HDFS directory
    .option("checkpointLocation", checkPtLocation) // must be on a fault-tolerant (HDFS-compatible) filesystem
    .trigger(Trigger.ProcessingTime("15 seconds"))
    .start();
query.awaitTermination();
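If no files appear under the path, it is worth confirming that micro-batches are actually completing before the process exits. A small diagnostic sketch using the public StreamingQuery API from org.apache.spark.sql.streaming (assuming the enclosing method declares throws Exception; none of this is in the original code):

// Diagnostic loop: print progress once per trigger interval instead of
// blocking blindly, then let awaitTermination() surface any failure.
while (query.isActive()) {
    StreamingQueryProgress progress = query.lastProgress(); // null until the first batch completes
    if (progress != null) {
        System.out.println("batchId=" + progress.batchId()
                + " inputRows=" + progress.numInputRows());
    }
    Thread.sleep(15_000); // matches the 15-second trigger
}
query.awaitTermination(); // rethrows the exception that stopped the query, if any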
What is wrong here and how can I fix it? The data never shows up under the HDFS path, and the log prints this warning:

[DataStreamer for file /user/parquet/raw/part-00001-7cba7fa3-a98f-442d-9584-b71085b7cd82-c000.snappy.parquet] WARN org.apache.hadoop.hdfs.DataStreamer - Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1249)
    at java.lang.Thread.join(Thread.java:1323)
    at org.apache.hadoop.hdfs.DataStreamer.closeResponder(DataStreamer.java:980)
    at org.apache.hadoop.hdfs.DataStreamer.endBlock(DataStreamer.java:630)
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:807)

You must have streamContext.awaitTermination() (query.awaitTermination() in the Structured Streaming API shown above) in your code; otherwise the application exits immediately after it starts the stream, before any micro-batch can be written.
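The InterruptedException itself is usually shutdown noise: the JVM is tearing down the HDFS client's DataStreamer thread while it is still closing a block, which happens when the driver exits without stopping the query. One way to shut down cleanly is a JVM shutdown hook that stops the query first; a sketch (the hook is an addition for illustration, not part of the original code):

// Stop the streaming query before the JVM kills the HDFS client threads,
// so in-flight Parquet files are closed cleanly.
Runtime.getRuntime().addShutdownHook(new Thread(() -> {
    try {
        query.stop();
    } catch (Exception e) { // stop() declares TimeoutException in newer Spark versions
        e.printStackTrace();
    }
}));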