Apache Spark: Azkaban job does not finish after Spark stops


I run a Spark job in Azkaban. Since I switched to indexing into Solr with com.lucidworks.spark SparkSupport, the job in Azkaban does not finish.
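For context, the indexing step goes through the spark-solr data source. The sketch below only illustrates that write path; the input, ZooKeeper hosts and collection name are placeholders, not my actual job code:

import org.apache.spark.sql.{SaveMode, SparkSession}

object SolrIndexSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("indexer").getOrCreate()

    // Placeholder input; the real job reads its own source data.
    val df = spark.read.parquet("/path/to/input")

    // Write to Solr through the spark-solr data source ("solr" format,
    // "zkhost"/"collection" options as documented in the spark-solr README).
    df.write
      .format("solr")
      .option("zkhost", "zk1:2181,zk2:2181,zk3:2181")
      .option("collection", "my_collection")
      .mode(SaveMode.Overwrite)
      .save()

    spark.stop()
  }
}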

Last log lines before the change:

22-04-2021 15:08:57 CEST indexer INFO - zkClient has connected
22-04-2021 15:08:57 CEST indexer INFO - Updated live nodes from ZooKeeper... (0) -> (2)
22-04-2021 15:08:57 CEST indexer INFO - Cluster at x.x.x.x,x.x.x.x,x.x.x.x ready
22-04-2021 15:08:57 CEST indexer INFO - EventThread shut down
22-04-2021 15:08:57 CEST indexer INFO - Stopped Spark@5626d18c{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
22-04-2021 15:08:58 CEST indexer INFO - Process completed successfully in 443 seconds.
22-04-2021 15:08:58 CEST indexer INFO - output properties file=/home/spark/azkaban-solo-server-3.48.0-4-g9a42cb20/executions/2324/importer2/indexer_output_1529571608051881978_tmp
22-04-2021 15:08:58 CEST indexer INFO - Finishing job indexer at 1619096938782 with status SUCCEEDED

Last log lines after the change:

26-04-2021 11:17:11 CEST indexer INFO - zkClient has connected
26-04-2021 11:17:11 CEST indexer INFO - Updated live nodes from ZooKeeper... (0) -> (2)
26-04-2021 11:17:11 CEST indexer INFO - Cluster at x.x.x.x ready
26-04-2021 11:17:14 CEST indexer INFO - Opened connection [connectionId{localValue:6, serverValue:8963484}] to 195.201.220.241:27017
26-04-2021 11:17:22 CEST indexer INFO - Stopped Spark@45e9b12d{HTTP/1.1,[http/1.1]}{0.0.0.0:4041}

The Spark job itself completes fully; I can see that in the Spark master UI. But in Azkaban the process stays active and is only killed when the scheduled maximum runtime is reached. I don't know why; can you help me?
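In case it matters, I am also wondering whether I need to shut things down explicitly at the end of the driver. A minimal sketch of what I mean (illustrative only, not my actual code; the sys.exit is a workaround I have seen suggested when a library leaves non-daemon threads behind):

import org.apache.spark.sql.SparkSession

object IndexerShutdownSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("indexer").getOrCreate()
    try {
      // ... read the source data and index into Solr here ...
    } finally {
      // Stop the SparkContext explicitly; this is what produces the
      // "Stopped Spark@...{HTTP/1.1}" line seen in both logs.
      spark.stop()
    }
    // If a client library (for example a cached SolrJ CloudSolrClient or a
    // MongoDB client) still has non-daemon threads running, the JVM never
    // exits on its own and Azkaban keeps the job "active". Closing those
    // clients explicitly, or forcing the exit as a last resort, lets the
    // process terminate so Azkaban can mark the job finished.
    sys.exit(0)
  }
}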

Thank you very much.