Apache Spark: how to correctly set the spark.driver.log.dfsDir parameter?


How do I set spark.driver.log.dfsDir correctly?

My spark-defaults.conf:

spark.eventLog.dir                      hdfs://namenode:9000/shared/spark-logs
spark.history.fs.logDirectory           hdfs://namenode:9000/shared/spark-logs
spark.history.fs.update.interval        30s
spark.history.ui.port                   8099
spark.history.fs.cleaner.enabled        true
spark.history.fs.cleaner.maxAge         30d
spark.driver.log.persistToDfs.enabled   true
spark.driver.log.dfsDir                 hdfs://namenode:9000/shared/driver-logs
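
For reference, the same keys can also be set programmatically when building the session. This is a minimal sketch, not taken from the question (the app name is made up); the history-server keys are omitted because they configure the history server, not the application:

import org.apache.spark.sql.SparkSession

object DriverLogDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("driver-log-demo") // hypothetical app name
      .config("spark.eventLog.dir", "hdfs://namenode:9000/shared/spark-logs")
      .config("spark.driver.log.persistToDfs.enabled", "true")
      // Same value as in spark-defaults.conf above; this is the setting
      // that triggers the error shown below.
      .config("spark.driver.log.dfsDir", "hdfs://namenode:9000/shared/driver-logs")
      .getOrCreate()
    spark.stop()
  }
}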
When running the application with spark-submit, the driver fails with the following error:

21/05/19 15:05:34 ERROR DriverLogger: Could not persist driver logs to dfs
java.lang.IllegalArgumentException: Pathname /home/app/odm-spark/hdfs:/namenode:9000/shared/driver-logs from /home/app/odm-spark/hdfs:/namenode:9000/shared/driver-logs is not a valid DFS filename

Why does it prefix the URL with the application's working directory?
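
A plausible explanation, inferred from the shape of the mangled path rather than stated in the question: the driver-log writer appears to build the target file name with java.io.File-style path handling, which treats anything not starting with "/" as relative to the JVM's working directory and collapses the "//" after the scheme. A minimal sketch that reproduces the exact mangling:

import java.io.File

object MangleDemo {
  def main(args: Array[String]): Unit = {
    // "hdfs://..." does not begin with '/', so java.io.File treats it as a
    // relative path: getAbsolutePath prepends the working directory, and
    // normalization collapses the double slash after "hdfs:" to a single one.
    val mangled = new File("hdfs://namenode:9000/shared/driver-logs").getAbsolutePath
    println(mangled)
    // With /home/app/odm-spark as the working directory this prints
    // /home/app/odm-spark/hdfs:/namenode:9000/shared/driver-logs,
    // the same invalid pathname reported in the error above.
  }
}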


The correct way to set it with Spark 3.1.1 is:

spark.driver.log.dfsDir           /shared/driver-logs
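
The value is resolved as a path inside the cluster's default filesystem (fs.defaultFS), so it should carry no scheme or authority. A small sketch of how a scheme-less path qualifies against the default filesystem (the fs.defaultFS value here is taken from the config above); note that Spark expects the directory to exist before the driver starts, so create it up front:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object QualifyDemo {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    conf.set("fs.defaultFS", "hdfs://namenode:9000")
    val fs = FileSystem.get(conf)
    val dir = new Path("/shared/driver-logs")
    // A scheme-less path is qualified against fs.defaultFS:
    println(fs.makeQualified(dir)) // hdfs://namenode:9000/shared/driver-logs
    // Create the directory ahead of time if it does not exist yet:
    if (!fs.exists(dir)) fs.mkdirs(dir)
  }
}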