How to append the Spark application ID to the log4j log file name (Scala)

I am trying to append the Spark application ID to the log4j log file name. Below is my log4j.properties file:

log4j.rootLogger=INFO, stdout, file

# Redirect log messages to console
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L -%m%n

# Redirect log messages to log file, support file rolling
log4j.appender.file=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.file.rollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.file.rollingPolicy.FileNamePattern=log4j/Data_Quality.%d{yyyy-MM-dd}.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L -%m%n

# Flush each log message to the file immediately
log4j.appender.file.ImmediateFlush=true

# Only log messages at INFO level or above to the file
log4j.appender.file.Threshold=INFO

# Append to the existing log file instead of overwriting it
log4j.appender.file.Append=true
Spark submit command:

spark2-submit --conf "spark.driver.extraJavaOptions=-Dconfig.file=./input.conf -Dlog4j.configuration=log4j.properties" --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" --files "input.conf,log4j.properties" --master yarn --class "DataCheckImplementation" Data_quality.jar
The log file that gets created is Data_Quality.2020-07-21.log, and this works fine.

I want to add the Spark application ID to the file name.
Expected file name: Data_Quality_(ApplicationID).2020-07-21.log

Example: Data_Quality_(applicationId).2020-07-21.log


Is this possible? Any help would be appreciated.

I am not sure this can be done purely at the configuration level (i.e., in log4j.properties alone), but it can be achieved programmatically.
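
For context on why a pure configuration approach is awkward: log4j 1.x does substitute Java system properties into log4j.properties values, so a value that is known before the JVM starts could be injected from the submit command (the property name logfile.suffix below is hypothetical, set with -Dlogfile.suffix=... in spark.driver.extraJavaOptions):

log4j.appender.file.rollingPolicy.FileNamePattern=log4j/Data_Quality_${logfile.suffix}.%d{yyyy-MM-dd}.log

The application ID, however, is only assigned once the SparkContext starts, i.e. after log4j has already parsed its configuration, so it cannot be referenced this way.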

One approach is to have a logger trait that handles all of the logger setup, for example:

import java.text.SimpleDateFormat
import java.util.Date

import org.apache.log4j.{PatternLayout, RollingFileAppender, Logger => Log4jLogger}
import org.apache.spark.sql.SparkSession

trait SparkContextProvider {
  def spark: SparkSession
}

trait Logging extends SparkContextProvider {

  // Only available once the SparkContext has started.
  lazy val applicationId: String = spark.sparkContext.applicationId

  lazy val log: Log4jLogger = {
    val logger = Log4jLogger.getLogger(getClass)
    val dateFormat = new SimpleDateFormat("yyyy-MM-dd")

    // Build the file appender programmatically so the application ID,
    // which is only known at runtime, can be embedded in the file name.
    val appender = new RollingFileAppender()
    appender.setAppend(true)
    appender.setMaxFileSize("1MB")
    appender.setMaxBackupIndex(1)
    appender.setFile(s"Data_Quality_${applicationId}_${dateFormat.format(new Date)}.log")

    val layout = new PatternLayout()
    layout.setConversionPattern("%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n")
    appender.setLayout(layout)
    appender.activateOptions()

    logger.addAppender(appender)
    logger
  }
}
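
For completeness, a minimal sketch of how the trait could be mixed into the driver object. The name DataCheckImplementation matches the --class in the submit command above; the body is illustrative only and assumes the imports from the previous snippet:

object DataCheckImplementation extends Logging {

  // Satisfies SparkContextProvider (a val may implement the abstract def).
  lazy val spark: SparkSession = SparkSession.builder()
    .appName("Data_Quality")
    .getOrCreate()

  def main(args: Array[String]): Unit = {
    // The first use of `log` runs the lazy initialisation above, by which
    // point spark.sparkContext.applicationId has been assigned.
    log.info(s"Starting data quality checks for $applicationId")
    spark.stop()
  }
}

Note that this appender is created in the driver JVM only; executor logging is still governed by the log4j.properties shipped via --files.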