Apache Spark custom log4j class not working on Spark 2.0 EMR

I am trying to write logs as JSON on Spark 2.0 via EMR. I was able to use a custom log4j.properties file.

However, when I try to switch the output to JSON using a custom layout class (net.logstash.log4j.JSONEventLayoutV1), I get the following exception:

log4j:ERROR Could not instantiate class [net.logstash.log4j.JSONEventLayoutV1].
java.lang.ClassNotFoundException: net.logstash.log4j.JSONEventLayoutV1
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.log4j.helpers.Loader.loadClass(Loader.java:198)
    at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:327)
    at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:124)
    at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:797)
    at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
    at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
    at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
    at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
    at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:117)
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:102)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$.initializeLogIfNecessary(CoarseGrainedExecutorBackend.scala:161)
    at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$.log(CoarseGrainedExecutorBackend.scala:161)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:172)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:270)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
The jsonevent-layout artifact is assembled into the fat jar.
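
For reference, a layout configuration that triggers this error would look something like the sketch below (the appender name json is an assumption; only the layout line matters). As the stack trace shows, log4j is initialized during CoarseGrainedExecutorBackend startup, before the executor adds the application jar to its classpath, which is presumably why the class cannot be found even though it is bundled in the fat jar.

log4j.rootCategory=INFO, json

log4j.appender.json=org.apache.log4j.ConsoleAppender
log4j.appender.json.target=System.err
log4j.appender.json.layout=net.logstash.log4j.JSONEventLayoutV1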

Does anyone know how to resolve this?

Thanks,
Eran

Ultimately, this is what worked for me: switching the layout to org.apache.hadoop.log.Log4Json, which ships with hadoop-common and is therefore already on the executor classpath when log4j initializes:

log4j.rootCategory=INFO, json

log4j.appender.json=org.apache.log4j.ConsoleAppender
log4j.appender.json.target=System.err
log4j.appender.json.layout=org.apache.hadoop.log.Log4Json
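
Note that the executors only see a custom log4j.properties if the file is shipped to them and the JVM is pointed at it. A minimal sketch of the spark-submit flags that do this (the properties file name, class name, and jar name are placeholders):

spark-submit \
  --files log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --class com.example.MyApp \
  my-app-assembly.jar

Alternatively, on EMR the spark-log4j configuration classification can be used to override these settings cluster-wide.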