Apache Spark: How do I adjust the PySpark shell log level?

I am currently getting a NullPointerException and I don't know how to fix it. I'm just wondering whether I can raise the PySpark log level to get more information out of it. Question: how do I adjust PySpark's log level?

Python 2.7.5 (default, Oct 11 2015, 17:47:16)
[GCC 4.8.3 20140911 (Red Hat 4.8.3-9)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
17/02/28 16:52:22 ERROR spark.SparkContext: Error initializing SparkContext.
17/02/28 16:52:22 ERROR util.Utils: Uncaught exception in thread Thread-2
java.lang.NullPointerException
        at org.apache.spark.network.shuffle.ExternalShuffleClient.close(ExternalShuffleClient.java:152)
        at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1231)
        at org.apache.spark.SparkEnv.stop(SparkEnv.scala:96)
        at org.apache.spark.SparkContext$$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1768)
        at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
        at org.apache.spark.SparkContext.stop(SparkContext.scala:1767)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:614)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
        at py4j.Gateway.invoke(Gateway.java:214)
        at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
        at py4j.GatewayConnection.run(GatewayConnection.java:209)
        at java.lang.Thread.run(Thread.java:745)
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/spark/python/pyspark/shell.py", line 43, in <module>
    sc = SparkContext(pyFiles=add_files)
  File "/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/spark/python/pyspark/context.py", line 115, in __init__
    conf, jsc, profiler_cls)
  File "/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/spark/python/pyspark/context.py", line 172, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/spark/python/pyspark/context.py", line 235, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 1064, in __call__
  File "/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/lib/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
:
>>>

Create a custom log4j properties file with a more verbose logging level for the gateway, for example:

log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

log4j.logger.org.apache.spark.api.python.PythonGatewayServer=DEBUG
Then point the pyspark command at it like this:

./bin/pyspark --driver-java-options '-Dlog4j.configuration=file:log4j-debug.properties'
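
If the SparkContext does eventually start, the level can also be changed at runtime, as the startup banner itself suggests ("To adjust logging level use sc.setLogLevel(newLevel)"). A minimal sketch, assuming a working local installation (the app name is made up):

from pyspark import SparkContext

sc = SparkContext(appName="log-level-demo")  # hypothetical app name
sc.setLogLevel("DEBUG")  # valid values include ALL, DEBUG, INFO, WARN, ERROR, OFF
sc.stop()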

After studying the source code, it looks like the Java gateway is launched roughly as "./bin/spark-submit --conf xxkey=xxvalue --conf xxxkey=xxxvalue pyspark-shell". Could you help clarify how the "--driver-java-options" parameter is passed down into the underlying "./bin/spark-submit"? Thanks.

It is rewritten as "--conf spark.driver.extraJavaOptions". See os.environ['PYSPARK_SUBMIT_ARGS'] in the PySpark shell.
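
For a quick sanity check, the arguments that the pyspark launcher forwards to spark-submit can be printed from inside the Python process; a minimal sketch, assuming the launcher script set the variable:

import os

# Show what bin/pyspark handed to spark-submit; the --driver-java-options
# flag should appear here if it was picked up.
print(os.environ.get('PYSPARK_SUBMIT_ARGS', '<not set>'))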