Apache Spark: how can I make pyspark run smoothly without these warnings?


I installed Spark and verified that pyspark works by running "bin\pyspark" from the Anaconda prompt.

I did install winutils.exe.

However, it gives me these warnings:

(base) C:\Users\Admin\Desktop\spark\spark-2.4.6-bin-hadoop2.6>bin\pyspark
Python 3.7.6 (default, Jan  8 2020, 20:23:39) [MSC v.1916 64 bit (AMD64)] :: Anaconda, Inc. on win32
Type "help", "copyright", "credits" or "license" for more information.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/C:/Users/Admin/Desktop/spark/spark-2.4.6-bin-hadoop2.6/jars/spark-unsafe_2.11-2.4.6.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
20/06/16 03:48:04 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.4.6
      /_/

Using Python version 3.7.6 (default, Jan  8 2020 20:23:39)
SparkSession available as 'spark'.
How can I get rid of these warnings?

Go to the file "org/apache/spark/log4j-defaults.properties" and change/add the following property:

log4j.rootCategory=ERROR, console
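
As a concrete sketch of where that line would live: instead of editing the defaults bundled inside the jars, you would normally copy conf/log4j.properties.template to conf/log4j.properties inside the Spark directory and adjust the root category there. Assuming the standard template layout, the relevant part could look roughly like this:

# conf/log4j.properties (copied from conf/log4j.properties.template)
# Only ERROR-level messages from Spark will reach the console.
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n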

I feel this will only silence the warnings rather than fix their underlying cause.
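
Alternatively, as the startup banner itself suggests ("To adjust logging level use sc.setLogLevel(newLevel)"), the log level can also be lowered at runtime from PySpark. A minimal sketch, assuming you are in a script rather than the shell (where the 'spark' session already exists); note this only affects logging after the SparkContext is up, so the JVM reflective-access warnings printed at startup will still appear:

from pyspark.sql import SparkSession

# Build (or reuse) a SparkSession; in the pyspark shell this is already available as 'spark'.
spark = SparkSession.builder.appName("quiet-logs").getOrCreate()

# Raise the log4j threshold so only ERROR-level messages reach the console from here on.
spark.sparkContext.setLogLevel("ERROR")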