Apache Spark Py4JError: SparkConf does not exist in the JVM


I am running pyspark, but it is sometimes unstable. A few times it has crashed on this command:

spark_conf = SparkConf()
with the following error message:

     File "/home/user1/spark/spark-1.5.1-bin-hadoop2.6/python/pyspark/conf.py", line 106, in __init__
self._jconf = _jvm.SparkConf(loadDefaults)
     File "/home/user1/spark/spark-1.5.1-bin-hadoop2.6/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 772, in __getattr__
raise Py4JError('{0} does not exist in the JVM'.format(name))
     Py4JError: SparkConf does not exist in the JVM

Any idea what the problem might be? Thanks for your help.

SparkConf is not available in the pyspark context by default; try:

from pyspark import SparkConf

in the pyspark console or in your code.
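
For example, a minimal sketch of creating a context from an explicit SparkConf (the app name and master URL below are placeholder values):

    from pyspark import SparkConf, SparkContext

    # build a configuration; "MyApp" and "local[*]" are placeholders
    spark_conf = SparkConf().setAppName("MyApp").setMaster("local[*]")
    sc = SparkContext(conf=spark_conf)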

Thanks, I found the problem. After closing the SparkContext, I get the error message above when I try to call SparkConf() and initialize a new SparkContext again.

What do you mean? I hit the same error when using from pyspark import SparkContext and then sc = SparkContext().
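
For reference, a minimal sketch of the stop-and-recreate pattern being discussed; the app names are placeholders, and whether the second context starts cleanly may depend on the Spark version:

    from pyspark import SparkConf, SparkContext

    sc = SparkContext(conf=SparkConf().setAppName("first"))
    # ... run some jobs ...
    sc.stop()  # shut down the first context

    # building a second SparkConf/SparkContext in the same process is where
    # the Py4JError above was reported on this 1.5.x setup
    sc = SparkContext(conf=SparkConf().setAppName("second"))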