'JavaPackage' object is not callable (pyspark 2.3.0, Anaconda, Win10)


I am getting started with pySpark. I installed it under Anaconda on Win10. I copied an example, and when I run the code I get the following error:

Traceback (most recent call last):
File ".\testingSpark.py", line 7, in <module>
spark = SparkSession.builder.master("local").getOrCreate()
File "D:\Windows\Anaconda3\lib\site-packages\pyspark\sql\session.py", line 173, in getOrCreate
sc = SparkContext.getOrCreate(sparkConf)
File "D:\Windows\Anaconda3\lib\site-packages\pyspark\context.py", line 331, in getOrCreate
SparkContext(conf=conf or SparkConf())
File "D:\Windows\Anaconda3\lib\site-packages\pyspark\context.py", line 118, in __init__
conf, jsc, profiler_cls)
File "D:\Windows\Anaconda3\lib\site-packages\pyspark\context.py", line 188, in _do_init
self._javaAccumulator = self._jvm.PythonAccumulatorV2(host, port)
TypeError: 'JavaPackage' object is not callable

I have searched around, but I could not find anything that fixes this error. Please help me.
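In my experience, this error usually means the pyspark Python package cannot reach a matching Spark/Java installation, so the Py4J gateway starts without the Spark classes and `self._jvm.PythonAccumulatorV2` resolves to a bare `JavaPackage`. As a first step, it can help to check the environment variables a local Spark session depends on. Below is a minimal, hypothetical diagnostic sketch (standard library only; the variable list is an assumption about a typical Windows setup, not taken from the question):

```python
import os

def check_spark_env():
    """Report environment variables a local PySpark session typically needs.

    An unset SPARK_HOME / JAVA_HOME, or a pip-installed pyspark whose version
    does not match the Spark distribution in SPARK_HOME, is a common cause of
    "TypeError: 'JavaPackage' object is not callable".
    """
    names = ("SPARK_HOME", "JAVA_HOME", "HADOOP_HOME", "PYSPARK_PYTHON")
    # None in the result means the variable is not set in this shell.
    return {name: os.environ.get(name) for name in names}

if __name__ == "__main__":
    for name, value in check_spark_env().items():
        print(f"{name} = {value!r}")
```

If SPARK_HOME points at a Spark distribution whose version differs from the installed pyspark package (here 2.3.0), aligning the two versions, or unsetting SPARK_HOME so the pip package's bundled jars are used, would be the first things to try.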

Did you find a solution? I am facing the same problem.