PySpark shell works, but python does not

I have a problem with PySpark. When I start the PySpark shell from the command line, I can run on the cluster:

pyspark --total-executor-cores 5 --executor-memory 3g
But when I run plain python and try to connect to the cluster with the following code:

from pyspark import SparkConf
from pyspark import SparkContext

# 'url_to_cluster' is a placeholder for the real spark:// master URL
conf = SparkConf() \
    .setAppName('PySparkShell') \
    .setMaster('url_to_cluster') \
    .set('spark.executor.memory', '2g') \
    .set('spark.cores.max', '6') \
    .set('spark.sql.catalogImplementation', 'hive') \
    .set('spark.submit.deployMode', 'client') \
    .set('spark.executor.id', 'driver') \
    .set('spark.rdd.compress', 'True') \
    .set('spark.serializer.objectStreamReset', '100') \
    .set('spark.ui.showConsoleProgress', 'true')

sc = SparkContext(conf=conf)


I get the following error:

ERROR TransportRequestHandler:193 - Error while invoking RpcHandler#receive() on RPC id 6381742667596359353
    java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId; local class incompatible: stream classdesc serialVersionUID = 6155820641931972170, local class serialVersionUID = -3720498261147521052

Does anyone have experience with this? I could not find a similar problem anywhere online.

Is the Spark version the same everywhere?
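
That check fits the error: an InvalidClassException with two different serialVersionUID values on org.apache.spark.storage.BlockManagerId means the driver is serializing classes from a different Spark build than the one running on the cluster. A minimal sketch for comparing the two, assuming only that plain python may be importing a different PySpark package than the pyspark launcher uses:

import pyspark

# Version of the PySpark package that plain `python` imports.
print(pyspark.__version__)

# Compare against the version the working shell reports: run
# `sc.version` inside the pyspark shell, or from the command line:
#   spark-submit --version

If the two differ, one possible fix is to point the script at the same installation the shell uses before importing pyspark. findspark is a third-party helper package, and the path below is an assumption for illustration, not taken from the question:

import findspark
findspark.init('/opt/spark')  # hypothetical SPARK_HOME of the cluster's Spark build

from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName('PySparkShell').setMaster('url_to_cluster')
sc = SparkContext(conf=conf)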