Spark cluster Python version


I am trying to run a notebook on a Spark cluster made up of one master node and two slave nodes.

On the master I installed Python 3.6.2 with Anaconda; on the slaves I installed Python 3.6.2 manually.

On each slave, this is the /etc/environment file:
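The actual contents of the file were not included in the post; judging from the variables echoed further down, it presumably contained lines such as the following (a sketch, not the original file):

SPARK_HOME=/opt/spark-2.2.0
PYSPARK_PYTHON=/usr/local/bin/python3.6
PYSPARK_DRIVER_PYTHON=python3.6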

However, when I run my notebook I still get this exception:

Py4JJavaError: An error occurred while calling o60.showString.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 4.0 failed 1 times, most recent failure: Lost task 0.0 in stage 4.0 (TID 6, localhost, executor driver): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/home/spark/anaconda3/lib/python3.6/site-packages/pyspark/python/lib/pyspark.zip/pyspark/worker.py", line 123, in main
    ("%d.%d" % sys.version_info[:2], version))
Exception: Python in worker has different version 2.7 than that in driver 3.6, PySpark cannot run with different minor versions.Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.
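For context, this exception is raised by the version check in pyspark/worker.py: the driver ships its own major.minor version string with each task, and the worker compares it against its interpreter, roughly like this (a paraphrased sketch, not the exact Spark source):

import sys

def check_version(driver_version):
    # driver_version is the "major.minor" string shipped from the driver, e.g. "3.6"
    worker_version = "%d.%d" % sys.version_info[:2]
    if worker_version != driver_version:
        raise Exception("Python in worker has different version %s than that in "
                        "driver %s, PySpark cannot run with different minor versions."
                        % (worker_version, driver_version))

In other words, the executors are launching a default Python 2.7 instead of the 3.6 binary, whatever /etc/environment says.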
Can you help me with this?

EDIT:

These are the env variables:

[root@spark-worker-1 ~]# echo $SPARK_HOME
/opt/spark-2.2.0
[root@spark-worker-1 ~]# echo $PYSPARK_PYTHON
/usr/local/bin/python3.6
[root@spark-worker-1 ~]# echo $PYSPARK_DRIVER_PYTHON
python3.6

[root@spark-worker-2 ~]# echo $SPARK_HOME
/opt/spark-2.2.0
[root@spark-worker-2 ~]# echo $PYSPARK_PYTHON
/usr/local/bin/python3.6
[root@spark-worker-2 ~]# echo $PYSPARK_DRIVER_PYTHON
python3.6
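One way to see which interpreter each side actually uses is to collect sys.executable from the driver and from the executors. A minimal sketch, assuming a running SparkContext named sc (if the versions really differ, the executor part will reproduce the same exception, which is itself informative):

import sys

# Driver-side interpreter
print("driver:", sys.executable, "%d.%d" % sys.version_info[:2])

# Executor-side interpreters: one lightweight task per partition
def interp(_):
    import sys
    return (sys.executable, "%d.%d" % sys.version_info[:2])

print("workers:", sc.parallelize(range(4), 4).map(interp).distinct().collect())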

Please try echoing these environment variables as the user that owns the slave processes and paste the results.

I will edit my post.

Did you set these environment variables before or after starting the slave processes? One possibility is that they were only set after the processes had started, so the processes could not inherit them. In that case, try restarting the service and check again.

I set them before starting the slaves. I also tried restarting the instances, but with no positive result. I am actually running the notebook with Jupyter. It breaks on the instruction qualify_df.show().
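Since the notebook runs under Jupyter, the Jupyter process itself may never have inherited PYSPARK_PYTHON from /etc/environment, and the executor binary is taken from the driver's environment at the moment the SparkContext is created. A common workaround is to set the variable inside the notebook before building the session; a sketch:

import os

# Must run before the SparkContext is created; the driver ships this
# path to the executors as the worker interpreter.
os.environ["PYSPARK_PYTHON"] = "/usr/local/bin/python3.6"

from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()

Alternatively, exporting PYSPARK_PYTHON in conf/spark-env.sh on every node and restarting the master and the workers achieves the same effect without touching the notebook.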