Running ./pyspark fails to find local directories

After installing Spark, I tried to run PySpark from the installation folder:

opt/spark/bin/pyspark
but I get the following error:

opt/spark/bin/pyspark: line 24: /opt/spark/bin/load-spark-env.sh: No such file or directory
opt/spark/bin/pyspark: line 68: /opt/spark/bin/spark-submit: No such file or directory
opt/spark/bin/pyspark: line 68: exec: /opt/spark/bin/spark-submit: cannot execute: No such file or directory
Why is this happening when I can see these items in their respective directories? I am also trying to get pyspark to run as a standalone command, but I suppose I have to solve this problem first.


I am running this on macOS.
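
One quick check of whether those files are actually missing (a diagnostic sketch, assuming the install really lives under /opt/spark as the error messages say):

ls -l /opt/spark/bin/load-spark-env.sh /opt/spark/bin/spark-submit   # do the files exist?
echo "$SPARK_HOME"                                                   # is SPARK_HOME set at all?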

This error indicates that SPARK_HOME is not set. Try this:

export SPARK_HOME=/opt/spark
pyspark
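
To keep the variable across new terminal sessions, a minimal sketch (assuming the default zsh shell on recent macOS; use ~/.bash_profile for bash, and the /opt/spark path from the question):

echo 'export SPARK_HOME=/opt/spark' >> ~/.zshrc
echo 'export PATH="$SPARK_HOME/bin:$PATH"' >> ~/.zshrc   # lets you run pyspark without the full path
source ~/.zshrc                                          # reload the current session
pyspark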

FYI, it is strongly recommended to install software on macOS with a package manager such as Homebrew.

Here is the configuration:

export SPARK_HOME=<YOUR-PATH>/spark-2.4.4-bin-hadoop2.7
export PYTHONPATH=$SPARK_HOME/python:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.10.7-src.zip:$PYTHONPATH
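
With those variables exported, one way to sanity-check the setup (a sketch assuming the Spark 2.4.4 / py4j 0.10.7 paths above are correct) is to import PySpark from a plain python shell:

# verifies that PYTHONPATH points at a working PySpark install
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").appName("smoke-test").getOrCreate()
print(spark.range(5).count())   # should print 5
spark.stop()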

How did you install it? Your installation looks incorrect or incomplete. Could you show the output of something like tree -L 2 /opt/spark?