macOS: cannot run the pyspark command from any directory on the Mac after installing Apache Spark


I have installed Spark on my Mac by following the instructions in the book "Apache Spark in 24 Hours". When I am in the Spark directory, I can run pyspark using the following command:

./bin/pyspark
To install Spark, I created the environment variables:

export SPARK_HOME=/opt/spark
export JAVA_HOME=$(/usr/libexec/java_home)
and added it to the path:

export PATH=$SPARK_HOME/bin:$PATH
The book says I should be able to run the "pyspark" or "spark-shell" command from any directory, but this doesn't work:

pyspark: command not found
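
Before changing anything it can help to check what the shell actually resolves; a quick diagnostic sketch using standard macOS/bash tools:

# Where (if anywhere) does the shell find pyspark?
which pyspark
# Print PATH one entry per line to spot the Spark entry
echo "$PATH" | tr ':' '\n'
# Does the launcher exist where SPARK_HOME points?
ls "$SPARK_HOME/bin"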
I followed the instructions from a similar question asked here:

I set the JAVA_HOME environment variable:

export SPARK_HOME=/opt/spark
export JAVA_HOME=$(/usr/libexec/java_home)
I also ran the following commands:

export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH
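
The py4j version embedded in that zip name changes between Spark releases, so it is worth confirming the file really exists before hard-coding it; a small check, assuming SPARK_HOME is already set:

# List the py4j zip shipped with this Spark release;
# the version (0.9 above) varies between Spark versions
ls "$SPARK_HOME/python/lib/"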
When I run the env command, this is the output:

SPARK_HOME=/opt/spark
TERM_PROGRAM=Apple_Terminal
SHELL=/bin/bash
TERM=xterm-256color
TMPDIR=/var/folders/hq/z0wh5c357cbgp1dh33lfhjj40000gn/T/
Apple_PubSub_Socket_Render=/private/tmp/com.apple.launchd.fJdtLqZ7dN/Render
TERM_PROGRAM_VERSION=361.1
TERM_SESSION_ID=A8BD2144-72AD-402C-A591-5C8A43DD398B
USER=richardgray
SSH_AUTH_SOCK=/private/tmp/com.apple.launchd.cQeqaF2v1z/Listeners
__CF_USER_TEXT_ENCODING=0x1F5:0x0:0x0
PATH=/opt/spark/bin:/Library/Frameworks/Python.framework/Versions/3.5/bin:  /Library/Frameworks/Python.framework/Versions/3.5/bin:/Library/Frameworks/Python.framework/Versions/2.7/bin:/usr/local/heroku/bin:/Users/richardgray/.rbenv/shims:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/usr/X11/bin
PWD=/Users/richardgray
JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home
LANG=en_GB.UTF-8
XPC_FLAGS=0x0
XPC_SERVICE_NAME=0
SHLVL=1
HOME=/Users/richardgray
PYTHONPATH=/opt/spark/python/lib/py4j-0.9-src.zip:/opt/spark/python/:
LOGNAME=richardgray
_=/usr/bin/env
Is there something I am missing? Thanks in advance.

You wrote:

When I am in the spark directory, I can run pyspark using the command:
./bin/pyspark

You created:
export SPARK_HOME=/opt/spark

Please confirm that your Spark directory really is /opt/spark.
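
For example, a quick check along these lines (a sketch, using the paths from the question):

# Does /opt/spark exist, and does it contain the pyspark launcher?
ls -ld /opt/spark
ls /opt/spark/bin/pyspark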

If you are executing Spark from /Users/richardgray/opt/spark/bin, then set:

export SPARK_HOME=/Users/richardgray/opt/spark
followed by:

export PATH=$SPARK_HOME/bin:$PATH

Note: if this solves your problem, you will need to add both exports to your login script (e.g. .profile) so that the path is set automatically.
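
A minimal sketch of what those lines in ~/.profile could look like, assuming Spark actually lives under /Users/richardgray/opt/spark as the comments below suggest:

# ~/.profile -- adjust SPARK_HOME to wherever Spark is really unpacked
export SPARK_HOME=/Users/richardgray/opt/spark
export JAVA_HOME=$(/usr/libexec/java_home)
export PATH="$SPARK_HOME/bin:$PATH"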

Run ls /opt/spark/bin and see which binaries are available. You could use Homebrew, by the way.

Spark is indeed located at /opt/spark: LICENSE conf metastore_db NOTICE data python R derby.log sbin README.md examples scala-2.12.1.deb RELEASE jars spark-warehouse bin licenses yarn

That is a location inside my home directory, i.e. richardgray/opt/spark.

Try using: export SPARK_HOME=/Users/richardgray/opt/spark, then update the PATH env.