Apache Spark: "No such file or directory" error when running pyspark

I installed Spark, but when I run pyspark in the terminal, I get:

/usr/local/Cellar/apache-spark/2.4.5_1/libexec/bin/pyspark: line 24: /Users/miguel/spark-2.3.0-bin-hadoop2.7/bin/load-spark-env.sh: No such file or directory
/usr/local/Cellar/apache-spark/2.4.5_1/libexec/bin/pyspark: line 77: /Users/miguel/spark-2.3.0-bin-hadoop2.7/bin/spark-submit: No such file or directory
/usr/local/Cellar/apache-spark/2.4.5_1/libexec/bin/pyspark: line 77: exec: /Users/miguel/spark-2.3.0-bin-hadoop2.7/bin/spark-submit: cannot execute: No such file or directory
I have tried uninstalling and reinstalling (Spark, Java, Scala), but it keeps throwing this error. I have also searched here and in GitHub issues, but nothing I found worked.

Additional information:

brew doctor

(myenv) C02YH1U3FSERT:~ miguel$ brew doctor
Please note that these warnings are just used to help the Homebrew maintainers
with debugging if you file an issue. If everything you use Homebrew for is
working fine: please don't worry or file an issue; just ignore this. Thanks!

Warning: "config" scripts exist outside your system or Homebrew directories.
`./configure` scripts often look for *-config scripts to determine if
software packages are installed, and which additional flags to use when
compiling and linking.

Having additional scripts in your path can confuse software installed via
Homebrew if the config script overrides a system or Homebrew-provided
script of the same name. We found the following "config" scripts:
  /Users/miguel/.pyenv/shims/python3.7-config
  /Users/miguel/.pyenv/shims/python3.7m-config
  /Users/miguel/.pyenv/shims/python-config
  /Users/miguel/.pyenv/shims/python3-config 
brew tap

(myenv) C02YH1U3FSERT:~ miguel$ brew tap
adoptopenjdk/openjdk
homebrew/cask
homebrew/cask-versions
homebrew/core
hadoop version

Hadoop 3.2.1
Source code repository https://gitbox.apache.org/repos/asf/hadoop.git -r b3cbbb467e22ea829b3808f4b7b01d07e0bf3842
Compiled by rohithsharmaks on 2019-09-10T15:56Z
Compiled with protoc 2.5.0
From source with checksum 776eaf9eee9c0ffc370bcbc1888737
This command was run using /usr/local/Cellar/hadoop/3.2.1_1/libexec/share/hadoop/common/hadoop-common-3.2.1.jar
echo $SPARK_HOME

/Users/miguel/spark-2.3.0-bin-hadoop2.7
hdfs dfs -ls

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 6 items
...

I have spent quite a while on this; if someone could point out the solution, that would be great.

The cause was that SPARK_HOME was still set to the old path. Even after running

source ~/.bash_profile

the value was not unset until I applied:

unset SPARK_HOME

Then the error disappeared.
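
For a permanent fix, the stale export in the shell profile has to be updated as well, since unset only affects the current shell. A minimal sketch, assuming the old export lives in ~/.bash_profile and that the Homebrew install at /usr/local/Cellar/apache-spark/2.4.5_1/libexec (the path in the error messages) is the one to keep:

unset SPARK_HOME                       # clear the stale value from the current shell
grep -n "SPARK_HOME" ~/.bash_profile   # locate the old export line, then edit or delete it
# point SPARK_HOME at the Homebrew-managed install instead
echo 'export SPARK_HOME=/usr/local/Cellar/apache-spark/2.4.5_1/libexec' >> ~/.bash_profile
source ~/.bash_profile                 # echo $SPARK_HOME should now print the new path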

This looks like Hadoop is not set up correctly. Can you run anything on Hadoop? An ls would be enough.

I added the output of hadoop version above.

spark-2.3.0-bin-hadoop2.7 vs apache-spark/2.4.5_1/libexec vs Hadoop 3.2.1: I think you are facing a version incompatibility. On what basis did you choose those versions? By the way, this output still does not show whether Hadoop is working; try hdfs dfs -ls /, for example. Are you running pyspark from /usr/local/Cellar/apache-spark/2.4.5_1/libexec/bin/pyspark while /Users/miguel/spark-2.3.0-bin-hadoop2.7 is set as the SPARK_HOME env variable? Please add that to the post.

It looks like you are right about SPARK_HOME. I installed Spark following the instructions in an online tutorial.

I think your Spark home is /usr/local/Cellar/apache-spark/2.4.5_1/libexec; either set that, or run pyspark from /Users/miguel/spark-2.3.0-bin-hadoop2.7.

That worked for me, thanks!
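
A quick way to confirm the mismatch described in these comments is to compare which pyspark the shell resolves against where SPARK_HOME points; a minimal sketch using standard shell commands, with paths as reported in the question:

which pyspark      # resolves to the Homebrew copy: /usr/local/Cellar/apache-spark/2.4.5_1/libexec/bin/pyspark
echo $SPARK_HOME   # but points at the old manual install: /Users/miguel/spark-2.3.0-bin-hadoop2.7
# When these disagree, pyspark sources load-spark-env.sh and execs spark-submit
# from the SPARK_HOME tree, which no longer exists: hence the errors above.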