Scala Apache Spark master fails to start


I am trying to run
C:\Spark\Spark-1.6.1-bin-hadoop2.6\sbin>start-master.sh
but I get the following error.

I also noticed that when running
bin/spark-shell
I get:

16/04/24 23:14:41 WARN : Your hostname, Pavilion resolves to a loopback/non-reachable address: fe80:0:0:0:0:5efe:c0a8:867%net14, but we couldn't find any external IP address!

which also looks wrong.


Can anyone tell me where the error is, or whether I am missing a setting that is required to run the master correctly?

The problem is that this script is not designed to be executed on a Windows machine. The Spark standalone documentation states:

Note: The launch scripts do not currently support Windows.

As a rule of thumb, only the scripts ending in .cmd will run on Windows, whereas scripts ending in .sh are designed for Linux and Mac OS. While it is possible to start a Spark master on Windows manually, it is usually better to just run in local[*] mode unless you are building a cluster of Windows machines. local[*] mode already makes full use of the local machine's cores.
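Concretely, the advice above can be sketched with the .cmd launchers that ship with the distribution (a hedged sketch, assuming the install path from the question; spark-class.cmd and spark-shell.cmd are the stock Windows launchers in the Spark 1.6 bin\ directory):

```shell
:: Run from the Spark installation directory mentioned in the question
cd C:\Spark\Spark-1.6.1-bin-hadoop2.6

:: Recommended: a local-mode shell, with one worker thread per
:: logical core on this machine (local[*])
bin\spark-shell.cmd --master "local[*]"

:: Alternatively, to start a standalone master manually on Windows,
:: invoke its main class directly through the generic launcher:
bin\spark-class.cmd org.apache.spark.deploy.master.Master
```

If the loopback-address warning persists, Spark also honors an explicit bind address via the SPARK_LOCAL_IP environment variable (set SPARK_LOCAL_IP=127.0.0.1 before launching), which is the remedy the warning itself alludes to.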

Have you checked
/c/Spark/Spark-1.6.1-bin-hadoop2.6/logs/spark--org.apache.spark.deploy.master.Master-1-Pavilion.out
? @toniedzwiedz, here it is:
Spark Command: c:\Program Files\Java\jdk1.8.0_65\bin\java -cp c:/Spark/Spark/conf\;C:/Spark/Spark/lib/spark-assembly-1.6.1-hadoop2.6.0.jar;C:\Spark\Spark\lib\datanucleus-api-jdo-3.2.6.jar;C:\Spark\Spark\lib\datanucleus-core-3.2.10.jar;C:\Spark\Spark\lib\datanucleus-rdbms-3.2.9.jar -Xms1g -Xmx1g org.apache.spark.deploy.master.Master --ip Pavilion --port 7077 --webui-port 8080