
Apache Spark: Cannot start Spark's "start-all.sh" on EC2 (RHEL 7)


I'm trying to run standalone Spark 2.1.1 by invoking /sbin/start-all.sh on an EC2 instance (RHEL 7). Every time it runs, it prompts for root@localhost's password, and even though I enter the correct password it rejects me with the error:

root@localhost's password: localhost: Permission denied, please try again.

When I run jps in the console, I can see that the Master is running:

root@localhost# jps 
27863 Master
28093 Jps
Digging further into the logs, I found this:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/06/12 15:36:15 INFO Master: Started daemon with process name: 27863@localhost.org.xxxxxxxxx.com
17/06/12 15:36:15 INFO SignalUtils: Registered signal handler for TERM
17/06/12 15:36:15 INFO SignalUtils: Registered signal handler for HUP
17/06/12 15:36:15 INFO SignalUtils: Registered signal handler for INT
17/06/12 15:36:15 WARN Utils: Your hostname, localhost.org.xxxxxxxxx.com resolves to a loopback address: 127.0.0.1; using localhost ip instead (on interface eth0)
17/06/12 15:36:15 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/06/12 15:36:16 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/06/12 15:36:16 INFO SecurityManager: Changing view acls to: root
17/06/12 15:36:16 INFO SecurityManager: Changing modify acls to: root
17/06/12 15:36:16 INFO SecurityManager: Changing view acls groups to:
17/06/12 15:36:16 INFO SecurityManager: Changing modify acls groups to:
17/06/12 15:36:16 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
17/06/12 15:36:16 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
17/06/12 15:36:16 INFO Master: Starting Spark master at spark://localhost.org.xxxxxxxxx.com:7077
17/06/12 15:36:16 INFO Master: Running Spark version 2.1.1
17/06/12 15:36:16 INFO Utils: Successfully started service 'MasterUI' on port 8080.
17/06/12 15:36:16 INFO MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://localhost:8080
17/06/12 15:36:16 INFO Utils: Successfully started service on port 6066.
17/06/12 15:36:16 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066
17/06/12 15:36:16 INFO Master: I have been elected leader! New state: ALIVE

I'm trying to figure out why the worker nodes won't start. Can someone help me resolve this? Thanks.
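For context, start-all.sh launches workers by SSHing into each host listed in conf/slaves (including localhost), so the password prompt and "Permission denied" usually mean key-based SSH is not set up for the user running the script. A minimal sketch of enabling passwordless SSH to localhost, using standard OpenSSH commands and default key paths (run as the same user that invokes start-all.sh):

```shell
# Generate a key pair with no passphrase (skip if ~/.ssh/id_rsa already exists)
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Authorize that key for logins on this machine
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 700 ~/.ssh && chmod 600 ~/.ssh/authorized_keys

# Verify: this should log in without any password prompt
ssh -o BatchMode=yes localhost true && echo "passwordless SSH works"
```

Once the verification command succeeds without prompting, start-all.sh can spawn the worker processes unattended.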

Check whether the hostname resolves correctly. If you are using localhost, make sure it resolves in the /etc/hosts file.

Let me know if this helps. Cheers.
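A quick way to check this from the shell; on RHEL 7 both lookup commands below consult /etc/hosts (via nsswitch) before DNS:

```shell
# The fully qualified hostname this machine reports
hostname -f

# How localhost and that hostname actually resolve locally
getent hosts localhost
getent hosts "$(hostname -f)"
```

If the last command prints 127.0.0.1, Spark is binding to the loopback address, which matches the "resolves to a loopback address" warning in the log above.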

I'm actually using the system's IP address, not "localhost" (renamed here for security reasons). And yes, the hostname resolves correctly; the problem still persists.

Still, try resolving localhost in the /etc/hosts file and see if it works, like this:

192.168.y.xxx hostname
192.168.y.xxx localhost

If that works, then you have to change the SPARK_LOCAL_IP property in the Spark conf (spark-env.sh).
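A sketch of that change in conf/spark-env.sh (192.168.y.xxx is the placeholder address from the comment above; substitute your instance's actual private IP):

```shell
# conf/spark-env.sh -- sourced by Spark's startup scripts
# Bind Spark to a specific address instead of the loopback
export SPARK_LOCAL_IP=192.168.y.xxx
# Optional (Spark 2.x): the address workers use to reach the master
export SPARK_MASTER_HOST=192.168.y.xxx
```

After editing, restart the daemons (sbin/stop-all.sh then sbin/start-all.sh) so the new binding takes effect.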