amazon-ec2: Why can't the Spark examples be submitted to Spark on EC2 launched with the spark-ec2 script?


I downloaded spark-1.5.2 and launched a cluster using the spark-ec2 script.

After that, I went into examples/ and ran mvn package to build the examples into a jar.
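
As a sketch, the build step above (run from the spark-1.5.2 source root; Maven is assumed to be installed):

cd examples/
mvn package
# expected artifact: target/spark-examples_2.10-1.5.2.jar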

Finally, I ran the submission with the following command:

bin/spark-submit --class org.apache.spark.examples.JavaTC --master spark://url_here.eu-west-1.compute.amazonaws.com:7077 --deploy-mode cluster /home/aki/Projects/spark-1.5.2/examples/target/spark-examples_2.10-1.5.2.jar
Instead of it running, I got this error:

WARN RestSubmissionClient: Unable to connect to server spark://url_here.eu-west-1.compute.amazonaws.com:7077.
Warning: Master endpoint spark://url_here.eu-west-1.compute.amazonaws.com:7077 was not a REST server. Falling back to legacy submission gateway instead.
15/12/22 17:36:07 WARN Utils: Your hostname, aki-linux resolves to a loopback address: 127.0.1.1; using 192.168.10.63 instead (on interface wlp4s0)
15/12/22 17:36:07 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/12/22 17:36:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.lookupTimeout
    at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcEnv.scala:214)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcEnv.scala:229)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcEnv.scala:225)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcEnv.scala:242)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:98)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:116)
    at org.apache.spark.deploy.Client$$anonfun$7.apply(Client.scala:233)
    at org.apache.spark.deploy.Client$$anonfun$7.apply(Client.scala:233)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
    at org.apache.spark.deploy.Client$.main(Client.scala:233)
    at org.apache.spark.deploy.Client.main(Client.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
    at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
    at scala.concurrent.Await$.result(package.scala:107)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcEnv.scala:241)
    ... 21 more
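
One clue in the log is the very first warning: the endpoint on port 7077 "was not a REST server". In Spark standalone mode, --deploy-mode cluster submissions first go through the REST submission gateway, which by default listens on port 6066 (spark.master.rest.port), not on the 7077 RPC port. A hedged variant worth trying, assuming the EC2 security group opens 6066:

bin/spark-submit --class org.apache.spark.examples.JavaTC --master spark://url_here.eu-west-1.compute.amazonaws.com:6066 --deploy-mode cluster /home/aki/Projects/spark-1.5.2/examples/target/spark-examples_2.10-1.5.2.jar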

Are you sure the master URL really contains "url_here", as in

spark://url_here.eu-west-1.compute.amazonaws.com:7077

or did you obfuscate it for this post?

If you can, connect to the Spark UI.
Also, depending on your Spark version, make sure you are passing the spark://...:7077 URL shown on the Spark UI as the --master command-line argument.
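
A quick way to confirm the exact master URL and basic reachability, as a sketch (the hostname is the placeholder from the question; curl, grep, and nc are assumed to be available):

# the standalone master web UI normally listens on port 8080 and shows the spark:// URL to use
curl -s http://url_here.eu-west-1.compute.amazonaws.com:8080 | grep -o 'spark://[^"<]*'
# check whether the master RPC port is reachable from the submitting machine at all
nc -zv url_here.eu-west-1.compute.amazonaws.com 7077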

It's definitely the same URL, copied and pasted from the Spark UI; I obfuscated it for this post.

Can you reach the master? Can you
telnet url_here.eu-west-1.compute.amazonaws.com 7077
? Can you attach a screenshot of the standalone master's welcome page? What's in the master's logs?

@JacekLaskowski I eventually gave up, copied the jar to the master, and ran it there with the default deploy mode, client.

Can you still reproduce the issue? I'd like a solution, not a workaround.
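
For reference, the workaround described in the comments looks roughly like this as a sketch (the key file name is hypothetical; spark-ec2 clusters typically log in as root with Spark installed under /root/spark):

# copy the examples jar to the master node
scp -i my-key.pem /home/aki/Projects/spark-1.5.2/examples/target/spark-examples_2.10-1.5.2.jar root@url_here.eu-west-1.compute.amazonaws.com:~/
# then, on the master, submit with the default deploy mode (client), bypassing the cluster-mode gateway
ssh -i my-key.pem root@url_here.eu-west-1.compute.amazonaws.com
/root/spark/bin/spark-submit --class org.apache.spark.examples.JavaTC --master spark://url_here.eu-west-1.compute.amazonaws.com:7077 ~/spark-examples_2.10-1.5.2.jar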