Running Spark on YARN



I am trying to run Spark on YARN in the Cloudera quickstart VM. It comes with Spark 1.3 and Hadoop 2.6.0-cdh5.4.0 pre-installed. (I am not using spark-submit, because I want to run a different version of Spark.)

I can run Spark 1.3 on YARN, but I get the error below with Spark 1.4.

The log shows it is running Spark 1.4, yet it throws a NoSuchMethodError for a method that exists in 1.4 but not in 1.3. Even the fat jar contains the 1.4 class files.

Since the job runs on YARN, the locally installed Spark version should not matter, yet it still seems to run against the other version.

Hadoop version:

Hadoop 2.6.0-cdh5.4.0
Subversion http://github.com/cloudera/hadoop -r c788a14a5de9ecd968d1e2666e8765c5f018c271
Compiled by jenkins on 2015-04-21T19:18Z
Compiled with protoc 2.5.0
From source with checksum cd78f139c66c13ab5cee96e15a629025
This command was run using /usr/lib/hadoop/hadoop-common-2.6.0-cdh5.4.0.jar
Error:

Hadoop 2.6.0-cdh5.4.0
Subversion http://github.com/cloudera/hadoop -r c788a14a5de9ecd968d1e2666e8765c5f018c271
Compiled by jenkins on 2015-04-21T19:18Z
Compiled with protoc 2.5.0
From source with checksum cd78f139c66c13ab5cee96e15a629025
This command was run using /usr/lib/hadoop/hadoop-common-2.6.0-cdh5.4.0.jar
LogType:stderr
Log Upload Time:Tue Oct 20 21:58:56 -0700 2015
LogLength:2334
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-        1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/filecache/10/simple-yarn-app-1.1.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/10/20 21:58:50 INFO spark.SparkContext: Running Spark version 1.4.0
15/10/20 21:58:53 INFO spark.SecurityManager: Changing view acls to: yarn
15/10/20 21:58:53 INFO spark.SecurityManager: Changing modify acls to: yarn
15/10/20 21:58:53 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn); users with modify permissions: Set(yarn)
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.network.util.JavaUtils.timeStringAsSec(Ljava/lang/String;)J
at org.apache.spark.util.Utils$.timeStringAsSeconds(Utils.scala:1027)
at org.apache.spark.SparkConf.getTimeAsSeconds(SparkConf.scala:194)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:68)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
at com.hortonworks.simpleyarnapp.HelloWorld.main(HelloWorld.java:50)
15/10/20 21:58:53 INFO util.Utils: Shutdown hook called

Please help.
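A NoSuchMethodError like the one above usually means the class was loaded from a different Spark version than the one you compiled against. One way to confirm this on the cluster is to ask the JVM where the offending class actually came from. The sketch below is a hypothetical diagnostic helper (`ClasspathProbe` is not part of the question's code); run it inside the application so it uses the same classpath as the failing driver.

```java
// Hypothetical diagnostic helper: report where a class is loaded from,
// to spot the classpath conflict behind a NoSuchMethodError.
public class ClasspathProbe {

    /** Returns the jar/path a class is loaded from, or a note if it is absent. */
    public static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            // Bootstrap classes (e.g. java.lang.String) have a null CodeSource.
            return src == null
                    ? className + ": loaded from the bootstrap classpath"
                    : className + ": " + src.getLocation();
        } catch (ClassNotFoundException e) {
            return className + ": not on the classpath";
        }
    }

    public static void main(String[] args) {
        // Probe the class from the stack trace; on the cluster this prints the
        // jar it was resolved from (Spark 1.3's assembly vs. your fat jar).
        System.out.println(locate("org.apache.spark.network.util.JavaUtils"));
    }
}
```

If the printed location is the cluster's Spark 1.3 assembly rather than your fat jar, the class loader is resolving Spark classes from the cluster first, which matches the error.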

Have you deployed Spark's assembly jar for your Spark version in HDFS?

No. Basically I followed this code and changed the application master to run my Spark application class. In this approach the Spark jars are bundled into the fat jar and shipped to YARN.

Can you run Spark on YARN with a different version now? If not, I can help; I run Spark 1.5.2 on YARN in a Cloudera cluster.

Most of the time, when you hit a java.lang.NoSuchMethodError, it is a classpath conflict. The Java class loader takes the first occurrence of a class on the classpath and ignores any later ones. So even though you bundled 1.4 into your fat jar, it may appear after Spark 1.3 on the classpath and therefore be ignored. Can you try setting spark.driver.userClassPathFirst to true? I would also suggest not bundling Spark in your fat jar at all: if you are using Maven, exclude Spark from the jar by setting its scope to "provided". I have been through dependency hell before, and trust me, you want to avoid it.
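Following the last comment, keeping Spark out of the fat jar looks like this on the Maven side. This is a sketch only; the artifactId and version are assumptions inferred from the logs above (Spark 1.4.0, Scala 2.10), not taken from the asker's actual pom.

```xml
<!-- Sketch (assumed coordinates): mark Spark as "provided" so it is compiled
     against but NOT bundled into the fat jar; the cluster supplies Spark at
     runtime, leaving only one Spark version on the classpath. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.0</version>
  <scope>provided</scope>
</dependency>
```

With `provided` scope, the shade/assembly plugin leaves Spark's classes out of the fat jar, so the first-wins class loading described in the comment can no longer pick up a mismatched Spark version from the jar itself.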