Hadoop Spark 1.2 cannot connect to HDFS on HDP 2.2


I followed this tutorial to install Spark on HDP 2.2,

but it tells me that HDFS refused my connection! My command is:

./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster --num-executors 3 --driver-memory 512m  --executor-memory 512m  --executor-cores 1  lib/spark-examples*.jar 10
Here is the log:

tput: No value for $TERM and no -T specified
Spark assembly has been built with Hive, including Datanucleus jars on classpath
15/02/04 13:52:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/02/04 13:52:52 INFO impl.TimelineClientImpl: Timeline service address: http://amb7.a.b.c:8188/ws/v1/timeline/
15/02/04 13:52:53 INFO client.RMProxy: Connecting to ResourceManager at amb7.a.b.c/172.0.22.8:8050
15/02/04 13:52:53 INFO yarn.Client: Requesting a new application from cluster with 4 NodeManagers
15/02/04 13:52:53 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (2048 MB per container)
15/02/04 13:52:53 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
15/02/04 13:52:53 INFO yarn.Client: Setting up container launch context for our AM
15/02/04 13:52:53 INFO yarn.Client: Preparing resources for our AM container
15/02/04 13:52:54 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
15/02/04 13:52:54 INFO yarn.Client: Uploading resource file:/tmp/spark-1.2.0.2.2.0.0-82-bin-2.6.0.2.2.0.0-2041/lib/spark-assembly-1.2.0.2.2.0.0-82-hadoop2.6.0.2.2.0.0-2041.jar -> hdfs://amb1.a.b.c:8020/user/hdfs/.sparkStaging/application_1423073070725_0007/spark-assembly-1.2.0.2.2.0.0-82-hadoop2.6.0.2.2.0.0-2041.jar
15/02/04 13:52:54 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1611)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1409)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1362)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:589)
15/02/04 13:52:54 INFO hdfs.DFSClient: Abandoning BP-470883394-172.0.91.7-1423072968591:blk_1073741885_1061
15/02/04 13:52:54 INFO hdfs.DFSClient: Excluding datanode 172.0.11.0:50010
15/02/04 13:52:55 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1611)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1409)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1362)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:589)
15/02/04 13:52:55 INFO hdfs.DFSClient: Abandoning BP-470883394-172.0.91.7-1423072968591:blk_1073741886_1062
15/02/04 13:52:55 INFO hdfs.DFSClient: Excluding datanode 172.0.81.0:50010
15/02/04 13:52:55 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1611)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1409)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1362)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:589)
15/02/04 13:52:55 INFO hdfs.DFSClient: Abandoning BP-470883394-172.0.91.7-1423072968591:blk_1073741887_1063
15/02/04 13:52:55 INFO hdfs.DFSClient: Excluding datanode 172.0.7.0:50010
15/02/04 13:52:55 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1611)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1409)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1362)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:589)
15/02/04 13:52:55 INFO hdfs.DFSClient: Abandoning BP-470883394-172.0.91.7-1423072968591:blk_1073741888_1064
15/02/04 13:52:55 INFO hdfs.DFSClient: Excluding datanode 172.0.65.0:50010
15/02/04 13:52:55 WARN hdfs.DFSClient: DataStreamer Exception
java.io.IOException: Unable to create new block.
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1375)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:589)
15/02/04 13:52:55 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/hdfs/.sparkStaging/application_1423073070725_0007/spark-assembly-1.2.0.2.2.0.0-82-hadoop2.6.0.2.2.0.0-2041.jar" - Aborting...
Exception in thread "main" java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1611)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1409)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1362)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:589)
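The log shows every datanode refusing connections on its data-transfer port (50010), so one thing worth checking before touching any configuration is plain TCP reachability from the client machine to the datanodes. A minimal probe, sketched in Python (the host address is just one of the IPs from the log above; substitute your own datanode addresses):

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unresolvable hosts.
        return False

# Probe the datanode data-transfer port that the log shows as refused.
print("172.0.11.0:50010 reachable:", port_open("172.0.11.0", 50010))
```

If this prints `False` from the host running spark-submit, the problem is network-level (firewall, datanodes bound to the wrong interface, or datanodes not running) rather than anything in Spark itself.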
Someone said this, but I don't understand it.

My environment is:

HDP: 2.2
Ambari: 1.7
Spark 1.2

You have to modify the scheduler's maximum-allocation parameter; I would increase it according to the memory available in the cluster. In the YARN settings:

yarn.scheduler.maximum-allocation-mb=3072 (or a larger value)
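In Ambari this property lives under the YARN service's configuration; editing yarn-site.xml directly would look roughly like the following (3072 is just the suggested value, not a requirement):

```xml
<!-- yarn-site.xml: the largest container YARN will grant to a single request -->
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>3072</value>
</property>
```

After changing it, restart the ResourceManager (Ambari prompts for the restart) so the new limit takes effect.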