
Java: ConnectException when submitting a Hadoop job from Eclipse


I'm trying to submit a job (a simple word count) from Eclipse on Windows to hadoop-2.5.0, installed on an Ubuntu 14.04.1 server running in a virtual machine. In the job configuration, I set "fs.defaultFS" to "hdfs://192.168.2.216:8020" (as suggested in this post).
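Here is roughly what the driver looks like (a minimal sketch; the mapper/reducer setup and the input/output paths are placeholders, not the actual code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountMR {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the remote namenode instead of the local default FS.
        conf.set("fs.defaultFS", "hdfs://192.168.2.216:8020");

        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCountMR.class);
        // Mapper/reducer/output-type setup omitted; placeholders for the real classes.

        FileInputFormat.addInputPath(job, new Path("/input"));    // assumed path
        FileOutputFormat.setOutputPath(job, new Path("/output")); // assumed path

        // waitForCompletion() triggers checkOutputSpecs(), which is where the
        // ConnectException below is thrown while contacting the namenode.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

But when I run the main program, I get the following exception: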

WARN  - NativeCodeLoader           - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ERROR - Shell                      - Failed to locate the winutils binary in the hadoop binary path
Exception in thread "main" java.net.ConnectException: Call From EL-OUED/192.168.2.8 to 192.168.2.216:8020 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1414)
    at org.apache.hadoop.ipc.Client.call(Client.java:1363)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
    at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1398)
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
    at com.heavenize.hadoop.WordCountMR.main(WordCountMR.java:55)
Caused by: java.net.ConnectException: Connection refused: no further information
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:735)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
    at org.apache.hadoop.ipc.Client.call(Client.java:1381)
    ... 28 more
Also, when checking the connection configuration on the Hadoop machine, it seems to be listening for/accepting connections only on 127.0.0.1:8020:

$ netstat -lent | grep 8020
tcp        0      0 127.0.0.1:8020          0.0.0.0:*               LISTEN      1001       10380
Here is the content of core-site.xml; I wonder whether it is the source of the problem, and how to fix it:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost</value>
  </property>
</configuration>

Basically, your namenode is listening on the localhost interface, so it only accepts connections from 127.0.0.1. As you suspected, the error is indeed in the fs.default.name parameter, which should be changed to use the hostname instead of localhost.
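For example, a corrected core-site.xml might look like this (the FQDN below is a placeholder; use the name configured in /etc/hosts as described next, and note that 8020 is the default namenode RPC port):

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://hostname.fully.qualified.domain.com:8020</value>
  </property>
</configuration>

After editing core-site.xml, restart the namenode so it binds to the new address, and make sure the job configuration on the Eclipse side points at the same host and port.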

Note that /etc/hosts should contain a line like:

192.168.2.216 hostname.fully.qualified.domain.com hostname

You can verify that the hostname is set up correctly by running the commands "hostname" and "hostname -f": "hostname" should return the system name as returned by gethostname(), while "hostname -f" should return the system's FQDN.
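For example, with the /etc/hosts entry above, the output should look something like this (sample output; the actual values depend on your machine):

$ hostname
hostname
$ hostname -f
hostname.fully.qualified.domain.com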

thnx for the hints, they helped resolve the connection problem.