
Java / Hadoop: Connection Refused exception when launching a job from Spring Batch Admin


I am trying to trigger a Hadoop MapReduce job from Spring Batch Admin, but I get the error below. Spring Batch Admin is running on WAS CE.

Job configuration:

<hdp:configuration>
    fs.defaultFS=hdfs://localhost:8020
    mapred.job.tracker=localhost:8021
</hdp:configuration>

<hdp:job id="mr-my-job" 
        input-path="/data/input/"
        output-path="/data/output/"
        jar-by-class="org.test.Main" 
        mapper="org.test.Test1$Map"
        combiner="org.test.Test1$Combiner"
        reducer="org.test.Test1$ReduceFromCombiner" />

Please give me some suggestions. When I package the job into a jar and run it with the hadoop command, it works fine, but launching it through Admin fails with the error below:

sudo -u hdfs hadoop jar test.jar org.Main
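Since the failure is a plain TCP "Connection refused", one quick way to narrow it down is to check, from the machine where Spring Batch Admin runs, whether anything is actually listening on the JobTracker address. The sketch below is a standalone diagnostic (not part of the job configuration); the host, port, and class name are just illustrative:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortProbe {
    // Returns true if a TCP listener accepts a connection at host:port
    // within timeoutMs; "Connection refused" and timeouts both yield false.
    static boolean isListening(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The JobTracker address from the failing configuration.
        System.out.println("localhost:8021 listening? "
                + isListening("localhost", 8021, 2000));
    }
}
```

Run this on the Admin/WAS CE server: if it prints `false` while the job works on the Hadoop node itself, the address in the configuration resolves to the wrong machine.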

Caused by: java.net.ConnectException: Call From <server>/<server_ip> to localhost:8021 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
        at org.apache.hadoop.ipc.Client.call(Client.java:1413)
        at org.apache.hadoop.ipc.Client.call(Client.java:1362)
        at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:231)
        at org.apache.hadoop.mapred.$Proxy203.getStagingAreaDir(Unknown Source)
        at org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1340)
        at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:954)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:948)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:948)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:582)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:612)
        at org.springframework.data.hadoop.mapreduce.JobExecutor$2.run(JobExecutor.java:199)
        ... 24 more
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
        at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1461)
        at org.apache.hadoop.ipc.Client.call(Client.java:1380)

We have a dedicated Hadoop wiki page about exactly this. Please follow the link shown in your stack trace: http://wiki.apache.org/hadoop/ConnectionRefused

I will say that we work hard on connection diagnostics: reporting the target host and port, listing URLs with debugging suggestions, and writing and maintaining those pages. It always saddens me when people don't follow the links, because it feels like that effort was wasted. Please, when you see a URL like that in a stack trace, at least skim the page it references.
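In this particular trace, the telling detail is "Call From &lt;server&gt;/&lt;server_ip&gt; to localhost:8021": the Admin server resolves `localhost` to itself, where no JobTracker is listening. A plausible fix, assuming a remote Hadoop master, is to point the configuration at that master's real hostname instead of `localhost` (the hostname below is a placeholder for your environment):

```xml
<hdp:configuration>
    <!-- Replace the placeholder with the host that actually runs the
         NameNode and JobTracker; "localhost" here resolves to the
         Spring Batch Admin server, where nothing listens on 8021. -->
    fs.defaultFS=hdfs://hadoop-master.example.com:8020
    mapred.job.tracker=hadoop-master.example.com:8021
</hdp:configuration>
```

The job works under `hadoop jar` on the cluster node precisely because there `localhost` happens to be the machine running the daemons.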