Java: can't connect to HDFS from my local machine (Java / Hadoop / HDFS)

I am writing a simple program that reads/writes data on HDFS. I cannot connect from my local machine to an HDFS instance installed on a remote machine, and I get the following exception:
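The question does not include the source, but judging from the class name in the stack trace (`com.rabbit.hdfs.io.ReadFileDataToConsole`), the program is probably something like the minimal sketch below. The NameNode URI is taken from the log; the file path is a hypothetical placeholder. It needs `hadoop-client` on the classpath and a reachable NameNode to run.

```java
import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

// Minimal HDFS read: connect to the remote NameNode and copy a file to stdout.
// The URI matches the address seen in the log (hdfs://192.168.143.150:54310);
// the input path is an assumption for illustration.
public class ReadFileDataToConsole {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.143.150:54310");

        FileSystem fs = FileSystem.get(
                URI.create("hdfs://192.168.143.150:54310"), conf);
        try (InputStream in = fs.open(new Path("/user/rabbit/input.txt"))) {
            // Copy the stream to the console in 4 KB chunks.
            IOUtils.copyBytes(in, System.out, 4096, false);
        }
    }
}
```

`FileSystem.get` at line 22 and `FileSystem.open` at line 29 in the stack trace line up with the two calls above: the first produces only the `HADOOP_HOME` warning, while the second is where the RPC connection is actually attempted and refused.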

18/08/19 16:47:45 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[Rate of successful kerberos logins and latency (milliseconds)], valueName=Time)
18/08/19 16:47:45 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[Rate of failed kerberos logins and latency (milliseconds)], valueName=Time)
18/08/19 16:47:45 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[GetGroups], valueName=Time)
18/08/19 16:47:45 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
18/08/19 16:47:45 DEBUG security.Groups:  Creating new Groups object
18/08/19 16:47:45 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
18/08/19 16:47:45 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
18/08/19 16:47:45 DEBUG util.NativeCodeLoader: java.library.path=/Users/rabbit/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
18/08/19 16:47:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/08/19 16:47:45 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
18/08/19 16:47:45 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
18/08/19 16:47:45 DEBUG util.Shell: Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
    at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:302)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:327)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
    at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:77)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:257)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:234)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:749)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:734)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:607)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2748)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2740)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2606)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
    at com.rabbit.hdfs.io.ReadFileDataToConsole.main(ReadFileDataToConsole.java:22)
18/08/19 16:47:45 DEBUG util.Shell: setsid is not available on this machine. So not using it.
18/08/19 16:47:45 DEBUG util.Shell: setsid exited with exit code 0
18/08/19 16:47:45 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
18/08/19 16:47:45 DEBUG security.UserGroupInformation: hadoop login
18/08/19 16:47:45 DEBUG security.UserGroupInformation: hadoop login commit
18/08/19 16:47:45 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: rabbit
18/08/19 16:47:45 DEBUG security.UserGroupInformation: UGI loginUser:rabbit (auth:SIMPLE)
18/08/19 16:47:46 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
18/08/19 16:47:46 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
18/08/19 16:47:46 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
18/08/19 16:47:46 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path = 
18/08/19 16:47:46 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
18/08/19 16:47:46 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@12405818
18/08/19 16:47:46 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@7ff2a664
18/08/19 16:47:46 DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.
18/08/19 16:47:46 DEBUG ipc.Client: The ping interval is 60000 ms.
18/08/19 16:47:46 DEBUG ipc.Client: Connecting to /192.168.143.150:54310
18/08/19 16:47:46 DEBUG ipc.Client: closing ipc connection to 192.168.143.150/192.168.143.150:54310: Connection refused
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:606)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:700)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1463)
    at org.apache.hadoop.ipc.Client.call(Client.java:1382)
    at org.apache.hadoop.ipc.Client.call(Client.java:1364)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:225)
    at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1165)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1155)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1145)
    at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:268)
    at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:235)
    at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:228)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1318)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:293)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:289)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:289)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
    at com.rabbit.hdfs.io.ReadFileDataToConsole.main(ReadFileDataToConsole.java:29)
18/08/19 16:47:46 DEBUG ipc.Client: IPC Client (775931202) connection to /192.168.143.150:54310 from rabbit: closed
18/08/19 16:47:46 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@7ff2a664
18/08/19 16:47:46 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@7ff2a664
18/08/19 16:47:46 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@7ff2a664
18/08/19 16:47:46 DEBUG ipc.Client: Stopping client
Exception in thread "main" java.net.ConnectException: Call From rabbit/127.0.0.1 to 192.168.143.150:54310 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1415)
    at org.apache.hadoop.ipc.Client.call(Client.java:1364)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:225)
    at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1165)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1155)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1145)
    at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:268)
    at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:235)
    at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:228)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1318)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:293)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:289)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:289)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
    at com.rabbit.hdfs.io.ReadFileDataToConsole.main(ReadFileDataToConsole.java:29)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:606)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:700)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1463)
    at org.apache.hadoop.ipc.Client.call(Client.java:1382)
    ... 24 more
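Two separate issues show up in this log. The `HADOOP_HOME or hadoop.home.dir are not set` IOException is logged at DEBUG level and is harmless for a pure client. The fatal one is `Connection refused` on `192.168.143.150:54310`: per the Hadoop wiki page linked in the exception message, this means nothing is accepting connections on that host/port from the client's point of view. Common causes are the NameNode not running, a firewall blocking the port, or the NameNode being bound to a loopback address on the remote machine so that it only accepts local connections. A hedged sketch of the `core-site.xml` the remote NameNode would need (the exact value must resolve to the machine's externally reachable address, not to `127.0.0.1`):

```xml
<!-- core-site.xml on the remote (NameNode) machine.
     Binding the filesystem URI to the machine's real IP or a hostname
     that resolves to it, rather than localhost, makes the RPC port
     reachable from other hosts. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.143.150:54310</value>
  </property>
</configuration>
```

To narrow it down, you can check from the client whether the port is reachable at all (e.g. `nc -z 192.168.143.150 54310`) and, on the remote machine, which address the NameNode is actually listening on (e.g. `netstat -tlnp | grep 54310` — if it shows `127.0.0.1:54310` rather than `192.168.143.150:54310` or `0.0.0.0:54310`, remote clients will always get Connection refused).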
192.168.143.150 192.168.143.150


127.0.0.1 localhost
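The bare lines above look like `/etc/hosts` entries. Note that mapping `192.168.143.150` to itself is a no-op: hosts entries map *names* to addresses. If `fs.defaultFS` uses a hostname instead of a raw IP, both the client and the server need that hostname resolving to `192.168.143.150`, and it must not also resolve to `127.0.0.1`. A hedged example (the hostname `hadoop-master` is an assumption for illustration):

```
# /etc/hosts on both machines
192.168.143.150  hadoop-master
127.0.0.1        localhost
```

A common pitfall is a distro-generated line like `127.0.1.1 hadoop-master`, which makes the NameNode bind to loopback even though the configuration looks correct; removing or correcting such a line is a frequent fix for exactly this Connection refused symptom.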