Error when starting Hadoop daemons (ConnectionRefused and ExitCodeException)


I am getting some errors while starting the Hadoop daemons. I have configured Hadoop 2.7.1 on a Solaris 10 server. When I run start-dfs.sh and then check with jps, only the DataNode and SecondaryNameNode show up as running processes; the NameNode does not start. It throws an ExitCodeException while starting the NameNode, and when I check the DataNode and SecondaryNameNode logs, they show the errors below.
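For context, every daemon in the logs below is trying to reach the NameNode RPC endpoint at psdrac2:9000. That endpoint would come from a core-site.xml along these lines (a sketch of the presumed configuration; my actual file may differ):

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://psdrac2:9000</value>
  </property>
</configuration>
```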

SecondaryNameNode log:

2015-12-08 10:27:51,646 ERROR org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: Exception in doCheckpoint
java.net.ConnectException: Call From psdrac2/192.168.106.109 to psdrac2:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
at org.apache.hadoop.ipc.Client.call(Client.java:1480)
at org.apache.hadoop.ipc.Client.call(Client.java:1407)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy10.getTransactionId(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.NamenodeProtocolTranslatorPB.getTransactionID(NamenodeProtocolTranslatorPB.java:128)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy11.getTransactionID(Unknown Source)
at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.countUncheckpointedTxns(SecondaryNameNode.java:641)
at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.shouldCheckpointBasedOnCount(SecondaryNameNode.java:649)
at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doWork(SecondaryNameNode.java:393)
at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$1.run(SecondaryNameNode.java:361)
at org.apache.hadoop.security.SecurityUtil.doAsLoginUserOrFatal(SecurityUtil.java:415)
at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.run(SecondaryNameNode.java:357)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.Net.connect0(Native Method)
at sun.nio.ch.Net.connect(Net.java:454)
at sun.nio.ch.Net.connect(Net.java:446)
at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:648)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:609)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:707)
at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:370)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1529)
at org.apache.hadoop.ipc.Client.call(Client.java:1446)
... 18 more
2015-12-08 10:28:52,695 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:28:53,707 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:28:54,719 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:28:55,731 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:28:56,743 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:28:57,754 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:28:58,766 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:28:59,778 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:29:00,789 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:29:01,801 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
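The repeated "Retrying connect" lines above follow the policy named in the log, RetryUpToMaximumCountWithFixedSleep: up to maxRetries further attempts, a fixed sleep between them, then the client gives up and surfaces the error. A minimal Python sketch of that behavior (my own illustration, not Hadoop's actual implementation):

```python
import time

def retry_up_to_maximum_count_with_fixed_sleep(action, max_retries=10, sleep_ms=1000):
    """Call action(); on ConnectionRefusedError, retry up to max_retries
    more times with a fixed sleep between attempts, then re-raise."""
    attempts = 0
    while True:
        try:
            return action()
        except ConnectionRefusedError:
            if attempts >= max_retries:
                raise  # give up, like the WARN after "Already tried 9 time(s)"
            attempts += 1
            time.sleep(sleep_ms / 1000.0)
```

With maxRetries=10 this makes one initial attempt plus ten retries before failing, which matches the "Already tried 0..9 time(s)" sequence in the log.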
DataNode log:

2015-12-08 10:26:13,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2015-12-08 10:26:16,710 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-12-08 10:26:18,576 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2015-12-08 10:26:19,086 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2015-12-08 10:26:19,088 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2015-12-08 10:26:19,124 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2015-12-08 10:26:19,135 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is psdrac2
2015-12-08 10:26:19,183 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0
2015-12-08 10:26:19,359 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:50010
2015-12-08 10:26:19,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwith is 1048576 bytes/s
2015-12-08 10:26:19,375 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 5
2015-12-08 10:26:20,029 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2015-12-08 10:26:20,087 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2015-12-08 10:26:20,133 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2015-12-08 10:26:20,185 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2015-12-08 10:26:20,201 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
2015-12-08 10:26:20,202 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2015-12-08 10:26:20,203 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2015-12-08 10:26:20,310 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 40263
2015-12-08 10:26:20,310 INFO org.mortbay.log: jetty-6.1.26
2015-12-08 10:26:23,001 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:40263
2015-12-08 10:26:23,969 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /0.0.0.0:50075
2015-12-08 10:26:24,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hadoop
2015-12-08 10:26:24,728 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2015-12-08 10:26:25,027 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
2015-12-08 10:26:25,147 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 50020
2015-12-08 10:26:25,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /0.0.0.0:50020
2015-12-08 10:26:25,401 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2015-12-08 10:26:25,554 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: <default>
2015-12-08 10:26:25,625 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to psdrac2/192.168.106.109:9000 starting to offer service
2015-12-08 10:26:25,663 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2015-12-08 10:26:25,665 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 50020: starting
2015-12-08 10:26:27,848 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:28,860 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:29,872 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:30,884 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:31,895 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:32,907 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:33,919 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:34,931 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:35,942 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:36,954 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:36,964 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Problem connecting to server: psdrac2/192.168.106.109:9000
2015-12-08 10:26:42,986 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:43,998 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:45,010 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:46,022 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:47,034 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:48,046 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:49,058 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:50,069 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:51,081 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2015-12-08 10:26:52,093 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: psdrac2/192.168.106.109:9000. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
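Since "Connection refused" means nothing accepted the TCP handshake on psdrac2:9000, a plain TCP probe (independent of Hadoop) confirms whether the NameNode RPC port is actually listening; port_is_listening is a hypothetical helper name:

```python
import socket

def port_is_listening(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds,
    i.e. some process is accepting connections there."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# On the cluster this would be: port_is_listening("psdrac2", 9000)
```

If this returns False on psdrac2 itself, the NameNode process never bound the port, so the NameNode's own log (and the ExitCodeException it prints at startup) is where the root cause will be.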