hadoop java.io.IOException when running namenode -format on OS X

I'm getting the following error when formatting the namenode. I've tried `sudo su` as mentioned in some other Stack Overflow answers, but I still hit this error. Please assist.

14/01/16 16:10:41 INFO util.GSet: Computing capacity for map INodeMap
14/01/16 16:10:41 INFO util.GSet: VM type       = 64-bit
14/01/16 16:10:41 INFO util.GSet: 1.0% max memory = 889 MB
14/01/16 16:10:41 INFO util.GSet: capacity      = 2^20 = 1048576 entries
14/01/16 16:10:41 INFO namenode.NameNode: Caching file names occuring more than 10 times
14/01/16 16:10:41 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
14/01/16 16:10:41 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
14/01/16 16:10:41 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
14/01/16 16:10:41 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
14/01/16 16:10:41 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
14/01/16 16:10:41 INFO util.GSet: Computing capacity for map Namenode Retry Cache
14/01/16 16:10:41 INFO util.GSet: VM type       = 64-bit
14/01/16 16:10:41 INFO util.GSet: 0.029999999329447746% max memory = 889 MB
14/01/16 16:10:41 INFO util.GSet: capacity      = 2^15 = 32768 entries
14/01/16 16:10:41 FATAL namenode.NameNode: Exception in namenode join
java.io.IOException: Cannot create directory /Users/hadoop/hadoop/bin/hdfs/namenode/current
    at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:301)
    at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:523)
    at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:544)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:147)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:837)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1213)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1320)
14/01/16 16:10:41 INFO util.ExitUtil: Exiting with status 1
14/01/16 16:10:41 INFO namenode.NameNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode 

I just tried to create the directory /Users/hadoop/hadoop/bin/hdfs/namenode/current, but there is already a file named hdfs in the bin directory, so it won't let me create a directory named hdfs there. I'm not sure whether I should overwrite this hdfs file or copy it somewhere else.

/Users/hadoop/hadoop/bin/hdfs/namenode/current does not exist, so nothing can be created there; make sure the directory exists first.

Make sure you have read and execute access on all of the subdirectories:
chmod o+x /Users/hadoop/hadoop/bin/hdfs/namenode/current
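To see which component along the path lacks permissions (or is missing entirely), a short loop that runs ls -ld on each ancestor helps. This is a generic sketch, demonstrated on a throwaway directory rather than the real path:

```shell
# Print mode, owner, and existence for every component of a path.
check_path() {
  p=$1
  while [ "$p" != "/" ] && [ -n "$p" ]; do
    ls -ld "$p" 2>/dev/null || echo "missing: $p"
    p=$(dirname "$p")
  done
}

# Demo on a sandbox; on the real machine run:
#   check_path /Users/hadoop/hadoop/bin/hdfs/namenode/current
tmp=$(mktemp -d)
mkdir -p "$tmp/a"
check_path "$tmp/a/b"   # reports b as missing, then lists each ancestor
rm -rf "$tmp"
```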


Check whether hadoop is running under a different username. To find out, you can use the following from the command line:
ps aux | grep hadoop
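If ps shows Hadoop running as a different user than the one that owns the storage directory, the mkdir will fail with the same IOException. A minimal ownership check, shown here on a sandbox directory (substitute the real Hadoop paths on your machine):

```shell
# Compare the current user with the owner of a directory.
me=$(id -un)
tmp=$(mktemp -d)                         # stand-in for the namenode dir
owner=$(ls -ld "$tmp" | awk '{print $3}')
if [ "$owner" = "$me" ]; then
  echo "same user"
else
  echo "owner mismatch: $owner vs $me"   # then: sudo chown -R <user> <dir>
fi
rm -rf "$tmp"
```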

I think I just found the problem, but I still don't have a solution. Let me edit the post and point out where the issue is.