Cleaning up the staging area error: Hadoop

Tags: hadoop, mapreduce

I am running MapReduce code on Hadoop and get the following error:

INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/home/dataflair/hdata/mapred/staging/intern124288574/.staging/job_local124288574_0001
Exception in thread "main" ExitCodeException exitCode=1: chmod: cannot access '/home/dataflair/hdata/mapred/staging/intern124288574/.staging/job_local124288574_0001': No such file or directory
at org.apache.hadoop.util.Shell.runCommand(Shell.java:998)
at org.apache.hadoop.util.Shell.run(Shell.java:884)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1216)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:1310)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:1292)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:767)
at org.apache.hadoop.fs.ChecksumFileSystem$1.apply(ChecksumFileSystem.java:506)
at org.apache.hadoop.fs.ChecksumFileSystem$FsOperation.run(ChecksumFileSystem.java:487)
at org.apache.hadoop.fs.ChecksumFileSystem.setPermission(ChecksumFileSystem.java:503)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:720)
at org.apache.hadoop.mapreduce.JobResourceUploader.mkdirs(JobResourceUploader.java:648)
at org.apache.hadoop.mapreduce.JobResourceUploader.uploadResourcesInternal(JobResourceUploader.java:167)
at org.apache.hadoop.mapreduce.JobResourceUploader.uploadResources(JobResourceUploader.java:128)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:101)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1886)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
at wordCount.WordCount.main(WordCount.java:66)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:239)
at org.apache.hadoop.util.RunJar.main(RunJar.java:153)

Answer: The file the job is trying to access is not accessible at your permission level. Normally you would fix this by changing the file permissions explicitly yourself. But here it is trying to access a path directly under the home directory, which I would expect to be a per-user directory. Moreover, no such directory exists yet; in that case, your application is looking for a directory that does not currently exist. Try creating the directory yourself and give it the appropriate permissions.

Comment: Sometimes that is a workaround, but the job is searching for a directory directly under the home directory. Can I even create a directory there? I mean, my username should sit in between, like home/{username}/dataflair/...

Comment: For me, adding this property to yarn-site.xml worked: yarn.nodemanager.local-dirs = /home/{username}/
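The yarn-site.xml property mentioned in the last comment would be added like this. `yarn.nodemanager.local-dirs` is a real YARN property (a comma-separated list of local directories for intermediate data); the `{username}` placeholder comes from the comment itself and must be replaced with your actual user name:

```xml
<configuration>
  <!-- Point NodeManager's local storage at a directory your user owns.
       Replace {username} with your actual user name. -->
  <property>
    <name>yarn.nodemanager.local-dirs</name>
    <value>/home/{username}/</value>
  </property>
</configuration>
```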
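The answer's first suggestion, pre-creating the missing staging directory with suitable permissions, could be sketched like this. The path is copied from the error message above; mode 700 is an assumption (the staging area is normally private to the submitting user), so adjust both for your own setup:

```shell
# Staging directory taken from the error message; adjust for your setup.
STAGING_DIR="${STAGING_DIR:-/home/dataflair/hdata/mapred/staging}"

# Create the directory tree the local job runner is looking for.
mkdir -p "$STAGING_DIR"

# Keep the staging area private to the submitting user
# (mode 700 is an assumption; follow your cluster's policy).
chmod -R 700 "$STAGING_DIR"
```

Run the commands as the user that submits the job, so the directory ends up owned by that user and the later chmod inside Hadoop succeeds.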