Java: error submitting a job with Hadoop 2.6.0 on Windows
I'm working on a Java project that runs with Hadoop 0.20.1, and I'm trying to migrate it to Hadoop 2.6.0. After swapping in the corresponding Hadoop jar files in the project, I get the following error when submitting a job:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:557)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:187)
at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:174)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:108)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:285)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:344)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:115)
at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:131)
at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:163)
at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:731)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:536)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
I've read that this may be a problem related to the Hadoop native binaries, but I built them myself, placed them in "C:\Hadoop\bin", and the HADOOP_HOME environment variable has the correct value.
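This UnsatisfiedLinkError for NativeIO$Windows.access0 typically means the JVM either cannot find hadoop.dll or loads one whose bitness doesn't match the JVM (which matches the 32-bit/64-bit mismatch found below). A minimal sketch of a sanity check one might run before touching any Hadoop classes, assuming "C:\Hadoop" is the install directory (adjust the path to your setup):

```java
public class NativeLibCheck {
    public static void main(String[] args) {
        // Assumed install location: point hadoop.home.dir at the directory
        // whose bin\ subfolder holds winutils.exe and hadoop.dll.
        System.setProperty("hadoop.home.dir", "C:\\Hadoop");

        // The JVM's bitness must match the hadoop.dll you built;
        // a 32-bit JVM cannot load a 64-bit native library.
        System.out.println("JVM arch:        " + System.getProperty("os.arch"));
        System.out.println("hadoop.home.dir: " + System.getProperty("hadoop.home.dir"));
    }
}
```

Printing os.arch is a quick way to see which JVM Eclipse actually launches, independent of what is installed on the machine.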
I'm running the project from Eclipse, on a machine with Windows 7 64-bit and Java 8.
Can anyone help me?
Thanks.

I finally solved my problem: I had installed the 32-bit version of Java 8 instead of the 64-bit version. After installing the correct version, the Hadoop job was submitted perfectly.

Are you using standalone mode, pseudo-distributed mode, or connecting to a cluster? Check your installation by running the Hadoop examples. Running Hadoop on Windows in anything other than standalone mode is not a good idea.

I intend to use standalone mode; for that purpose my mapred-site.xml sets "mapreduce.framework.name" to "local". I'll download the Hadoop examples to make sure the installation works.
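For reference, the standalone/local setting mentioned above would look like this in mapred-site.xml:

```xml
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>local</value>
  </property>
</configuration>
```

With this value, jobs run in the LocalJobRunner inside the client JVM, which is exactly the code path visible in the stack trace above.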