Java: error when trying to run an HBase MapReduce job
I'm really struggling to run an HBase MapReduce job on Hadoop. I'm using the Hortonworks Hadoop 2 distribution, and my HBase version is 0.96.1-hadoop2. The problem occurs when I try to run the MapReduce job like this:
hadoop jar target/invoice-aggregation-0.1.jar start="2014-02-01 01:00:00" end="2014-02-19 01:00:00" firstAccountId=0 lastAccountId=10
Hadoop tells me it cannot find invoice-aggregation-0.1.jar in its file system?! I'd like to know why it needs to be there.
This is the error I get:
14/02/05 10:31:48 ERROR security.UserGroupInformation: PriviledgedActionException as:adio (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://localhost:8020/home/adio/workspace/projects/invoice-aggregation/target/invoice-aggregation-0.1.jar
java.io.FileNotFoundException: File does not exist: hdfs://localhost:8020/home/adio/workspace/projects/invoice-aggregation/target/invoice-aggregation-0.1.jar
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1110)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1102)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1102)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:264)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:300)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:387)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
at com.company.invoice.MapReduceStarter.main(MapReduceStarter.java:244)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
I'd appreciate any advice, help, or even a guess as to why I'm getting this error.

Answer: The error is caused by Hadoop not finding the jar at the expected location. Put the jar there and rerun the job.
That will fix the problem. OK — even though I'm not sure it's the best solution, I fixed my problem by adding my application jar and all the missing jars to HDFS, using hadoop fs -copyFromLocal "myjarslocation" "where the jars need to be". So whenever MapReduce throws an exception telling you some jar is missing at a location on HDFS, add the jar to that location. That's what I did to solve my problem. If anyone has a better approach, I'd be happy to hear it.

Alternatively, include the JAR via the "-libjars" command-line option of the hadoop jar ... command.
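The two workarounds above can be sketched as shell commands. This is a hedged sketch, not a verified fix: the HDFS path is taken from the error message in the question, and the main class name (com.company.invoice.MapReduceStarter) is taken from the stack trace; your paths and class may differ, and -libjars only works if the driver uses ToolRunner/GenericOptionsParser.

```shell
# Workaround 1: copy the jar to the exact HDFS path the job client is looking for
# (path taken from the FileNotFoundException in the question).
hadoop fs -mkdir -p /home/adio/workspace/projects/invoice-aggregation/target
hadoop fs -copyFromLocal target/invoice-aggregation-0.1.jar \
    hdfs://localhost:8020/home/adio/workspace/projects/invoice-aggregation/target/

# Workaround 2: ship dependency jars via the distributed cache with -libjars.
# Note: -libjars must come after the main class, and the driver must parse
# generic options (e.g. via ToolRunner) for it to take effect.
hadoop jar target/invoice-aggregation-0.1.jar com.company.invoice.MapReduceStarter \
    -libjars /path/to/dependency1.jar,/path/to/dependency2.jar \
    start="2014-02-01 01:00:00" end="2014-02-19 01:00:00" \
    firstAccountId=0 lastAccountId=10
```

Workaround 1 only papers over the symptom; the underlying issue is usually a misconfigured job client (see the mapred-site.xml comment below in the thread).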
Or check other alternatives. In my case, the error was fixed by copying mapred-site.xml into the HADOOP_CONF_DIR directory.

Comments:
- Shouldn't you specify the class to use from the jar? That's not the error you're getting, but I noticed you don't.
- Yes, I know, but the last time I ran the jar it treated the class as an argument rather than as a class, and I really don't know why!
- Does this path exist in HDFS? hdfs://localhost:8020/home/adio/workspace/projects/invoice-aggregation/target/invoice-aggregation-0.1.jar
- No, it doesn't, but I don't understand why I need it!
- Are your Hadoop and HBase on the same machine?
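On the comment about specifying the class: when the jar's manifest does not declare a Main-Class, hadoop jar expects the fully qualified main class as the first argument after the jar. A sketch, assuming the main class is com.company.invoice.MapReduceStarter (taken from the stack trace) and that the jar's manifest lacks a Main-Class entry:

```shell
# If the manifest has no Main-Class, the class name goes right after the jar;
# everything after it is passed to that class as arguments.
hadoop jar target/invoice-aggregation-0.1.jar com.company.invoice.MapReduceStarter \
    start="2014-02-01 01:00:00" end="2014-02-19 01:00:00" \
    firstAccountId=0 lastAccountId=10
```

If the jar's manifest already declares Main-Class, adding the class name on the command line would indeed be passed through as a plain argument, which matches the behavior described in the comment above.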