HBase import error


I am trying to import a table that was exported from another HBase instance running 0.98.4. I exported it as follows:

 hbase org.apache.hadoop.hbase.mapreduce.Driver export 'tblname' /path/

I then put the exported files into HDFS using hadoop fs -put and am now trying to import the table. When I run the import command below, it throws an error:

 hbase org.apache.hadoop.hbase.mapreduce.Driver import 'tblname' /hdfs/path

2015-06-24 02:19:24,492 ERROR [main] security.UserGroupInformation: PriviledgedActionException as:deeshank (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://localhost:54310/home/deeshank/DB/hbase_home/lib/hadoop-mapreduce-client-core-2.2.0.jar
Exception in thread "main" java.io.FileNotFoundException: File does not exist: hdfs://localhost:54310/home/deeshank/DB/hbase_home/lib/hadoop-mapreduce-client-core-2.2.0.jar
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1110)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1102)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1102)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:264)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:300)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:387)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
at org.apache.hadoop.hbase.mapreduce.Import.main(Import.java:535)

I am not sure what is causing this issue. I am running Hadoop 2.6.0.

hdfs://localhost:54310/ is your Hadoop HDFS address. The job is looking for the jar at that HDFS path, so you can either change the filesystem property in your application's configuration or upload the jar to HDFS at the path it expects.
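As a sketch of the upload option, assuming the jar actually exists locally at the path shown in the FileNotFoundException (these paths are taken from the error message, not verified against your setup):

```shell
# Create the directory structure the job expects on HDFS
# (path copied from the FileNotFoundException above).
hdfs dfs -mkdir -p hdfs://localhost:54310/home/deeshank/DB/hbase_home/lib/

# Upload the local jar to that HDFS location so the
# distributed cache can find it where the job is looking.
hdfs dfs -put /home/deeshank/DB/hbase_home/lib/hadoop-mapreduce-client-core-2.2.0.jar \
    hdfs://localhost:54310/home/deeshank/DB/hbase_home/lib/
```

These commands require a running HDFS instance, so treat them as a template rather than something to run verbatim.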

You can list the Linux filesystem with the ls command, and for HDFS you can use the following command:

 hdfs dfs -ls hdfs://localhost:9000/

Here hdfs://localhost:9000/ is the address of the Hadoop HDFS filesystem (in your case it is hdfs://localhost:54310/, per the error message).

Why is it searching for the jar in HDFS, especially under hdfs://home/..? Where should I upload the jar? I think you need to set the classpath for the jar files. Can you explain in detail how to upload the jar? From reading this, do I need to pass the classpath of the jar files to the import command?
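One way to sketch the classpath approach mentioned above: HBase ships a `hbase mapredcp` command that prints the jars its MapReduce jobs need, and exporting that into HADOOP_CLASSPATH lets the job resolve them from the local filesystem instead of expecting them on HDFS (this assumes the `hbase` binary is on your PATH; adapt to your installation):

```shell
# Put the HBase MapReduce dependency jars on the Hadoop classpath.
# `hbase mapredcp` prints a colon-separated list of the jars
# MapReduce jobs need, taken from the local HBase installation.
export HADOOP_CLASSPATH="$(hbase mapredcp)"

# Re-run the import with the classpath set:
hbase org.apache.hadoop.hbase.mapreduce.Driver import 'tblname' /hdfs/path
```

This is a sketch, not a guaranteed fix: if the job configuration still points the distributed cache at an HDFS path, uploading the jar as shown in the answer above may be needed instead.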