
HBase incremental backup failing on Hadoop HDP


I created a 'test' table in HBase to test the incremental backup feature on HDP:

    hbase(main):002:0> create 'test', 'cf'
    0 row(s) in 1.4690 seconds

    hbase(main):003:0> put 'test', 'row1', 'cf:a', 'value1'
    0 row(s) in 0.1480 seconds

    hbase(main):004:0> put 'test', 'row2', 'cf:b', 'value2'
    0 row(s) in 0.0070 seconds

    hbase(main):005:0> put 'test', 'row3', 'cf:c', 'value3'
    0 row(s) in 0.0120 seconds

    hbase(main):006:0> put 'test', 'row3', 'cf:c', 'value4'
    0 row(s) in 0.0070 seconds

    hbase(main):010:0> scan 'test'       
    ROW                   COLUMN+CELL                                               
    row1                 column=cf:a, timestamp=1317945279379, value=value1        
    row2                 column=cf:b, timestamp=1317945285731, value=value2        
    row3                 column=cf:c, timestamp=1317945301466, value=value4        
    3 row(s) in 0.0250 seconds
Then I took a full backup using the command below, and it succeeded:

hbase backup create full hdfs://12.3.4.56:8020/tmp/full test -w 3
Now I want to test an incremental backup on the 'test' table above, so I inserted another row:

put 'test', 'row123', 'cf:a', 'newValue'
But when I run the following, it fails:

hbase backup create incremental hdfs://12.3.4.56:8020/tmp/full
Error:

Backup session finished. Status: FAILURE
2017-06-14 09:52:58,853 ERROR [main] util.AbstractHBaseTool: Error running command-line tool
org.apache.hadoop.ipc.RemoteException(java.lang.NullPointerException):
        at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.cleanupTargetDir(FullTableBackupProcedure.java:205)
        at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.failBackup(FullTableBackupProcedure.java:279)
        at org.apache.hadoop.hbase.backup.master.IncrementalTableBackupProcedure.executeFromState(IncrementalTableBackupProcedure.java:164)
        at org.apache.hadoop.hbase.backup.master.IncrementalTableBackupProcedure.executeFromState(IncrementalTableBackupProcedure.java:54)
        at org.apache.hadoop.hbase.procedure2.StateMachineProcedure.execute(StateMachineProcedure.java:107)
        at org.apache.hadoop.hbase.procedure2.Procedure.doExecute(Procedure.java:443)
        at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execProcedure(ProcedureExecutor.java:934)
        at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:736)
        at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:689)
        at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.access$200(ProcedureExecutor.java:73)
        at org.apache.hadoop.hbase.procedure2.ProcedureExecutor$1.run(ProcedureExecutor.java:416)
Update:

The linked documentation mentions that "Backup and restore should be run as the hbase superuser (called 'hbase' by default)". What does this mean? I was simply running the backup command above as a regular user with root access. Please advise.

I also tried changing the permissions on the HDFS path (tmp/full), but it made no difference.

I am using Kerberos, so the incremental backup worked for me after running kinit as the principal that HBase runs as.
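As a sketch, the Kerberos flow this answer describes might look like the following. The keytab path and principal are assumptions (the path matches the common HDP default, the realm is a placeholder); adjust them to your cluster:

```shell
# Authenticate as the hbase service principal before retrying the backup.
# Keytab location and principal are examples, not taken from the question.
KEYTAB=/etc/security/keytabs/hbase.headless.keytab   # assumed HDP default
PRINCIPAL=hbase@EXAMPLE.COM                          # hypothetical realm

# Only attempt this where the Kerberos client tools are installed.
if command -v kinit >/dev/null 2>&1; then
    kinit -kt "$KEYTAB" "$PRINCIPAL"
    # Re-run the incremental backup with the hbase ticket in place.
    hbase backup create incremental hdfs://12.3.4.56:8020/tmp/full
fi
```

With a valid ticket, the backup procedure runs with the hbase superuser's HDFS permissions, which is what the quoted documentation requires.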


If you are not using Kerberos, switch to the HBase user first (e.g. `su - hbase`).

For a non-Kerberized cluster there is no need to `su`; just set an environment variable:

    export HADOOP_USER_NAME=hbase
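Putting that together, a minimal sketch of the non-Kerberos variant (the backup path mirrors the question; the guard simply makes the snippet a no-op on a machine without the hbase CLI):

```shell
# Impersonate the hbase superuser without switching shells.
# HADOOP_USER_NAME is honored by Hadoop clients only on clusters
# WITHOUT Kerberos (simple authentication).
export HADOOP_USER_NAME=hbase

# Re-run the failing incremental backup as that user.
if command -v hbase >/dev/null 2>&1; then
    hbase backup create incremental hdfs://12.3.4.56:8020/tmp/full
fi
```

Note that this works precisely because simple authentication trusts the client-supplied user name; on a secured cluster you must use kinit instead.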
Could you tell us which HBase and Hadoop versions you are using? I am following the same steps but getting a "backup class not found" error.