org.apache.hadoop.hbase.RegionTooBusyException

I am trying to load 3 billion records (ORC files) from Hive into HBase using the Hive-HBase integration.

Hive CREATE TABLE DDL:

CREATE EXTERNAL TABLE cs.account_dim_hbase (
  `account_number` string,
  `encrypted_account_number` string,
  `affiliate_code` string,
  `alternate_party_name` string,
  `mod_account_number` string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,account_dim:encrypted_account_number,account_dim:affiliate_code,account_dim:alternate_party_name,account_dim:mod_account_number")
TBLPROPERTIES ("hbase.table.name" = "default:account_dim");
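
Under this mapping, `account_number` (the first Hive column, mapped to :key) becomes the HBase rowkey and the remaining columns land in the account_dim column family. A quick spot check of that layout from the HBase shell could look like this (an illustrative sketch only, not part of the load):

# Peek at a single row of the target table to confirm the rowkey / column-family layout
scan 'default:account_dim', {LIMIT => 1}
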
Hive INSERT into the HBase-backed table; I am running 128 INSERT commands similar to the example below:

insert into table cs.account_dim_hbase
select account_number, encrypted_account_number, affiliate_code, alternate_party_name, mod_account_number
from cds.account_dim
where mod_account_number = 1;
When I try to run all 128 inserts at the same time, I get the following error:

Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 438 actions: org.apache.hadoop.hbase.RegionTooBusyException: Over memstore limit=2.0G, regionName=jhgjhsdgfjgsdjf, server=cldf0007.com
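
The 2.0G in the message is the per-region blocking limit, which is roughly hbase.hregion.memstore.flush.size multiplied by hbase.hregion.memstore.block.multiplier (the cluster-wide memstore budget is a separate setting, hbase.regionserver.global.memstore.size, in hbase-site.xml). As a hedged sketch only, the per-region properties can be overridden per table from the HBase shell; the values below are placeholders to tune, not recommendations:

# Illustrative per-table overrides of the properties behind the "Over memstore limit" check;
# 268435456 (256 MB) and a block multiplier of 8 are example values, not recommendations.
alter 'default:account_dim', CONFIGURATION => {'hbase.hregion.memstore.flush.size' => '268435456', 'hbase.hregion.memstore.block.multiplier' => '8'}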

Help me fix this, and let me know if I am doing anything wrong. I am on HDP 3; the data is loaded from Hive with an MD5 hash applied to the rowkey field, and the HBase table was created with pre-split regions. Each partition now takes only about 5 minutes to load (it used to take 20 minutes and hit this exception, but that part is fixed now).

create 'users', 'usercf', SPLITS =>
['10000000000000000000000000000000',
'20000000000000000000000000000000',
'30000000000000000000000000000000',
'40000000000000000000000000000000',
'50000000000000000000000000000000',
'60000000000000000000000000000000',
'70000000000000000000000000000000',
'80000000000000000000000000000000',
'90000000000000000000000000000000',
'a0000000000000000000000000000000',
'b0000000000000000000000000000000',
'c0000000000000000000000000000000',
'd0000000000000000000000000000000',
'e0000000000000000000000000000000',
'f0000000000000000000000000000000']
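
For reference, this is roughly how the MD5-hashed rowkey lines up with the 16 hex split points above; a minimal sketch using Hive's built-in md5() function (hashed_account_number is just an alias for illustration; in the real load the hash is what feeds the :key column of the mapped table):

-- Sketch: md5() yields a 32-character hex string, so rowkeys spread evenly
-- across the 16 pre-split regions ('0...' through 'f...').
select md5(account_number) as hashed_account_number,
       encrypted_account_number,
       affiliate_code,
       alternate_party_name
from cds.account_dim
limit 10;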