
Sqoop free-form query import to HBase is not working

I have imported a table into HBase with Sqoop as follows:

sqoop import --connect jdbc:mysql://${mysql server address}/test --username root --password admin --table Student --hbase-create-table --hbase-table Student --column-family i

Next, I tried to get a free-form query to work as well. Somehow the Sqoop command below did not work as expected: nothing was imported from the source table into the target HBase table.

sqoop import --connect jdbc:mysql://${mysql server address}/test --username root --password admin --query 'SELECT id, name from Student where $CONDITIONS' --split-by Student.id --hbase-create-table --hbase-table Student --column-family i

Is anything missing from the second Sqoop command? The documentation is quite limited when it comes to HBase imports.
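
One way to confirm that nothing landed in the target table is to check its row count from the HBase shell (count and scan are standard shell commands; 'Student' is the HBase table name used in the commands above):

hbase shell
count 'Student'
scan 'Student', {LIMIT => 5}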

Here is the log from the second command, in case it helps:

13/08/06 21:15:43 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(t1.id), MAX(t1.id) FROM (SELECT * from Student where  (1 = 1) ) AS t1
13/08/06 21:15:46 INFO mapred.JobClient: Running job: job_201308061021_0025
13/08/06 21:15:47 INFO mapred.JobClient:  map 0% reduce 0%
13/08/06 21:19:08 INFO mapred.JobClient:  map 75% reduce 0%
13/08/06 21:19:09 INFO mapred.JobClient:  map 100% reduce 0%
13/08/06 21:19:12 INFO mapred.JobClient: Job complete: job_201308061021_0025
13/08/06 21:19:12 INFO mapred.JobClient: Counters: 17
13/08/06 21:19:12 INFO mapred.JobClient:   Job Counters
13/08/06 21:19:12 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=212866
13/08/06 21:19:12 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/08/06 21:19:13 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/08/06 21:19:13 INFO mapred.JobClient:     Launched map tasks=4
13/08/06 21:19:13 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/08/06 21:19:13 INFO mapred.JobClient:   File Output Format Counters
13/08/06 21:19:13 INFO mapred.JobClient:     Bytes Written=0
13/08/06 21:19:13 INFO mapred.JobClient:   FileSystemCounters
13/08/06 21:19:13 INFO mapred.JobClient:     HDFS_BYTES_READ=441
13/08/06 21:19:13 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=362752
13/08/06 21:19:13 INFO mapred.JobClient:   File Input Format Counters
13/08/06 21:19:13 INFO mapred.JobClient:     Bytes Read=0
13/08/06 21:19:13 INFO mapred.JobClient:   Map-Reduce Framework
13/08/06 21:19:13 INFO mapred.JobClient:     Map input records=4
13/08/06 21:19:13 INFO mapred.JobClient:     Physical memory (bytes) snapshot=428892160
13/08/06 21:19:13 INFO mapred.JobClient:     Spilled Records=0
13/08/06 21:19:13 INFO mapred.JobClient:     CPU time spent (ms)=7730
13/08/06 21:19:13 INFO mapred.JobClient:     Total committed heap usage (bytes)=312672256
13/08/06 21:19:13 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=5353742336
13/08/06 21:19:13 INFO mapred.JobClient:     Map output records=4
13/08/06 21:19:13 INFO mapred.JobClient:     SPLIT_RAW_BYTES=441
13/08/06 21:19:13 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 213.1239 seconds (0 bytes/sec)
13/08/06 21:19:13 INFO mapreduce.ImportJobBase: Retrieved 4 records.
--split-by Student.id
should be
--split-by id

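
For reference, the second command with that single change applied would look like this (a sketch only, reusing the ${mysql server address} placeholder and the credentials from the question):

sqoop import --connect jdbc:mysql://${mysql server address}/test --username root --password admin --query 'SELECT id, name from Student where $CONDITIONS' --split-by id --hbase-create-table --hbase-table Student --column-family i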