Sqoop import with a SQL query having a WHERE clause and parallel processing (Hadoop)

I have the following table in MySQL:

Order details:

+---------+------------+-------------------+--------------+
| orderid | order_date | order_customer_id | order_status |
+---------+------------+-------------------+--------------+
| A001    | 10/30/2018 | C003              | Completed    |
| A002    | 10/30/2018 | C005              | Completed    |
| A451    | 11/02/2018 | C376              | Pending      |
| P9209   | 10/30/2018 | C234              | Completed    |
| P92099  | 10/30/2018 | C244              | Pending      |
| P9210   | 10/30/2018 | C035              | Completed    |
| P92398  | 10/30/2018 | C346              | Pending      |
| P9302   | 10/30/2018 | C034              | Completed    |
+---------+------------+-------------------+--------------+
Its description is as follows:

mysql> desc Order_Details_Sankha;
+-------------------+-------------+------+-----+---------+-------+
| Field             | Type        | Null | Key | Default | Extra |
+-------------------+-------------+------+-----+---------+-------+
| orderid           | varchar(20) | NO   | PRI |         |       |
| order_date        | varchar(20) | YES  |     | NULL    |       |
| order_customer_id | varchar(20) | YES  |     | NULL    |       |
| order_status      | varchar(20) | YES  |     | NULL    |       |
+-------------------+-------------+------+-----+---------+-------+
I am using the following Sqoop import with parallel processing:

sqoop import \
  --connect jdbc:mysql://ip-10-0-1-10.ec2.internal/06july2018_new \
  --username labuser \
  --password abc123 \
  --driver com.mysql.jdbc.Driver \
  --query "select * from Order_Details where order_date = '10/30/2018' AND \$CONDITIONS" \
  --target-dir /user/sankha087_gmail_com/outputs/EMP_Sankha_1112201888 \
  --split-by "," \
  --m 3
I am getting the error message below:

18/12/15 17:15:14 WARN security.UserGroupInformation: PriviledgedActionException as:sankha087_gmail_com (auth:SIMPLE) cause:java.io.IOException: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '), MAX(,) FROM (select * from Order_Details_Sankha where order_date = '10/30/201' at line 1
18/12/15 17:15:14 ERROR tool.ImportTool: Import failed: java.io.IOException: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '), MAX(,) FROM (select * from Order_Details_Sankha where order_date = '10/30/201' at line 1
        at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:207)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:305)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:322)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:200)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)

Please advise what needs to change in my import statement.

Sqoop's parallel execution does not happen through a vertical split; it happens through a horizontal split.

--split-by should be a column name, and that column's values should be evenly distributed.
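Why the posted command fails can be seen in the error text: Sqoop substitutes the --split-by value directly into a MIN()/MAX() boundary query, so --split-by "," produces invalid SQL. A minimal shell sketch reproduces the malformed statement (variable names are illustrative, and the "(1 = 1)" substituted for $CONDITIONS is inferred from the logged query, not taken from Sqoop source):

```shell
# Sketch: how the split-boundary query goes wrong with --split-by ","
SPLIT_BY=","
QUERY="select * from Order_Details where order_date = '10/30/2018'"
# Sqoop wraps the user query and takes MIN/MAX of the split column:
BOUNDARY="SELECT MIN(${SPLIT_BY}), MAX(${SPLIT_BY}) FROM (${QUERY} AND (1 = 1)) AS t1"
echo "$BOUNDARY"
# The result contains MIN(,), MAX(,) -- exactly the fragment MySQL rejects.
```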


Refer to: 7.2.4. Controlling Parallelism (Sqoop User Guide).
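Following that advice, a corrected version of the import passes a real, evenly distributed column (here the primary key orderid, which the asker later confirmed worked) to --split-by. This is a sketch of the fixed command, not one verified against the original cluster:

```shell
sqoop import \
  --connect jdbc:mysql://ip-10-0-1-10.ec2.internal/06july2018_new \
  --username labuser \
  --password abc123 \
  --driver com.mysql.jdbc.Driver \
  --query "select * from Order_Details where order_date = '10/30/2018' AND \$CONDITIONS" \
  --target-dir /user/sankha087_gmail_com/outputs/EMP_Sankha_1112201888 \
  --split-by orderid \
  -m 3
```

Note that orderid is a varchar; newer Sqoop releases refuse to split on a text column unless you also pass -Dorg.apache.sqoop.splitter.allow_text_splitter=true.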

If you are not using more than one mapper, you don't need the --split-by clause.

Thanks Gaurang, it worked :). I gave the primary key column in --split-by. But please tell me, in which scenario could I use --split-by ","?

Thanks Mahesh for the answer. I tried giving only the number of mappers, and it then specifically asked for --split-by again.

This is not correct. If you use multiple mappers, you must specify --split-by, because Sqoop cannot work out on its own how to partition the table across multiple mappers. It is only unnecessary when you use a single mapper, -m 1.

Hi Gaurang, you are absolutely right; with your solution I was able to execute it successfully. Apart from that, I ran the import below with --split-by "" and a single mapper, and it worked fine.
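The "horizontal split" the answer mentions can be pictured with a toy calculation: DataDrivenDBInputFormat (visible in the stack trace) queries MIN and MAX of the split column and assigns each mapper a contiguous sub-range of values. A rough shell sketch with invented integer bounds (not Sqoop's actual splitter code):

```shell
# Hypothetical bounds for the split column and the requested mapper count.
MIN=1; MAX=100; MAPPERS=3
# Ceiling of (range size / mappers), so every value lands in some range.
SIZE=$(( (MAX - MIN + MAPPERS) / MAPPERS ))
LO=$MIN
for i in $(seq 1 "$MAPPERS"); do
  HI=$(( LO + SIZE - 1 ))
  if [ "$HI" -gt "$MAX" ]; then HI=$MAX; fi
  # Each mapper gets its own WHERE range in place of $CONDITIONS.
  echo "mapper $i: WHERE split_col >= $LO AND split_col <= $HI"
  LO=$(( HI + 1 ))
done
```

With an evenly distributed column each mapper pulls roughly the same number of rows, which is why the answer stresses choosing such a column for --split-by.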