
Hadoop: Importing Data into HDFS with Sqoop2


Following the official guide, I successfully created a job.

However, when I execute the command

    submission start --jid 1

I get the following error message:

Exception has occurred during processing command 
Server has returned exception: Exception: java.lang.Throwable Message: GENERIC_JDBC_CONNECTOR_0002:Unable to execute the SQL statement
Here is my job info:

Database configuration

Schema name: invoice
Table name: ds_msg_log
Table SQL statement:
Table column names: *
Partition column name:
Boundary query:

Output configuration

Storage type: HDFS
Output format: TEXT_FILE
Output directory: /user/root/ds_msg_log

Throttling resources

Extractors:
Loaders:
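
For reference, a job with the configuration above is created interactively in the 1.99.x shell; a sketch, assuming the connection id is 1:

    sqoop:000> create job --xid 1 --type import

The shell then prompts for each of the fields listed above.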
Since the official guide says nothing about how to set the values above, does anyone know what is wrong with my job settings?

Here is the log:

Stack trace:
     at  org.apache.sqoop.connector.jdbc.GenericJdbcExecutor (GenericJdbcExecutor.java:59)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:155)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:48)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:37)  
     at  org.apache.sqoop.framework.FrameworkManager (FrameworkManager.java:447)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:112)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:98)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:68)  
     at  org.apache.sqoop.server.v1.SubmissionServlet (SubmissionServlet.java:44)  
     at  org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:63)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:637)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:717)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)  
     at  org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)  
     at  org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)  
     at  org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)  
     at  org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:102)  
     at  org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)  
     at  org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)  
     at  org.apache.coyote.http11.Http11Processor (Http11Processor.java:859)  
     at  org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:602)  
     at  org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)  
     at  java.lang.Thread (Thread.java:724)  
Caused by: Exception: java.lang.Throwable Message: ERROR: schema "invoice" does not exist
  Position: 46
Stack trace:
     at  org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:2102)  
     at  org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:1835)  
     at  org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:257)  
     at  org.postgresql.jdbc2.AbstractJdbc2Statement (AbstractJdbc2Statement.java:500)  
     at  org.postgresql.jdbc2.AbstractJdbc2Statement (AbstractJdbc2Statement.java:374)  
     at  org.postgresql.jdbc2.AbstractJdbc2Statement (AbstractJdbc2Statement.java:254)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcExecutor (GenericJdbcExecutor.java:56)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:155)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:48)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:37)  
     at  org.apache.sqoop.framework.FrameworkManager (FrameworkManager.java:447)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:112)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:98)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:68)  
     at  org.apache.sqoop.server.v1.SubmissionServlet (SubmissionServlet.java:44)  
     at  org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:63)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:637)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:717)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)  
     at  org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)  
     at  org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)  
     at  org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)  
     at  org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:102)  
     at  org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)  
     at  org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)  
     at  org.apache.coyote.http11.Http11Processor (Http11Processor.java:859)  
     at  org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:602)  
     at  org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)  
     at  java.lang.Thread (Thread.java:724)  
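
The Caused by section is the actual failure: PostgreSQL reports that schema "invoice" does not exist, so the statement the generic JDBC connector builds against invoice.ds_msg_log fails before the import even starts. A minimal sketch to check which schemas the database really exposes, using plain JDBC; the URL and credentials are placeholders for whatever your Sqoop2 connection uses:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ListSchemas {
        public static void main(String[] args) throws Exception {
            // Placeholders: use the JDBC URL and credentials from your Sqoop2 connection.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/yourdb", "user", "password");
                 Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery(
                     "SELECT schema_name FROM information_schema.schemata")) {
                while (rs.next()) {
                    // Prints each schema visible to this account.
                    System.out.println(rs.getString(1));
                }
            }
        }
    }

If "invoice" is missing from that list (or differs in case), correcting the Schema name in the job is likely the real fix rather than any of the other fields.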
The value "*" in "Table column names" is not required, since the default is all columns. It would also be helpful if you could share the server logs to see what is going wrong there.

You can get additional information, such as the entire stack trace of the exception, by switching the shell to verbose mode:
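
    sqoop:000> set option --name verbose --value true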

Table column names: *

You can't use *; use comma-separated column names instead. You should specify a column name as the partition column, and you can use any column for partitioning (it is used to separate/split the import job into multiple tasks for parallel processing). You can leave the unmentioned parameters null. Give the integers used to choose HDFS (storage) and the file format (SEQUENCE_FILE / TEXT_FILE).
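
A sketch of applying those fixes in place with the shell's update command, which re-prompts every field; col_a and col_b below are hypothetical stand-ins for real columns of your table:

    sqoop:000> update job --jid 1
    ...
    Table column names: col_a, col_b
    Partition column name: col_a
    ...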

Below is a sample job that was created (show job --jid <your job id>):

sqoop:000> show job --jid 146

1 job(s) to show:

Job with id 146 and name ImportJob (created 10/10/2013 3:46 PM, updated 10/10/2013 3:46 PM)

Using Connection id 149 and Connector id 1

Database configuration

Schema name: xx
Table name: xxx
Table SQL statement:
Table column names: one, two, thre
Partition column name: one
Boundary query:

Output configuration

Storage type: HDFS
Output format: TEXT_FILE
Output directory: /devanms/

Throttling resources

Extractors:
Loaders:

Here is my blog on the Sqoop Java client:
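
A minimal sketch of starting the same job programmatically, assuming the 1.99.x Java client API; the server URL and job id are placeholders:

    import org.apache.sqoop.client.SqoopClient;
    import org.apache.sqoop.model.MSubmission;

    public class StartImportJob {
        public static void main(String[] args) {
            // Placeholder URL: the same REST endpoint the shell's "set server" points at.
            SqoopClient client = new SqoopClient("http://localhost:12000/sqoop/");
            // Equivalent of "submission start --jid 1" in the shell.
            MSubmission submission = client.startSubmission(1);
            System.out.println("Submission status: " + submission.getStatus());
        }
    }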