Hadoop: cannot run "sqoop job --exec" in Oozie


I need some advice. I am trying to run a sqoop job from Oozie, but it suddenly gets killed, and this warning shows up in oozie-error.log:

2018-01-21 17:30:12,473  WARN SqoopActionExecutor:523 - SERVER[edge01.domain.com] USER[linknet] GROUP[-] TOKEN[] APP[sqoop-wf] JOB[0000006-180121122345026-oozie-link-W] ACTION[0000006-180121122345026-oozie-link-W@sqoop-node] Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
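The launcher only reports exit code [1] here, without the underlying Sqoop error. To see the real cause, the launcher's stdout/stderr usually has to be pulled from YARN; a minimal sketch, assuming the application ID is read from the Oozie console or from oozie job -info (the ID below is a placeholder, not from this job):

# Find the launcher's YARN application ID for the failed action, then fetch its logs
oozie job -info 0000006-180121122345026-oozie-link-W
yarn logs -applicationId application_XXXXXXXXXXXXX_XXXX | less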
job.properties

nameNode=hdfs://hadoop01.domain.com:8020
jobTracker=hadoop01.domain.com:18032
queueName=default
oozie.use.system.libpath=true
examplesRoot=examples
oozie.libpath=${nameNode}/share/lib/oozie
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/sqoop
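For context, a workflow with these properties would typically be submitted like this (a sketch; the Oozie server URL and port are assumptions, not taken from the post):

# Submit the workflow using the job.properties above (Oozie URL is assumed)
oozie job -oozie http://edge01.domain.com:11000/oozie -config job.properties -run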
workflow.xml

<workflow-app xmlns="uri:oozie:workflow:0.2" name="sqoop-wf">
    <start to="sqoop-node"/>

    <action name="sqoop-node">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data/sqoop"/>
                <mkdir path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data"/>
            </prepare>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <command>job --exec ingest_cpm_alarm</command>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>

    <kill name="fail">
        <message>Sqoop failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
I can run this sqoop job successfully on its own, but not from the Oozie scheduler. Also, the jar file postgresql-42.1.4.jar and everything under $SQOOP_HOME/lib have already been copied into the libpath directory /share/lib/oozie.
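For reference, a minimal sketch of how those jars could be copied into that libpath on HDFS (the destination follows oozie.libpath above; the local file locations are assumptions):

# Copy the PostgreSQL JDBC driver and Sqoop's jars into the Oozie libpath on HDFS
hdfs dfs -put postgresql-42.1.4.jar /share/lib/oozie/
hdfs dfs -put $SQOOP_HOME/lib/*.jar /share/lib/oozie/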

Oozie and Sqoop are on the same server. In my sqoop-site.xml I have only set these parameters:

sqoop.metastore.client.enable.autoconnect=true
sqoop.metastore.client.record.password=true
sqoop.metastore.client.record.password=true
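For reference, these entries normally live in sqoop-site.xml in Hadoop's XML property format; a minimal sketch with the values listed above:

<configuration>
  <property>
    <name>sqoop.metastore.client.enable.autoconnect</name>
    <value>true</value>
  </property>
  <property>
    <name>sqoop.metastore.client.record.password</name>
    <value>true</value>
  </property>
</configuration>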

Am I missing something?

Problem solved: I was missing sqoop-site.xml, which has to be available in the same workflow directory on HDFS.
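In other words, sqoop-site.xml has to sit next to workflow.xml under the application path. A minimal sketch of the copy (the path follows oozie.wf.application.path above, with the user linknet taken from the error log):

# Place sqoop-site.xml in the same HDFS directory as workflow.xml
hdfs dfs -put sqoop-site.xml /user/linknet/examples/apps/sqoop/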

This post had a similar issue:

Thanks.
