Java — How to pass JAR files to a shell script in an Oozie shell action node
Hi, I am getting the following error when running a Java program from a script that is executed inside an Oozie shell action workflow:
Stdoutput 2015-08-25 03:36:02,636 INFO [pool-1-thread-1] (ProcessExecute.java:68) - Exception in thread "main" java.io.IOException: Error opening job jar: /tmp/jars/first.jar
Stdoutput 2015-08-25 03:36:02,636 INFO [pool-1-thread-1] (ProcessExecute.java:68) - at org.apache.hadoop.util.RunJar.main(RunJar.java:124)
Stdoutput 2015-08-25 03:36:02,636 INFO [pool-1-thread-1] (ProcessExecute.java:68) - Caused by: java.io.FileNotFoundException: /tmp/jars/first.jar (No such file or directory)
Stdoutput 2015-08-25 03:36:02,636 INFO [pool-1-thread-1] (ProcessExecute.java:68) - at java.util.zip.ZipFile.open(Native Method)
Stdoutput 2015-08-25 03:36:02,637 INFO [pool-1-thread-1] (ProcessExecute.java:68) - at java.util.zip.ZipFile.<init>(ZipFile.java:215)
Stdoutput 2015-08-25 03:36:02,637 INFO [pool-1-thread-1] (ProcessExecute.java:68) - at java.util.zip.ZipFile.<init>(ZipFile.java:145)
Stdoutput 2015-08-25 03:36:02,637 INFO [pool-1-thread-1] (ProcessExecute.java:68) - at java.util.jar.JarFile.<init>(JarFile.java:154)
Stdoutput 2015-08-25 03:36:02,637 INFO [pool-1-thread-1] (ProcessExecute.java:68) - at java.util.jar.JarFile.<init>(JarFile.java:91)
Stdoutput 2015-08-25 03:36:02,640 INFO [pool-1-thread-1] (ProcessExecute.java:68) - at org.apache.hadoop.util.RunJar.main(RunJar.java:122)
Exit code of the Shell command 1
workflow.xml
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<workflow-app name="test" xmlns="uri:oozie:workflow:0.4">
    <start to="first" />
    <action name="first">
        <shell xmlns="uri:oozie:shell-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <exec>script</exec>
            <argument>-type mine</argument>
            <argument>-cfg config.cfg</argument>
            <file>script</file>
            <file>${EXEC}#${EXEC}</file>
            <file>config.cfg</file>
            <file>first.jar#first.jar</file>
            <file>second.jar#second.jar</file>
        </shell>
        <ok to="end" />
        <error to="fail" />
    </action>
    <kill name="fail">
        <message>Workflow failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end" />
</workflow-app>
From inside execution.jar I pick up first.jar from the /tmp/jars directory; the reason for using that directory is that it does not cause any permission problems for the Oozie workflow user.
Any guidance/suggestions would be very helpful.
My question is:
- I want to execute a script in an Oozie shell action node
- the script executed by the shell action runs a Java program
- depending on an argument, the Java program runs either first.jar or second.jar
- An Oozie action cannot reference anything on the local filesystem of the node it runs on; it can only reference content on HDFS
- The java binary command, on the other hand, can only reference files on the local filesystem

Translating this wisdom into pragmatic actions:
1. Upload the JARs to an HDFS directory.
2. Tell Oozie to download the JARs into the YARN container used for execution, e.g. <file>/user/johndoe/some/dir/first.jar</file>
(the # is useless if you don't need to rename the file)
3. Change the shell script to reference the JARs in the current working directory, e.g. java -cp ./first.jar whatever
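Applied to the workflow above, steps 2 and 3 might look like the sketch below. The HDFS path /user/johndoe/some/dir is the answer's placeholder, not a path from the question:

```xml
<!-- sketch: each <file> entry tells Oozie to localize that HDFS file
     into the YARN container's working directory before the script runs -->
<exec>script</exec>
<file>script</file>
<file>/user/johndoe/some/dir/first.jar</file>
<file>/user/johndoe/some/dir/second.jar</file>
```

The script then refers to the localized copies with a relative path, e.g. `java -cp ./first.jar ...`; step 1 is a one-time upload from an edge node with `hdfs dfs -put first.jar /user/johndoe/some/dir/`.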
@Nitin Mahesh, were you able to fix this issue?
script:
#!/bin/bash
# user on whose behalf the program runs
EXECUTING_USER="user1"
# get the start time
NOW=$(date +"%T")
# get the host name
HOST="$HOSTNAME"
# build the argument list as an array so each word reaches Java as a separate argument
ARGS=("$@" -user "$EXECUTING_USER" -startTime "$NOW")
echo "Passing the following arguments : ${ARGS[*]}"
# execution.jar must be listed in a <file> element so Oozie localizes it
# into the container's working directory
java -cp ./execution.jar com.hadoop.test.Main "${ARGS[@]}"
exit $?
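A common pitfall in such launcher scripts is how the arguments reach the Java program: a quoted combined string like "$ARGUMENTSTRING" arrives as a single argv entry, while unquoted expansion word-splits (and also glob-expands) it. A small standalone sketch, with a hypothetical count_args helper standing in for the Java program:

```shell
#!/bin/bash
# hypothetical helper standing in for the Java program: prints argc
count_args() { echo $#; }

ARGUMENTSTRING="-type mine -user user1"
count_args "$ARGUMENTSTRING"   # → 1 : the whole string is one argument
count_args $ARGUMENTSTRING     # → 4 : word-split into four arguments

# an array keeps word boundaries explicit and survives values with spaces
ARGS=(-type mine -user user1)
count_args "${ARGS[@]}"        # → 4
```

Using an array (or forwarding "$@" directly) is the robust choice, since it behaves correctly even when individual argument values contain spaces.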