IBM Cloud: Class org.apache.oozie.action.hadoop.SparkMain not found on BigInsights


I'm trying to execute a Spark action through Oozie on a BigInsights for Apache Hadoop (Basic) cluster, but the job fails with "Class not found: org.apache.oozie.action.hadoop.SparkMain".

The workflow.xml looks like this:

<workflow-app xmlns='uri:oozie:workflow:0.5' name='SparkWordCount'>
 <start to='spark-node' />
  <action name='spark-node'>
   <spark xmlns="uri:oozie:spark-action:0.1">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <master>${master}</master>
    <name>Spark-Wordcount</name>
    <class>org.apache.spark.examples.WordCount</class>
    <jar>${hdfsSparkAssyJar},${hdfsWordCountJar}</jar>
    <spark-opts>--conf spark.driver.extraJavaOptions=-Diop.version=4.2.0.0</spark-opts>
    <arg>${inputDir}/FILE</arg>
    <arg>${outputDir}</arg>
   </spark>
   <ok to="end" />
   <error to="fail" />
  </action>
  <kill name="fail">
   <message>Workflow failed, error
    message[${wf:errorMessage(wf:lastErrorNode())}]
   </message>
  </kill>
 <end name='end' />
</workflow-app>
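For reference, a job.properties that supplies the parameters referenced above might look roughly like this. This is only a sketch: the hostnames, ports, and HDFS paths below are placeholders, not values from the question.

# Hypothetical job.properties for the workflow above
nameNode=hdfs://mycluster-master:8020
jobTracker=mycluster-master:8050
master=yarn-cluster
oozie.wf.application.path=${nameNode}/user/biadmin/spark-wordcount
hdfsSparkAssyJar=${nameNode}/iop/apps/4.2.0.0/spark/jars/spark-assembly.jar
hdfsWordCountJar=${nameNode}/user/biadmin/lib/spark-examples.jar
inputDir=/user/biadmin/input
outputDir=/user/biadmin/output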
I looked for SparkMain in the Spark assembly:

$ hdfs dfs -get /iop/apps/4.2.0.0/spark/jars/spark-assembly.jar
$ jar tf spark-assembly.jar | grep -i SparkMain
and here:

$ jar tf /usr/iop/4.2.0.0/spark/lib/spark-examples-1.6.1_IBM_4-hadoop2.7.2-IBM-12.jar | grep SparkMain

I have seen another similar question, but this question is specifically about BigInsights on cloud.

The issue was resolved by adding the following property to the workflow's job configuration:

<property>
    <name>oozie.use.system.libpath</name>
    <value>true</value>
</property>
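Equivalently, the switch can be set as a line in job.properties and the workflow submitted with the Oozie CLI. A minimal sketch, assuming the Oozie server URL (the host and port below are placeholders):

# in job.properties
oozie.use.system.libpath=true

# submit the workflow (hypothetical Oozie server URL)
$ oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run

Setting oozie.use.system.libpath=true makes Oozie add its sharelib (which contains oozie-sharelib-spark and therefore SparkMain) to the action's classpath.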

That is, setting oozie.use.system.libpath to true. I should have RTFM'd properly.
