Apache Spark: SAP HANA Vora shell can't find org.apache.spark.launcher.Main

I am currently taking my first steps with SAP HANA Vora on Cloudera Express 5.5.0.

The Vora server is up and running, and I now want to use the Vora Spark shell, but this is what I get:

sh start-spark-shell.sh 
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/Main
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.Main
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.spark.launcher.Main.  Program will exit.
This is what my environment looks like:

export LD_LIBRARY_PATH=/opt/cloudera/parcels/CDH/lib/hadoop/lib/native
export JAVA_HOME=/usr/java/default
export HADOOP_PARCEL_PATH=/opt/cloudera/parcels/CDH
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SPARK_HOME=/usr/lib/spark
export SPARK_CONF_DIR=$SPARK_HOME/conf
export PATH=$PATH:$SPARK_HOME/bin

SPARK_DIST_CLASSPATH=$SPARK_HOME/lib/spark-assembly.jar
SPARK_DIST_CLASSPATH=$SPARK_DIST_CLASSPATH:$HADOOP_PARCEL_PATH/lib/hadoop/lib/*
SPARK_DIST_CLASSPATH=$SPARK_DIST_CLASSPATH:$HADOOP_PARCEL_PATH/lib/hadoop/*
SPARK_DIST_CLASSPATH=$SPARK_DIST_CLASSPATH:$HADOOP_PARCEL_PATH/lib/hadoop-hdfs/lib/*
SPARK_DIST_CLASSPATH=$SPARK_DIST_CLASSPATH:$HADOOP_PARCEL_PATH/lib/hadoop-hdfs/*
SPARK_DIST_CLASSPATH=$SPARK_DIST_CLASSPATH:$HADOOP_PARCEL_PATH/lib/hadoop-mapreduce/lib/*
SPARK_DIST_CLASSPATH=$SPARK_DIST_CLASSPATH:$HADOOP_PARCEL_PATH/lib/hadoop-mapreduce/*
SPARK_DIST_CLASSPATH=$SPARK_DIST_CLASSPATH:$HADOOP_PARCEL_PATH/lib/hadoop-yarn/lib/*
SPARK_DIST_CLASSPATH=$SPARK_DIST_CLASSPATH:$HADOOP_PARCEL_PATH/lib/hadoop-yarn/*
SPARK_DIST_CLASSPATH=$SPARK_DIST_CLASSPATH:$HADOOP_PARCEL_PATH/lib/hive/lib/*
SPARK_DIST_CLASSPATH=$SPARK_DIST_CLASSPATH:$HADOOP_PARCEL_PATH/lib/flume-ng/lib/*
SPARK_DIST_CLASSPATH=$SPARK_DIST_CLASSPATH:$HADOOP_PARCEL_PATH/lib/parquet/lib/*
SPARK_DIST_CLASSPATH=$SPARK_DIST_CLASSPATH:$HADOOP_PARCEL_PATH/lib/avro/lib/*
export SPARK_DIST_CLASSPATH
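
As a quick sanity check (a sketch only, reusing the jar path from the classpath above), you can confirm that the assembly jar exists and actually contains the class named in the error:

# Verify the Spark assembly jar referenced in SPARK_DIST_CLASSPATH exists
ls -l $SPARK_HOME/lib/spark-assembly.jar

# List the jar contents and look for the launcher class from the error message
# (uses the JDK's jar tool; unzip -l would work as well)
jar tf $SPARK_HOME/lib/spark-assembly.jar | grep org/apache/spark/launcher/Main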
Solved

All that was needed was to upgrade Java from JDK 6 to JDK 7. Make sure the following environment variables are set (check these values against your installation):
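
For illustration, a minimal sketch of that switch, assuming the Cloudera-bundled JDK 7 mentioned further down is present (adjust the paths to your own installation):

# Point JAVA_HOME at a JDK 7 install instead of the JDK 6 default
# (the directory is an example; use whatever JDK 7 exists on your host)
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera
export PATH=$JAVA_HOME/bin:$PATH

# Confirm the shell now resolves Java 7
java -version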

Thank you for the answer. I found that JDK 7 was already installed under /usr/java/jdk1.7.0_67-cloudera; I believe this is one of the steps of the Cloudera Manager installation. At the time of writing, the minimum supported Java version for CDH 5.3, 5.5 and 5.6 is 1.7.0_55 (see, for example).

I tried the standard (non-Vora) Spark shell and ran into the same problem on CDH, so JDK 7 is also required for the standard Spark shell. The Vora Spark shell script

$VORA_SPARK_HOME/bin/start-spark-shell.sh
…simply adds the Vora datasources jar as an additional library.
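
Conceptually that is roughly equivalent to launching the standard shell with the Vora jar passed in (a sketch only; the jar name and location below are placeholders, use whatever your Vora installation ships):

# Roughly what the Vora start script does: start the standard spark-shell
# with the Vora datasources jar added to the classpath.
# The jar path is a placeholder for illustration.
$SPARK_HOME/bin/spark-shell --jars /path/to/spark-sap-datasources-assembly.jar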

FYI, here is an example with the standard Spark shell on Cloudera CDH:

~> cd /usr/java
/usr/java> ls -l
total 8
lrwxrwxrwx 1 lroot root   16 Dec 17  2015 default -> /usr/java/latest
drwxr-xr-x 9 lroot root 4096 Dec 17  2015 jdk1.6.0_31
drwxr-xr-x 8 lroot root 4096 Dec 17  2015 jdk1.7.0_67-cloudera
lrwxrwxrwx 1 lroot root   21 Dec 17  2015 latest -> /usr/java/jdk1.6.0_31
/usr/java> export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera
/usr/java> export SPARK_HOME=/usr/lib/spark
/usr/java> cd $SPARK_HOME
/usr/lib/spark> ./bin/spark-shell
FYI, I also have Vora on Hortonworks. Java 7 was already on the PATH there via the /usr/bin/java symbolic link, so this just worked:

source /etc/vora/vora-env.sh
$VORA_SPARK_HOME/bin/start-spark-shell.sh
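
If you want to double-check which JDK that symlink actually resolves to before relying on it, something like this works (a sketch):

# Resolve the /usr/bin/java symlink to see which JDK it points at
readlink -f /usr/bin/java

# And confirm the version it reports is a 1.7.x build
java -version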