Apache Spark 1.4.1 can't start the master node (without Hadoop)


I am trying to run Spark 1.4.1 on my EC2 instance without Hadoop (spark-1.4.1-bin-without-hadoop.tgz).

Here is the full log:

Spark Command: /usr/lib/jvm/jre/bin/java -cp /home/ec2-user/spark-1.4.1-bin-without-hadoop/sbin/../conf/:/home/ec2-user/spark-1.4.1-bin-without-hadoop/lib/spark-assembly-1.4.1-hadoop2.2.0.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip ip-172-31-24-107 --port 7077 --webui-port 8080
========================================
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2625)
        at java.lang.Class.getMethod0(Class.java:2866)
        at java.lang.Class.getMethod(Class.java:1676)
        at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 6 more
Any suggestions would be appreciated.

Note: I was able to bypass this problem by installing spark-1.4.1-bin-hadoop2.6.tgz instead.


Thanks

I bypassed this problem by installing spark-1.4.1-bin-hadoop2.6.tgz instead.
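Alternatively, the "without-hadoop" build can be made to work: it deliberately ships without Hadoop's jars, which is also where SLF4J comes from, so `org.slf4j.Logger` is missing from the classpath until an external Hadoop installation is wired in via `SPARK_DIST_CLASSPATH`. A minimal sketch of `conf/spark-env.sh`, assuming a separate Hadoop installation with the `hadoop` command on the PATH:

```shell
# conf/spark-env.sh -- sketch, assuming Hadoop is installed separately
# and the `hadoop` command is on the PATH.
# The "hadoop free" Spark build bundles no Hadoop (or SLF4J) jars;
# point Spark at Hadoop's classpath so they are found at launch.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
```

After adding this, `sbin/start-master.sh` should find `org/slf4j/Logger` and the master should start normally.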