Apache Spark / Apache Kafka: Spark Streaming exception

Tags: apache-spark, stream, kafka-consumer-api

I created a simple word-count application using the sbt build script listed below:

name := "Spark Kafka Project"

version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.5.2"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.5.2"
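The application source is not shown in the question; a minimal KafkaWordCount consistent with the stack trace (line 20 calling `KafkaUtils.createStream`) might look like the following hypothetical sketch, based on the receiver-based spark-streaming-kafka 1.5.2 API:

```scala
// Hypothetical sketch of KafkaWordCount.scala; the actual source is not shown
// in the question. Uses the receiver-based spark-streaming-kafka 1.5.2 API.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaWordCount {
  def main(args: Array[String]): Unit = {
    // args: <zkQuorum> <group> <topics> <numThreads>, matching the
    // "localhost:2181 a my-first-topic 1" arguments in the submit command
    val Array(zkQuorum, group, topics, numThreads) = args
    val conf = new SparkConf().setAppName("KafkaWordCount")
    val ssc = new StreamingContext(conf, Seconds(2))

    val topicMap = topics.split(",").map((_, numThreads.toInt)).toMap
    // This is the call that fails with NoClassDefFoundError in the trace below
    val lines = KafkaUtils.createStream(ssc, zkQuorum, group, topicMap).map(_._2)
    val wordCounts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
    wordCounts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```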
I am running it with the following command:

$SPARK_HOME/bin/spark-submit --packages org.apache.spark:spark-streaming-kafka_2.10:1.5.2 --class "KafkaWordCount" --master local[4] target/scala-2.10/spark-kafka-project_2.10-1.0.jar localhost:2181 a my-first-topic 1
Running the program returns the following exception:

17/02/07 12:26:10 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.136.8.39, 60153)
17/02/07 12:26:11 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@d3bf39{/metrics/json,null,AVAILABLE}
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:81)
    at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:61)
    at KafkaWordCount$.main(KafkaWordCount.scala:20)
    at KafkaWordCount.main(KafkaWordCount.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
Please help me figure out what the problem is.

Try:
  • launching with an uber jar that includes all the dependencies
  • placing all the dependencies in HDFS and supplying them with the --jars argument
  • placing the dependencies in exactly the same directory on every node and supplying them with the --jars argument
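For the second and third options, the jars are passed to spark-submit as a comma-separated list via `--jars` instead of resolving them with `--packages`. A sketch, with hypothetical jar paths:

```
# Sketch: supply the Kafka integration and its dependencies explicitly.
# The jar file names/paths here are hypothetical examples.
$SPARK_HOME/bin/spark-submit \
  --class "KafkaWordCount" \
  --master local[4] \
  --jars spark-streaming-kafka_2.10-1.5.2.jar,kafka_2.10-0.8.2.1.jar \
  target/scala-2.10/spark-kafka-project_2.10-1.0.jar \
  localhost:2181 a my-first-topic 1
```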

  • Hi FaigB, thanks for the reply. Is there a way to create a fat jar (using the sbt assembly plugin)? I tried it but ran into problems like this: [info] Set current project to Spark Kafka Project (in build file:/home/Kafka Spark sbt/) [error] Not a valid command: assembly [error] Not a valid key: assembly [error] assembly [error] Could you help with this as well?
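  • The `Not a valid command: assembly` error usually means the sbt-assembly plugin has not been registered with the build. A sketch of the setup, assuming an sbt 0.13.x project (the plugin version shown is an assumption; pick one compatible with your sbt release):

```scala
// project/plugins.sbt — registers sbt-assembly so the `assembly` task exists
// (plugin version is an assumption; use one matching your sbt release)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

// build.sbt — optionally mark Spark itself as "provided" so the fat jar only
// bundles the Kafka integration; spark-submit supplies Spark at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.5.2" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.5.2"
```

With the plugin registered, running `sbt assembly` from the project root should produce an assembly jar under target/scala-2.10/ that can be submitted without the --packages flag.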