Apache Spark: Spark Streaming + Kafka sbt assembly

Tags: apache-spark, apache-kafka, sbt, spark-streaming, spark-streaming-kafka

I have a Spark Streaming + Kafka example. It runs fine in the IDE, but when I try to compile it from the console with `sbt compile`, I get an error.

Main class:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

val conf = new SparkConf().setMaster("local[*]").setAppName("KafkaReceiver")
val ssc = new StreamingContext(conf, Seconds(5))

// Receiver-based stream: ZooKeeper quorum, consumer group, Map(topic -> thread count)
val kafkaStream1 = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("t1" -> 5))
//val kafkaStream2 = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("topic2" -> 5))

kafkaStream1.print()
ssc.start()
ssc.awaitTermination()
Error message:

[error] bad symbolic reference. A signature in package.class refers to type compileTimeOnly
[error] in package scala.annotation which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling package.class.
Reference to method any2ArrowAssoc in object Predef should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error]   val kafkaStream1 = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("t1" -> 5))
[error]                                                                                                           ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
sbt:


Do you have any idea how to fix it?

Could you share your build.sbt? One common cause of the "bad symbolic reference" error is a Scala version mismatch; see this page for more details on the issue. Also make sure the Scala version you are using is the same one your Spark build expects.

This might give some clues. Should I use Scala 2.10? That depends on the Spark version you are using: for example, Spark 1.3.1 is built against Scala 2.10.4, while Spark 2.0 uses Scala 2.11.x, so check the documentation for your Spark version. I use Spark 2.0.0 with Scala 2.11.8 and updated my build.sbt following that guide. It works in the IDE, but `sbt compile` from the command line still gives the same error...
name := "test"

val sparkVersion = "2.0.0"

lazy val commonSettings = Seq(
  organization := "com.test",
  version := "1.0",
  scalaVersion := "2.11.8",
  test in assembly := {}
)

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-streaming_2.11" % sparkVersion,
  "org.apache.spark" % "spark-streaming-kafka-0-8_2.11" % sparkVersion
)
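A likely cause here (an observation, not confirmed in the thread): `commonSettings` is defined but never attached to any project, so `scalaVersion := "2.11.8"` never takes effect and sbt falls back to its default Scala version, compiling against the `_2.11` Spark artifacts with a mismatched Scala and producing exactly this "bad symbolic reference" error. A minimal sketch of a build.sbt that actually applies the settings (the `root` project name is introduced here for illustration):

```scala
// build.sbt -- sketch, assuming sbt 0.13.x with the sbt-assembly plugin available
name := "test"

val sparkVersion = "2.0.0"

lazy val commonSettings = Seq(
  organization := "com.test",
  version := "1.0",
  scalaVersion := "2.11.8"  // must be applied to a project, or sbt uses its default Scala
)

// Attach the settings to the root project so scalaVersion takes effect
lazy val root = (project in file("."))
  .settings(commonSettings)
  .settings(
    libraryDependencies ++= Seq(
      // %% appends the Scala binary version (_2.11) automatically,
      // which keeps the artifacts in sync with scalaVersion
      "org.apache.spark" %% "spark-streaming" % sparkVersion,
      "org.apache.spark" %% "spark-streaming-kafka-0-8" % sparkVersion
    )
  )
```

With this in place, `sbt scalaVersion` should report 2.11.8; if it still reports 2.10.x, the setting is not being picked up by the project being compiled.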