Apache Kafka producer configuration error


I am referring to version 0.9.0.0 of Apache Kafka. According to the producer configuration documentation:

I need to specify the list of brokers with the following property:

props.put("bootstrap.servers", "localhost:9092")
This is my producer class:

  def main(args: Array[String]) {
    //val conf = new SparkConf().setAppName("VPP metrics producer")
    //val sc = new SparkContext(conf)

    val props: Properties = new Properties()
    props.put("bootstrap.servers", "localhost:9092")
    props.put("key.serializer", "kafka.serializer.StringEncoder")
    props.put("value.serializer", "kafka.serializer.StringEncoder")

    val config = new ProducerConfig(props)
    val producer = new Producer[String, String](config)

    1 to 10000 foreach {
      case i => 
        val jsonStr = getRandomTsDataPoint().toJson.toString()
        println(s"sending message $i to kafka")
        producer.send(new KeyedMessage[String, String]("test_topic", jsonStr))
        println(s"sent message $i to kafka")
    }
  }
These are my dependencies:

object Dependencies {
  val resolutionRepos = Seq(
    "Spray Repository" at "http://repo.spray.cc/"
  )

  object V {
    val spark     = "1.6.0"
    val kafka     = "0.9.0.0"
    val jodaTime  = "2.7"
    val sprayJson = "1.3.2"
    // Add versions for your additional libraries here...
  }

  object Libraries {
    val sparkCore   = "org.apache.spark"           %% "spark-core"            % V.spark 
    val kafka       = "org.apache.kafka"           %% "kafka"                 % V.kafka
    val jodaTime    = "joda-time"                  % "joda-time"              % V.jodaTime
    val sprayJson   = "io.spray"                   %% "spray-json"            % V.sprayJson
  }
}
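For completeness, a sketch (hypothetical, assuming a standard sbt build with this `Dependencies` object in `project/Dependencies.scala`) of how these definitions are typically wired into `build.sbt`:

```scala
// build.sbt sketch (hypothetical wiring, not from the question)
import Dependencies._

lazy val root = (project in file("."))
  .settings(
    resolvers ++= resolutionRepos,
    libraryDependencies ++= Seq(
      Libraries.sparkCore,
      Libraries.kafka,
      Libraries.jodaTime,
      Libraries.sprayJson
    )
  )
```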
As you can see, I am using version 0.9.0.0 of Apache Kafka. When I try to run the producer class, I get the following error:

Joes-MacBook-Pro:spark-kafka-producer joe$ java -cp target/scala-2.11/spark-example-project-0.1.0-SNAPAHOT.jar com.eon.vpp.MetricsProducer
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Missing required property 'metadata.broker.list'
    at scala.Predef$.require(Predef.scala:219)
    at kafka.utils.VerifiableProperties.getString(VerifiableProperties.scala:177)
    at kafka.producer.ProducerConfig.<init>(ProducerConfig.scala:66)
    at kafka.producer.ProducerConfig.<init>(ProducerConfig.scala:56)
    at com.eon.vpp.MetricsProducer$.main(MetricsProducer.scala:45)
    at com.eon.vpp.MetricsProducer.main(MetricsProducer.scala)

Why is this happening? I even verified the contents of the jar file, and it uses version 0.9.0.0 of Apache Kafka! (kafka_2.11-0.9.0.0.jar)

Spark 1.6.0 does not currently support Kafka 0.9. You will have to wait for Spark 2.0.0.
Check this issue:

Does it work if you replace bootstrap.servers with metadata.broker.list?
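Some context behind that suggestion (my reading of the Kafka 0.9 API, not stated explicitly in the question): the legacy Scala producer being used here (`kafka.producer.Producer` with `kafka.producer.ProducerConfig`) ignores the new-style keys and requires `metadata.broker.list` and `serializer.class`; `bootstrap.servers` and `key.serializer`/`value.serializer` are only read by the new `org.apache.kafka.clients.producer.KafkaProducer`. A minimal sketch of the properties the old API expects:

```scala
import java.util.Properties

// Properties for the legacy kafka.producer.Producer API (Kafka 0.9).
// Host/port taken from the question; everything else is the old-style key names.
val props = new Properties()
props.put("metadata.broker.list", "localhost:9092")             // not "bootstrap.servers"
props.put("serializer.class", "kafka.serializer.StringEncoder") // value encoder
props.put("key.serializer.class", "kafka.serializer.StringEncoder")
// val config = new ProducerConfig(props)  // now finds the required property
```

Alternatively, keeping `bootstrap.servers` and switching to `org.apache.kafka.clients.producer.KafkaProducer` (with `org.apache.kafka.common.serialization.StringSerializer`) would also resolve the mismatch, since that client is the one the linked producer-configuration docs describe.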