
Scala: IntelliJ cannot resolve symbol Subscribe for Kafka


I have the following project in IntelliJ. The problem is that Subscribe inside KafkaUtils.createDirectStream is highlighted in red and throws Cannot resolve symbol Subscribe, even though I have added all the Kafka/Spark libraries:

    import org.apache.commons.lang3.RandomStringUtils
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

  def startMetaInfoSubscriber(ssc: StreamingContext, kafkaParams: Map[String, Object], metaInfoTopic: String) {
    // Set a unique Kafka group identifier to metaInformationStream (each stream requires a unique group ID)
    val metaInformationKafkaParamas = kafkaParams ++ Map[String, Object]("group.id" -> RandomStringUtils.randomAlphabetic(10).toUpperCase)

    KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](metaInfoTopic, metaInformationKafkaParamas)
    ).foreachRDD(metaInfoRDD =>
      if (!metaInfoRDD.isEmpty()) {
        println("Saving MetaInformation")
        metaInfoRDD
//        metaInfoRDD.write.mode("append").format("com.databricks.spark.csv").save(s"hdfs://172.16.8.162:8020/user/sparkload/assetgroup/prueba-kafka")
      } else {
        println("There is not any message for topic 'tu-topic'")
      }
    )
  }
Next is my pom.xml:

<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<scala.version>2.11.8</scala.version>
<spark.version>2.3.0</spark.version>
<src.dir>src/main/scala</src.dir>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka_2.11 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>
When I try to compile, I get the following error:

[ERROR] C:\Users\agomez\Desktop\spark-base\spark-kafka-tfm\src\main\scala\spark_load\EjemploApp.scala:90: error: overloaded method value Subscribe with alternatives:
[ERROR]   (topics: java.util.Collection[String],kafkaParams: java.util.Map[String,Object])org.apache.spark.streaming.kafka010.ConsumerStrategy[String,String] <and>
[ERROR]   (topics: java.util.Collection[String],kafkaParams: java.util.Map[String,Object],offsets: java.util.Map[org.apache.kafka.common.TopicPartition,java.lang.Long])org.apache.spark.streaming.kafka010.ConsumerStrategy[String,String] <and>
[ERROR]   (topics: Iterable[String],kafkaParams: scala.collection.Map[String,Object])org.apache.spark.streaming.kafka010.ConsumerStrategy[String,String] <and>
[ERROR]   (topics: Iterable[String],kafkaParams: scala.collection.Map[String,Object],offsets: scala.collection.Map[org.apache.kafka.common.TopicPartition,scala.Long])org.apache.spark.streaming.kafka010.ConsumerStrategy[String,String]
[ERROR]  cannot be applied to (String, scala.collection.immutable.Map[String,Object])
[ERROR]       Subscribe[String, String](metaInfoTopic, metaInformationKafkaParamas)
[ERROR]                ^
[ERROR] one error found

I think the first parameter of Subscribe() should be a collection of topics, so you need to pass the topics as a Seq[String] or an Array[String]. If you only have one topic, just pass it as Seq(metaInfoTopic).

Hope this helps.
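
For illustration, a minimal sketch of both variants against the kafkaParams map from the question (the multi-topic names below are made up):

    // Single topic: wrap the String in a Seq so the call matches the
    // Subscribe(topics: Iterable[String], kafkaParams: Map[String, Object]) overload
    Subscribe[String, String](Seq(metaInfoTopic), metaInformationKafkaParamas)

    // Multiple topics work the same way; an Array[String] also satisfies Iterable[String]
    Subscribe[String, String](Array("topic-a", "topic-b"), metaInformationKafkaParamas)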

Just ignore IntelliJ as long as your project compiles and builds fine. — When I try to compile I get an error because of Subscribe. I edited the question with the error I get when compiling, please check it again.
metaInfoTopic is a String, but every overload of Subscribe expects a collection of topics (plus the kafkaParams Map), which is why the compiler rejects the call. Wrap the topic in a Seq:
KafkaUtils.createDirectStream[String, String](
    ssc,
    PreferConsistent,
    Subscribe[String, String](Seq(metaInfoTopic), metaInformationKafkaParamas)
)
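
As an aside, the commented-out save in the question calls .write directly on the RDD, but .write only exists on DataFrames/Datasets. A minimal sketch of one way to persist the record values as CSV (assuming a SparkSession named spark is available; the HDFS path is the one from the question, and in Spark 2.3 the built-in csv source replaces com.databricks.spark.csv):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder.getOrCreate()
    import spark.implicits._

    // Extract the String values from the ConsumerRecords, then convert to a
    // DataFrame so that .write is available; csv is built in since Spark 2.x
    metaInfoRDD
      .map(record => record.value)
      .toDF("value")
      .write
      .mode("append")
      .csv("hdfs://172.16.8.162:8020/user/sparkload/assetgroup/prueba-kafka")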