Apache Spark cannot find LoginModule class: org.apache.kafka.common.security.plain.PlainLoginModule

Tags: apache-spark, apache-kafka, spark-structured-streaming, spark-streaming-kafka

Environment: Spark 2.3.0, Scala 2.11.12, Kafka (whatever the latest version is)

I have a secured Kafka system that I am trying to connect my Spark Streaming consumer to. Below is my
build.sbt
file:

name := "kafka-streaming"
version := "1.0"

scalaVersion := "2.11.12"

// still want to be able to run in sbt
// https://github.com/sbt/sbt-assembly#-provided-configuration
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))

fork in run := true
javaOptions in run ++= Seq(
    "-Dlog4j.debug=true",
    "-Dlog4j.configuration=log4j.properties")

assemblyMergeStrategy in assembly := {
    case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat
    case PathList("META-INF", _*) => MergeStrategy.discard
    case _ => MergeStrategy.first
}

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "2.3.0",
    "org.apache.spark" %% "spark-sql" % "2.3.0",
    "org.apache.spark" %% "spark-streaming" % "2.3.0",
    "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.3.0",
    "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.3.0",
    "com.ibm.db2.jcc" % "db2jcc" % "db2jcc4"
)
Note that this is Spark 2.3.0, and I cannot change the Spark version.

Below is the part of the code where I try to connect the Spark Streaming consumer to the secured Kafka:

val df = spark.readStream
    .format("kafka")
    .option("subscribe", "raw_weather")
    .option("kafka.bootstrap.servers", "s")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option("kafka.sasl.jaas.config", "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"user\" password=\"" + password + "\";")
    .option("kafka.ssl.protocol", "TLSv1.2")
    .option("kafka.ssl.enabled.protocols", "TLSv1.2")
    .option("kafka.ssl.endpoint.identification.algorithm", "HTTPS")
    .load()
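Since the sasl.jaas.config value is sensitive to the escaped quotes, here is how I would sanity-check the string in isolation (a sketch: the plainJaasConfig helper and the "user"/"secret" credentials are placeholders, not my real setup):

```scala
// Sketch: building the SASL/PLAIN JAAS configuration string.
// `plainJaasConfig` and the credentials below are placeholders.
def plainJaasConfig(user: String, password: String): String =
  "org.apache.kafka.common.security.plain.PlainLoginModule required " +
    s"""username="$user" password="$password";"""

println(plainJaasConfig("user", "secret"))
// org.apache.kafka.common.security.plain.PlainLoginModule required username="user" password="secret";
```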
When I try to run this, the following error is thrown:

Exception in thread "main" org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:702)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:557)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:540)
    at org.apache.spark.sql.kafka010.SubscribeStrategy.createConsumer(ConsumerStrategy.scala:62)
    at org.apache.spark.sql.kafka010.KafkaOffsetReader.createConsumer(KafkaOffsetReader.scala:314)
    at org.apache.spark.sql.kafka010.KafkaOffsetReader.<init>(KafkaOffsetReader.scala:78)
    at org.apache.spark.sql.kafka010.KafkaSourceProvider.createContinuousReader(KafkaSourceProvider.scala:130)
    at org.apache.spark.sql.kafka010.KafkaSourceProvider.createContinuousReader(KafkaSourceProvider.scala:43)
    at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:185)
    >> at com.ibm.kafkasparkintegration.executables.WeatherDataStream$.getRawDataFrame(WeatherDataStream.scala:74)
    at com.ibm.kafkasparkintegration.executables.WeatherDataStream$.main(WeatherDataStream.scala:24)
    at com.ibm.kafkasparkintegration.executables.WeatherDataStream.main(WeatherDataStream.scala)
Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: unable to find LoginModule class:  org.apache.kafka.common.security.plain.PlainLoginModule
    at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:86)
    at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:70)
    at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:83)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:623)
    ... 11 more
Caused by: javax.security.auth.login.LoginException: unable to find LoginModule class:  org.apache.kafka.common.security.plain.PlainLoginModule
    at javax.security.auth.login.LoginContext.invoke(LoginContext.java:794)
    at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
    at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
    at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
    at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
    at org.apache.kafka.common.security.authenticator.AbstractLogin.login(AbstractLogin.java:69)
    at org.apache.kafka.common.security.authenticator.LoginManager.<init>(LoginManager.java:46)
    at org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:68)
    at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:78)
    ... 14 more
The >> in the error log points to the load() call in the snippet above. I have been trying to resolve this for a couple of days now without much success.
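For completeness, this is roughly how I package and submit the job (a sketch: the jar name follows sbt-assembly's default naming and the --master value is a placeholder, so both are assumptions rather than my exact command):

```shell
# Sketch of the submit command; jar name and --master value are placeholders.
# The assembly jar is expected to bundle kafka-clients, so that
# org.apache.kafka.common.security.plain.PlainLoginModule is on the runtime classpath.
sbt assembly
spark-submit \
  --class com.ibm.kafkasparkintegration.executables.WeatherDataStream \
  --master "local[*]" \
  target/scala-2.11/kafka-streaming-assembly-1.0.jar
```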