Apache Spark: Could not find 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config' is not set


I am trying to connect to Kafka from Spark Structured Streaming.

This works:

spark-shell --master local[1] \
  --files /mypath/jaas_mh.conf \
  --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0 \
  --conf "spark.driver.extraJavaOptions=-Djava.security.auth.login.config=jaas_mh.conf" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=jaas_mh.conf" \
  --num-executors 1 --executor-cores 1
However, when I try to do the same thing programmatically:

object SparkHelper {
  def getAndConfigureSparkSession() = {
    val conf = new SparkConf()
      .setAppName("Structured Streaming from Message Hub to Cassandra")
      .setMaster("local[1]")
      .set("spark.driver.extraJavaOptions", "-Djava.security.auth.login.config=jaas_mh.conf")
      .set("spark.executor.extraJavaOptions", "-Djava.security.auth.login.config=jaas_mh.conf")

    val sc = new SparkContext(conf)
    sc.setLogLevel("WARN")

    getSparkSession()
  }

  def getSparkSession(): SparkSession = {
    val spark = SparkSession
      .builder()
      .getOrCreate()

    spark.sparkContext.addFile("/mypath/jaas_mh.conf")
    return spark
  }
}
I get the error:

 Could not find a 'KafkaClient' entry in the JAAS configuration. 
    System property 'java.security.auth.login.config' is not set

Any pointers?
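For reference, the error means that the driver's Kafka client could not find a login section named `KafkaClient` in whatever JAAS file the JVM loaded. A minimal `jaas_mh.conf` providing that entry might look like the following sketch, assuming SASL/PLAIN authentication (the login module choice and the credential values are placeholders, not taken from the question):

```
KafkaClient {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="<service-username>"
  password="<service-password>";
};
```

The section name `KafkaClient` is what the Kafka client library looks up by default, so it must match exactly.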

Even in the conf, you should provide the full or relative path to the .conf file. Also, I see that when you create the SparkConf you are not applying it to the current SparkSession.

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object Driver extends App {

  val confPath: String = "/Users/arcizon/IdeaProjects/spark/src/main/resources/jaas_mh.conf"

  def getAndConfigureSparkSession(): SparkSession = {
    val conf = new SparkConf()
      .setAppName("Structured Streaming from Message Hub to Cassandra")
      .setMaster("local[1]")
      .set("spark.driver.extraJavaOptions", s"-Djava.security.auth.login.config=$confPath")
      .set("spark.executor.extraJavaOptions", s"-Djava.security.auth.login.config=$confPath")

    getSparkSession(conf)
  }

  def getSparkSession(conf: SparkConf): SparkSession = {
    val spark = SparkSession
      .builder()
      .config(conf)
      .getOrCreate()

    spark.sparkContext.addFile(confPath)

    spark.sparkContext.setLogLevel("WARN")

    spark
  }

  val sparkSession: SparkSession = getAndConfigureSparkSession()

  println(sparkSession.conf.get("spark.driver.extraJavaOptions"))
  println(sparkSession.conf.get("spark.executor.extraJavaOptions"))

  sparkSession.stop()
}


I also had to set it as a system property, since I was running in client mode:

System.setProperty("java.security.auth.login.config", "jaas_mh.conf")
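To make the ordering explicit: in client mode the driver-side Kafka consumer reads `java.security.auth.login.config` when it initializes, so the property has to be set before any Spark or Kafka code runs. A minimal sketch of that setup step (the `JaasSetup` object name is my own, not from the question):

```scala
// Minimal sketch: register the JAAS file with the JVM *before* creating
// the SparkSession, so the driver-side Kafka client can find it.
object JaasSetup {
  def configureJaas(confPath: String): Unit = {
    // Equivalent to passing -Djava.security.auth.login.config=... on the
    // command line, but done programmatically at startup.
    System.setProperty("java.security.auth.login.config", confPath)
  }
}
```

Calling `JaasSetup.configureJaas("/mypath/jaas_mh.conf")` as the first line of `main` (or of the `App` body) mirrors what `spark.driver.extraJavaOptions` does for `spark-shell`.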