
Scala Spark SessionCatalog fails

Tags: scala, apache-spark, apache-spark-sql, azure-eventhub, spark-cassandra-connector

I am batch-reading data from a Cassandra database and also streaming data from Azure EventHubs, using the Scala Spark API.

session.read
  .format("org.apache.spark.sql.cassandra")
  .option("keyspace", keyspace)
  .option("table", table)
  .option("pushdown", pushdown)
  .load()
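The EventHubs streaming read is set up along these lines; this is only a minimal sketch based on the azure-eventhubs-spark connector listed in the dependencies below, with the connection string, hub name and consumer group as placeholders:

import org.apache.spark.eventhubs.{ConnectionStringBuilder, EventHubsConf, EventPosition}

// Placeholder connection details for the Event Hub being read.
val connectionString = ConnectionStringBuilder("<event-hubs-connection-string>")
  .setEventHubName("<event-hub-name>")
  .build

// Start reading from the end of the stream in the given consumer group.
val ehConf = EventHubsConf(connectionString)
  .setConsumerGroup("$Default")
  .setStartingPosition(EventPosition.fromEndOfStream)

// Structured Streaming source backed by EventHubs.
val events = session.readStream
  .format("eventhubs")
  .options(ehConf.toMap)
  .load()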

Everything was working fine, but now I am getting this exception and cannot find where it comes from:

User class threw exception: java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.catalog.SessionCatalog.<init>(Lscala/Function0;Lscala/Function0;Lorg/apache/spark/sql/catalyst/analysis/FunctionRegistry;Lorg/apache/spark/sql/internal/SQLConf;Lorg/apache/hadoop/conf/Configuration;Lorg/apache/spark/sql/catalyst/parser/ParserInterface;Lorg/apache/spark/sql/catalyst/catalog/FunctionResourceLoader;)V
at org.apache.spark.sql.internal.BaseSessionStateBuilder.catalog$lzycompute(BaseSessionStateBuilder.scala:132)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.catalog(BaseSessionStateBuilder.scala:131)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anon$1.<init>(BaseSessionStateBuilder.scala:157)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.analyzer(BaseSessionStateBuilder.scala:157)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:428)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:233)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)

Does anyone have a clue?

Regarding the java.lang.NoSuchMethodError on org.apache.spark.sql.catalyst.catalog.SessionCatalog: I still don't understand why, but re-installing the spark2 client has fixed the problem. I was running Spark 2.4.0 jobs on a 2.3.0 cluster, passing the libraries to the --jars argument of the spark-submit CLI.
The constructor that cannot be found, with its descriptor broken out for readability:

   java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.catalog.SessionCatalog.<init>(
   Lscala/Function0;
   Lscala/Function0;
   Lorg/apache/spark/sql/catalyst/analysis/FunctionRegistry;
   Lorg/apache/spark/sql/internal/SQLConf;
   Lorg/apache/hadoop/conf/Configuration;
   Lorg/apache/spark/sql/catalyst/parser/ParserInterface;
   Lorg/apache/spark/sql/catalyst/catalog/FunctionResourceLoader;)V
The build.sbt for the job:

ThisBuild / scalaVersion := "2.11.11"
val sparkVersion = "2.4.0"

libraryDependencies ++= Seq(
  "org.apache.logging.log4j" % "log4j-core" % "2.11.1",
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion  % "provided",
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-catalyst" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  "com.microsoft.azure" % "azure-eventhubs-spark_2.11" % "2.3.10",
  "com.microsoft.azure" % "azure-eventhubs" % "2.3.0",
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.1",
  "org.scala-lang.modules" %% "scala-java8-compat" % "0.9.0",
  "com.twitter" % "jsr166e" % "1.1.0",
  "com.holdenkarau" %% "spark-testing-base" % "2.4.0_0.12.0" % Test,
  "MrPowers" % "spark-fast-tests" % "0.19.2-s_2.11" % Test
)
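The <init> descriptor in the error takes two scala.Function0 builders, which corresponds to the SessionCatalog constructor introduced in Spark 2.4, so the failure is consistent with 2.3.x catalyst classes being picked up at runtime instead of the 2.4.0 ones. A quick way to check which Spark version and which jar a job is actually loading is a sketch like the following; it assumes a SparkSession named session, as in the code above:

import org.apache.spark.sql.catalyst.catalog.SessionCatalog

// Spark version the driver is actually running.
println(s"Spark version at runtime: ${session.version}")

// Jar that SessionCatalog was loaded from (getCodeSource may be null for
// bootstrap-loaded classes, hence the Option).
val codeSource = Option(classOf[SessionCatalog].getProtectionDomain.getCodeSource)
println(s"SessionCatalog loaded from: ${codeSource.map(_.getLocation).getOrElse("<unknown>")}")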