Scala: spark-submit fails with java.lang.NoSuchMethodError: com.couchbase.spark.streaming.Mutation.key()

I have the following Scala code, which I compile and run with sbt. sbt run works fine:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{StreamingContext, Seconds}
import com.couchbase.spark.streaming._


object StreamingExample {

  def main(args: Array[String]): Unit = {

    // Create the Spark Config and instruct to use the travel-sample bucket
    // with no password.
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("StreamingExample")
      .set("com.couchbase.bucket.travel-sample", "")

    // Initialize StreamingContext with a Batch interval of 5 seconds
    val ssc = new StreamingContext(conf, Seconds(5))

    // Consume the DCP Stream from the beginning and never stop.
    // This counts the messages per interval and prints their count.
    ssc
      .couchbaseStream(from = FromBeginning, to = ToInfinity)
        .foreachRDD(rdd => {
          rdd.foreach(message => {
            if (message.isInstanceOf[Mutation]) {
              val document = message.asInstanceOf[Mutation].key.map(_.toChar).mkString
              println("mutated: " + document)
            } else if (message.isInstanceOf[Deletion]) {
              val document = message.asInstanceOf[Deletion].key.map(_.toChar).mkString
              println("deleted: " + document)
            }
          })
        })

    // Start the Stream and await termination
    ssc.start()
    ssc.awaitTermination()
  }
}
But when run as a Spark job, it fails. The job is submitted like this:

spark-submit --class StreamingExample --master local[*] target/scala-2.11/spark-samples_2.11-1.0.jar

The error is: java.lang.NoSuchMethodError: com.couchbase.spark.streaming.Mutation.key()

Below is my build.sbt:

lazy val root = (project in file(".")).
  settings(
    name := "spark-samples",
    version := "1.0",
    scalaVersion := "2.11.12",
    mainClass in Compile := Some("StreamingExample")        
  )

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0",
  "org.apache.spark" %% "spark-streaming" % "2.4.0",
  "org.apache.spark" %% "spark-sql" % "2.4.0",
  "com.couchbase.client" %% "spark-connector" % "2.2.0"
)

// META-INF discarding
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
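
Note that the assembly and assemblyMergeStrategy keys come from the sbt-assembly plugin, so the build also needs that plugin enabled. A minimal project/plugins.sbt, assuming an sbt-assembly 0.14.x release, might look like:

// project/plugins.sbt: enables the assembly task used above
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.9")

With the plugin in place, sbt assembly produces a fat jar (by default named something like target/scala-2.11/spark-samples-assembly-1.0.jar) that bundles the connector, and that is the jar to hand to spark-submit.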
The Spark version installed on my machine is 2.4.0, built with Scala 2.11.12.

Observations:

I do not see com.couchbase.client_spark-connector_2.11-2.2.0 in the Spark jars directory (/usr/local/Cellar/apache-spark/2.4.0/libexec/jars), but an older com.couchbase.client_spark-connector_2.10-1.2.0.jar is present.

Why does the job fail under spark-submit? How does sbt make it work, and where does it download the dependencies?
Make sure that the Scala version and the spark-connector library version used by sbt are the same as those of your Spark installation.


I ran into a similar problem when trying to run a sample Flink job on my system; it was caused by a version mismatch.
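
As a sketch of one way to align the versions, assuming the Spark 2.4.0 / Scala 2.11 installation described in the question: mark the Spark artifacts as Provided so they come from the installation at runtime, and let only the connector be bundled into the assembly jar:

// build.sbt (sketch): Spark itself is supplied by the installation,
// so only the Couchbase connector ends up in the assembly jar.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "2.4.0" % Provided,
  "org.apache.spark" %% "spark-streaming" % "2.4.0" % Provided,
  "org.apache.spark" %% "spark-sql"       % "2.4.0" % Provided,
  "com.couchbase.client" %% "spark-connector" % "2.2.0"
)

Alternatively, spark-submit can resolve the connector itself with --packages com.couchbase.client:spark-connector_2.11:2.2.0, in which case the connector does not need to be in the jar at all.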

How can I check whether the spark-connector library version used by sbt is the same as the one used by the Spark installation?
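
A few commands that make the comparison visible (a sketch, using the Homebrew install path from the question):

# Spark and Scala versions of the installation
spark-submit --version

# Couchbase connector jars shipped with the installation, if any
ls /usr/local/Cellar/apache-spark/2.4.0/libexec/jars | grep -i couchbase

# Versions the project declares; sbt resolves them into its local cache (typically ~/.ivy2/cache)
sbt "show libraryDependencies"

The Scala suffix in the connector's artifact name (_2.10 vs _2.11) must match the Scala version reported by spark-submit --version.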