Error when integrating Apache Spark with the Spark Cassandra Connector
I am trying to save data from Spark into Cassandra in standalone mode by running the following command:
bin/spark-submit --packages datastax:spark-cassandra-connector:1.6.0-s_2.10
--class "pl.japila.spark.SparkMeApp" --master local /home/hduser2/code14/target/scala-2.10/simple-project_2.10-1.0.jar
My build.sbt file is:
name := "Simple Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.6.0"
resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"
libraryDependencies += "datastax" % "spark-cassandra-connector" % "1.6.0-s_2.10"
libraryDependencies ++= Seq(
  "org.apache.cassandra" % "cassandra-thrift" % "3.5",
  "org.apache.cassandra" % "cassandra-clientutil" % "3.5",
  "com.datastax.cassandra" % "cassandra-driver-core" % "3.0.0"
)
My Spark code is:
package pl.japila.spark

import org.apache.spark.{SparkContext, SparkConf}
import org.apache.spark.sql._
import com.datastax.spark.connector._
import com.datastax.spark.connector.cql._
import com.datastax.spark.connector.rdd._
import com.datastax.driver.core._

object SparkMeApp {
  def main(args: Array[String]) {
    val conf = new SparkConf(true).set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext("local", "test", conf)
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    val rdd = sc.cassandraTable("test", "kv")
    val collection = sc.parallelize(Seq(("cat", 30), ("fox", 40)))
    collection.saveToCassandra("test", "kv", SomeColumns("key", "value"))
  }
}
I got this error:
Exception in thread "main" java.lang.NoSuchMethodError: com.datastax.driver.core.QueryOptions.setRefreshNodeIntervalMillis(I)Lcom/datastax/driver/core/QueryOptions;
    at com.datastax.spark.connector.cql.DefaultConnectionFactory$.clusterBuilder(CassandraConnectionFactory.scala:49)
    at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:92)
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:153)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:148)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:148)
    at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
    at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
    at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
The versions used are:

Spark 1.6.0
Scala 2.10.4
cassandra-driver-core jar 3.0.0
Cassandra 2.2.7
Spark Cassandra Connector 1.6.0-s_2.10
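A NoSuchMethodError like this usually means that the cassandra-driver-core jar actually loaded at runtime is a different (typically older) version than the one the connector was compiled against, so the method simply does not exist in the loaded class. As a quick diagnostic, a small standalone sketch like the following (a helper of my own, not part of the connector) can report which driver jar ends up on the classpath:

```scala
object DriverVersionCheck {
  // Look up the DataStax Java driver reflectively, so this compiles
  // and runs even when the driver jar is absent from the classpath.
  def driverInfo(): String =
    try {
      val cls = Class.forName("com.datastax.driver.core.Cluster")
      val version = Option(cls.getPackage.getImplementationVersion).getOrElse("unknown")
      val location = Option(cls.getProtectionDomain.getCodeSource)
        .map(_.getLocation.toString).getOrElse("unknown location")
      s"cassandra-driver-core $version loaded from $location"
    } catch {
      case _: ClassNotFoundException => "cassandra-driver-core not on classpath"
    }

  def main(args: Array[String]): Unit =
    println(driverInfo())
}
```

Submitting this with the same spark-submit flags as the application shows exactly which driver jar wins; if it reports a different version than the 3.0.x the connector expects, a conflicting jar is shadowing the right one.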
Can someone please help?

I would start by removing
libraryDependencies ++= Seq(
  "org.apache.cassandra" % "cassandra-thrift" % "3.5",
  "org.apache.cassandra" % "cassandra-clientutil" % "3.5",
  "com.datastax.cassandra" % "cassandra-driver-core" % "3.0.0"
)
since the libraries the connector depends on will automatically be included as package dependencies.
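With those lines removed, a minimal build.sbt (keeping the versions from the question) would look something like this sketch:

```scala
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"

// Spark itself is provided by spark-submit at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.6.0"

// The connector pulls in a matching cassandra-driver-core transitively,
// so no explicit driver, cassandra-thrift, or clientutil entries are needed
libraryDependencies += "datastax" % "spark-cassandra-connector" % "1.6.0-s_2.10"
```

Pinning the driver yourself only makes sense when you have verified, against the connector's compatibility table, that the pinned version matches what the connector expects.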
Then, I would launch a spark-shell using
./bin/spark-shell --packages datastax:spark-cassandra-connector:1.6.0-s_2.10
and check that the dependencies resolve correctly, like this:
datastax#spark-cassandra-connector added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found datastax#spark-cassandra-connector;1.6.0-s_2.10 in spark-packages
found org.apache.cassandra#cassandra-clientutil;3.0.2 in list
found com.datastax.cassandra#cassandra-driver-core;3.0.0 in list
...
[2.10.5] org.scala-lang#scala-reflect;2.10.5
:: resolution report :: resolve 627ms :: artifacts dl 10ms
:: modules in use:
com.datastax.cassandra#cassandra-driver-core;3.0.0 from list in [default]
com.google.guava#guava;16.0.1 from list in [default]
com.twitter#jsr166e;1.1.0 from list in [default]
datastax#spark-cassandra-connector;1.6.0-s_2.10 from spark-packages in [default]
...
If those look like they resolve fine but it still does not work, I would try clearing the cache for these artifacts.

Thanks a lot Russ, my problem is solved. It was a cache issue.

Would you mind explaining what kind of cache this involves and how you cleared it? I am running into the same problem.

For sbt it is ~/.ivy2/cache/, for Maven it is ~/.m2/repository/. You can delete the contents of those directories, or just the elements it says it cannot find. Usually the directories exist but some of the files in them are missing.

Thanks, my problem was that the wrong version of the Cassandra Java driver was being used; I solved it by following this matrix.
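For completeness, the cache locations mentioned in the comments can be located programmatically. This small sketch (my own helper, not part of sbt or Maven) just reports whether the usual Ivy and Maven cache roots exist, so you know where to delete stale connector artifacts before re-running spark-submit:

```scala
import java.nio.file.{Files, Path, Paths}

object DependencyCaches {
  // Standard cache roots used by sbt / spark-submit --packages (Ivy) and by Maven
  def cacheRoots(home: String): Seq[Path] = Seq(
    Paths.get(home, ".ivy2", "cache"),
    Paths.get(home, ".m2", "repository")
  )

  def main(args: Array[String]): Unit =
    cacheRoots(System.getProperty("user.home")).foreach { p =>
      println(s"$p exists: ${Files.exists(p)}")
      // To force a clean re-download of the connector, remove its subdirectory
      // (e.g. the datastax/spark-cassandra-connector folder under the Ivy cache)
      // and re-run spark-submit with --packages.
    }
}
```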