
Scala: sbt unresolved dependency for spark-cassandra-connector 2.0.2


build.sbt:

val sparkVersion = "2.1.1";

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided";

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2";
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion;
Output:

[error] (myproject/*:update) sbt.ResolveException: unresolved dependency: com.datastax.spark#spark-cassandra-connector;2.0.2: not found
Any ideas? I'm new to sbt and Spark. Thanks.

This is caused by "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2": with a single %, the artifact id carries no Scala version suffix, and that artifact does not exist; see the Maven repo.

There are two solutions:

1. "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.2", which sets the Scala version explicitly in the artifact id of the dependency.
2. "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2", which uses %% with the artifact id so that sbt automatically expands it to solution 1 based on the project's Scala version.

Both forms are shown in the sketch after this list.
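A minimal build.sbt sketch of both forms, assuming the project targets Scala 2.11 (which matches the _2.11 kafka artifact already in the question's build):

// Solution 1: spell out the Scala version in the artifact id
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.2"

// Solution 2: let sbt append the suffix from the project's scalaVersion
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"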
Thanks. I changed it to libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"; libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion; and now the error is "Conflicting cross-version suffixes in: org.apache.spark:spark-tags". Any ideas?

@BAE Have you tried the first approach Chegpohi suggested?
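That error typically means artifacts with different Scala suffixes ended up on the classpath, for example _2.10 artifacts pulled in by %% under a 2.10 scalaVersion alongside the hardcoded _2.11 kafka artifact. A sketch of one way to keep the suffixes consistent; the scalaVersion value here is an assumption, not something stated in the original build:

scalaVersion := "2.11.11"  // assumption: pin the project to Scala 2.11 so %% resolves to _2.11 artifacts

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion  // _2.11 now matches the %% artifacts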