sbt unresolved dependency for Spark Cassandra Connector 2.0.2 (Scala)
build.sbt:
val sparkVersion = "2.1.1"

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion
Output:
[error] (myproject/*:update) sbt.ResolveException: unresolved dependency: com.datastax.spark#spark-cassandra-connector;2.0.2: not found
Any ideas? I'm new to sbt and Spark. Thanks.

This is caused by `"com.datastax.spark" % "spark-cassandra-connector" % "2.0.2"`: with a single `%`, sbt looks for an artifact id without a Scala version suffix, and no such artifact exists in the Maven repo.
There are two solutions:
1. `"com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.2"` — set the Scala version explicitly in the artifact id of the dependency.
2. `"com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"` — use `%%` with the artifact id, so sbt automatically expands it to solution 1 based on the project's Scala version.
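The effect of `%%` in solution 2 can be sketched in plain Scala. This is only an illustration of the naming convention, not sbt's actual API; `scalaBinaryVersion` and `crossArtifact` are hypothetical names chosen for the example.

```scala
// Minimal sketch of what sbt's %% operator does: it appends the project's
// Scala binary version to the artifact id before resolving the dependency.
// (Illustrative only; these names are not part of the sbt API.)
object CrossVersionDemo {
  val scalaBinaryVersion = "2.11"

  def crossArtifact(artifactId: String): String =
    s"${artifactId}_$scalaBinaryVersion"

  def main(args: Array[String]): Unit = {
    // "com.datastax.spark" %% "spark-cassandra-connector" resolves the
    // same artifact as writing "spark-cassandra-connector_2.11" by hand.
    println(crossArtifact("spark-cassandra-connector"))
  }
}
```

So `%%` is just shorthand that keeps the artifact's `_2.11`/`_2.12` suffix in sync with the project's `scalaVersion`.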
Thanks. I changed it to `libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"` and `libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion`. Now the error is `Conflicting cross-version suffixes in: org.apache.spark:spark-tags`. Any ideas? — @BAE Have you tried the first approach chegpohi suggested?
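The "conflicting cross-version suffixes" error in the comment above typically appears when one dependency hard-codes a `_2.11` suffix while others are cross-built with `%%`, so sbt sees two different Scala suffixes in the same graph. A hedged sketch of a build.sbt that avoids this by using `%%` consistently (the `scalaVersion` value here is an assumption; pick the 2.11.x release your project actually uses):

```scala
// Sketch: every Scala dependency uses %%, so all artifacts resolve with
// the same _2.11 suffix and no cross-version conflict arises.
scalaVersion := "2.11.11"  // assumed; must be a 2.11.x release for these artifacts

val sparkVersion = "2.1.1"

libraryDependencies ++= Seq(
  "org.apache.spark"   %% "spark-core"                % sparkVersion % "provided",
  "org.apache.spark"   %% "spark-sql"                 % sparkVersion % "provided",
  "org.apache.spark"   %% "spark-streaming"           % sparkVersion % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2",
  // %% here too, instead of hard-coding spark-streaming-kafka-0-10_2.11
  "org.apache.spark"   %% "spark-streaming-kafka-0-10" % sparkVersion
)
```

The design point is simply that mixing `"name_2.11"` spelled out by hand with `%%`-expanded artifacts invites suffix mismatches; letting sbt expand all of them keeps the graph consistent.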