Apache Spark with Cassandra: sbt package error on spark-cassandra left join


The spark-cassandra-connector function leftJoinWithCassandraTable works fine in spark-shell, but when I package the job with sbt I get the error mentioned below.

My Scala code snippet:

import com.datastax.spark.connector._   // provides SomeColumns and leftJoinWithCassandraTable

case class time_struct(id: String, month: Int, day: Int, platform: String,
                       type_1: String, title: String, time: Long)

val rdd = time_data.mapPartitions(data =>
  data.map(row =>
    time_struct(row.getString(0), row.getInt(1), row.getInt(2), row.getString(3),
                row.getString(4), row.getString(5), row.getDouble(6).toLong)))

val join = rdd.leftJoinWithCassandraTable("ks1", "tbl1",
  SomeColumns("time"), SomeColumns("id"))
$ sbt package

[error] Test.scala:187: could not find implicit value for parameter rwf: com.datastax.spark.connector.writer.RowWriterFactory[time_struct]
[error]     val join = rdd.leftJoinWithCassandraTable("ks1", "tbl1",SomeColumns("time" ),
[error]                                              ^
build.sbt

scalaVersion := "2.11.8"

scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8")

libraryDependencies ++= {
  val sparkV = "2.1.0"

  Seq(
    "org.apache.spark" %% "spark-core" % sparkV % "provided",
    "org.apache.spark" %% "spark-sql" % sparkV % "provided",

    "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.0-RC1",
    "com.databricks" %% "spark-csv" % "1.5.0"
)
}

libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.1"
Environment:

Scala 2.11.8

Spark 2.1.0

sbt 0.13.13

My mistake: I had declared the case class inside the main function. After I moved the case class definition out of main, it compiled successfully.
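
A minimal sketch of the working layout, assuming a hypothetical JoinJob object and an in-memory stand-in for the time_data RDD: with the case class at the top level of the file, the connector can resolve the implicit RowWriterFactory[time_struct] that the compile error complained about.

import com.datastax.spark.connector._
import org.apache.spark.{SparkConf, SparkContext}

// Top-level case class (not inside main), so the connector's implicit
// RowWriterFactory[time_struct] can be found at compile time.
case class time_struct(id: String, month: Int, day: Int, platform: String,
                       type_1: String, title: String, time: Long)

object JoinJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("left-join-example")
      .set("spark.cassandra.connection.host", "127.0.0.1")  // assumed Cassandra host
    val sc = new SparkContext(conf)

    // Hypothetical stand-in for the time_data RDD from the question.
    val timeData = sc.parallelize(Seq(
      time_struct("a", 1, 2, "web", "t1", "title", 100L)))

    val join = timeData.leftJoinWithCassandraTable("ks1", "tbl1",
      SomeColumns("time"),   // columns selected from the Cassandra table
      SomeColumns("id"))     // join key columns
    join.collect().foreach(println)
  }
}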