Inserting a timestamp into Cassandra using Spark Scala

I am trying to read a file containing names and insert each name, together with a timestamp, into a Cassandra table using Spark and Scala. Below is my code:

import com.datastax.spark.connector._
import org.apache.spark.{SparkConf, SparkContext}
import org.joda.time.DateTime

case class Names(name: String, auditDate: DateTime)

def main(args: Array[String]): Unit = {
    System.setProperty("hadoop.home.dir", "D:\\backup\\lib\\winutils")
    val conf = new SparkConf()
      .set("spark.cassandra.connection.host", "172.16.109.202")
      .setAppName("CassandraLoader")
      .setMaster("local")
    val context = new SparkContext(conf)

    // Each line of the file is one name.
    val namesFile = context.textFile("src/main/resources/names.txt")

    namesFile.map(x => Names(x, DateTime.now()))
      .saveToCassandra("practice", "names", SomeColumns("name", "insert_date"))

  }
The details of the Cassandra table are as follows:

CREATE TABLE practice.names (
    name text PRIMARY KEY,
    insert_date timestamp
)
When I try to execute the code, I get the following error:

Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Columns not found in com.sample.practice.Names: [insert_date]
    at scala.Predef$.require(Predef.scala:233)
    at com.datastax.spark.connector.mapper.DefaultColumnMapper.columnMapForWriting(DefaultColumnMapper.scala:108)
    at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.<init>(MappedToGettableDataConverter.scala:29)
    at com.datastax.spark.connector.writer.MappedToGettableDataConverter$.apply(MappedToGettableDataConverter.scala:20)
    at com.datastax.spark.connector.writer.DefaultRowWriter.<init>(DefaultRowWriter.scala:17)
    at com.datastax.spark.connector.writer.DefaultRowWriter$$anon$1.rowWriter(DefaultRowWriter.scala:31)
    at com.datastax.spark.connector.writer.DefaultRowWriter$$anon$1.rowWriter(DefaultRowWriter.scala:29)
    at com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:271)
    at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
    at com.sample.practice.CqlInsertDate$.main(CqlInsertDate.scala:30)
    at com.sample.practice.CqlInsertDate.main(CqlInsertDate.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Below are my SBT file details:

version := "1.0"

scalaVersion := "2.10.6"

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.10" % "2.0.0-M3"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.2"

libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "2.0.2"

libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "2.0.2"

I am using Cassandra 2.1. Please help. Thanks in advance.

Try changing the class field to insertDate, or changing the table column to audit_date.

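A minimal sketch of the first option, renaming the field so that the connector's default camelCase-to-snake_case mapping resolves it to the insert_date column, reusing the namesFile RDD from the question:

// insertDate now maps to the "insert_date" column via the
// default camelCase -> snake_case convention.
case class Names(name: String, insertDate: DateTime)

namesFile.map(x => Names(x, DateTime.now()))
  .saveToCassandra("practice", "names", SomeColumns("name", "insert_date"))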

@FaigB, what is the logic here? I am facing the same issue; what should I do? What is the trick here?
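For what it's worth, the logic lives in DefaultColumnMapper, which you can see in the stack trace: it matches case class fields to table columns by converting between camelCase and snake_case, so auditDate is looked up as a column named audit_date, which does not exist in the table. If you would rather keep the field name auditDate, the connector's mapper documentation also describes aliasing a column to a differently named field when saving; a sketch assuming that "column as field" alias syntax (it requires import com.datastax.spark.connector._, which the code above already has):

namesFile.map(x => Names(x, DateTime.now()))
  .saveToCassandra("practice", "names",
    // Alias: write the auditDate field into the "insert_date" column.
    SomeColumns("name", "insert_date" as "auditDate"))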