Converting BIGINT to timestamp in Scala


When casting a BIGINT to TIMESTAMP, a garbage value comes out. See the query below. Thanks for your help.

scala> spark.sql("select cast(cast(cast(CAST('2015-11-15 18:15:06.51' AS TIMESTAMP) as double)*1000 + cast('64082' as double) as bigint) as timestamp) " ).show(truncate=false)
+-----------------------------------------------------------------------------------------------------------------------------------------------+
|CAST(CAST(((CAST(CAST(2015-11-15 18:15:06.51 AS TIMESTAMP) AS DOUBLE) * CAST(1000 AS DOUBLE)) + CAST(64082 AS DOUBLE)) AS BIGINT) AS TIMESTAMP)|
+-----------------------------------------------------------------------------------------------------------------------------------------------+
|47843-07-20 09:36:32.0                                                                                                                         |
+-----------------------------------------------------------------------------------------------------------------------------------------------+
Using Spark 1.6

Casting a TIMESTAMP to DOUBLE converts it to seconds since 1970-01-01. Casting a BIGINT to TIMESTAMP interprets the value as seconds since 1970-01-01. Your example suggests you expected the BIGINT-to-TIMESTAMP cast to convert from milliseconds since 1970-01-01, but that is not the case, so you end up with a garbage value.
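A minimal sketch of the arithmetic, using the same literal as the query above (the exact epoch value depends on the session time zone; the comments assume UTC):

scala> // TIMESTAMP -> DOUBLE gives fractional seconds since the epoch
scala> spark.sql("select cast(CAST('2015-11-15 18:15:06.51' AS TIMESTAMP) as double)").show(truncate=false)
// ~1.44761130651E9 in UTC; 1447611306.51 * 1000 + 64082 = 1447611370592,
// which the final BIGINT -> TIMESTAMP cast then reads as *seconds*:
// roughly 45,873 years after 1970, i.e. the year 47843 seen in the output above.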


Note that, per this ticket, the behavior is actually configurable:

Casting a bigint to timestamp expects the epoch time in seconds. Try it without the multiplication by 1000:

select cast(cast(cast(CAST('2015-11-15 18:15:06.51' AS TIMESTAMP) as double) + cast('64082' as double) as bigint) as timestamp)

although this loses the millisecond precision.
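If the milliseconds matter, one possible workaround (a sketch, relying on the fact that casting a fractional DOUBLE to TIMESTAMP interprets it as seconds-with-fraction since the epoch) is to do the addition in milliseconds and scale back down before the final cast:

select cast((cast(CAST('2015-11-15 18:15:06.51' AS TIMESTAMP) as double) * 1000 + cast('64082' as double)) / 1000 as timestamp)

which should yield 2015-11-15 18:16:10.592, i.e. the original timestamp plus 64.082 seconds.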

Why is the result a garbage value? What result did you expect? More importantly, what have you done, via a very simple code example, to show that the expected result is actually a reasonable expectation?