
Scala Spark: converting a double column to a date-time column in a DataFrame


I am trying to write code to convert the date-time columns date and last_updated_date, which are actually unix times stored as doubles, into "mm-dd-yyyy" format for display. How can I do this?

import org.joda.time._
import scala.tools._
import org.joda.time.format.DateTimeFormat._
import java.text.SimpleDateFormat
import org.apache.spark.sql.functions.{unix_timestamp, to_date}
root
 |-- date: double (nullable = false)
 |-- last_updated_date: double (nullable = false)
 |-- Percent_Used: double (nullable = false)

+------------+---------------------+------------+
|        date|    last_updated_date|Percent_Used|
+------------+---------------------+------------+
| 1.453923E12|        1.47080394E12| 1.948327124|
|1.4539233E12|        1.47080394E12| 2.019636442|
|1.4539236E12|        1.47080394E12| 1.995299371|
+------------+---------------------+------------+
Cast it to a timestamp:

df.select(col("date").cast("timestamp"))
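Note that casting a double to a timestamp interprets the value as seconds since the epoch, while the sample values in the table above are around 1.45e12 — three orders of magnitude too large for a 2016 date in seconds. A quick plain-Python order-of-magnitude check (using the first value from the sample output above):

```python
# unix time in seconds for early 2016 is ~1.45e9, but the column holds ~1.45e12,
# i.e. a factor of 1000 larger -- the values are milliseconds, not seconds
ms_value = 1.453923e12
seconds_value = ms_value / 1000.0
print(seconds_value)  # 1453923000.0
```

So the column should be divided by 1000 before casting, otherwise the resulting timestamps land tens of thousands of years in the future.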


This answer was useful for me, give it a try. It is really a calculation in seconds:

import datetime
serial = 43822.597222222
seconds = (serial - 25569) * 86400.0
print(datetime.datetime.utcfromtimestamp(seconds))
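For reference, the magic numbers in that snippet come from Excel's date system, not from the millisecond data in the question: 25569 is the number of days between Excel's day zero (1899-12-30) and the unix epoch, and 86400 is the number of seconds in a day. A quick sanity check of the offset:

```python
import datetime

# days from Excel's day zero (1899-12-30) to the unix epoch (1970-01-01)
offset_days = (datetime.date(1970, 1, 1) - datetime.date(1899, 12, 30)).days
print(offset_days)  # 25569
```

So this approach only applies if the column holds Excel serial dates; the asker's columns hold unix milliseconds, which need a different conversion.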


Hmm, I think there is a slight problem with this: because the dates are "long" (millisecond) values, the dates come out wrong when you do this directly. Did you figure it out? If so, could you help me with an answer?
df.select(from_unixtime(col("date") / 1000).as("date"))  // from_unixtime expects seconds, so divide the millisecond values by 1000
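The divide-by-1000 fix can be sanity-checked outside Spark. This plain-Python sketch takes the first `date` value from the sample table above and formats it as mm-dd-yyyy:

```python
import datetime

ms = 1.453923e12  # first value of the "date" column in the sample output
# interpret the double as milliseconds: divide by 1000 to get unix seconds
dt = datetime.datetime.utcfromtimestamp(ms / 1000.0)
print(dt.strftime("%m-%d-%Y"))  # 01-27-2016
```

The result is a plausible 2016 date, consistent with the last_updated_date values (~1.47e12 ms, i.e. mid-2016).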