PySpark: convert a timestamp from AWS Glue when transferring data to Redshift


I have a file in S3 that we are importing into Redshift using Glue. The crawler part is done.

One of the columns holds datetime data, but the format is non-standard, so the crawler could not recognize it and marked it as string.

I have created the table in Redshift with that column's data type set to timestamp. Now, when creating the job, what do I need to change in the script, and where, so that the string is converted to a Redshift timestamp?

The date format in the S3 file is 'yyyy.mm.dd HH:mi:ss'.

Below is the script:

import sys
from awsglue.transforms import *
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job

## @params: [TempDir, JOB_NAME]
args = getResolvedOptions(sys.argv, ['TempDir','JOB_NAME'])

sc = SparkContext()
glueContext = GlueContext(sc)
spark = glueContext.spark_session
job = Job(glueContext)
job.init(args['JOB_NAME'], args)
## @type: DataSource
## @args: [database = "", table_name = "", transformation_ctx = "datasource0"]
## @return: datasource0
## @inputs: []
datasource0 = glueContext.create_dynamic_frame.from_catalog(database = "", table_name = "", transformation_ctx = "datasource0")
## @type: ApplyMapping
## @args: [mapping = [("mrp", "long", "mrp", "decimal(10,2)"), ("mop", "double", "mop", "decimal(10,2)"), ("mop_update_timestamp", "string", "mop_update_timestamp", "timestamp"), ("special_price", "long", "special_price", "decimal(10,2)"), ("promotion_identifier", "string", "promotion_identifier", "string"), ("is_percentage_promotion", "string", "is_percentage_promotion", "string"), ("promotion_value", "string", "promotion_value", "decimal(10,2)"), ("max_discount", "long", "max_discount", "decimal(10,2)"), ("promotion_start_date", "string", "promotion_start_date", "timestamp"), ("promotion_end_date", "string", "promotion_end_date", "timestamp")], transformation_ctx = "applymapping1"]
## @return: applymapping1
## @inputs: [frame = datasource0]
applymapping1 = ApplyMapping.apply(frame = datasource0, mappings = [ ("mrp", "long", "mrp", "decimal(10,2)"), ("mop", "double", "mop", "decimal(10,2)"), ("mop_update_timestamp", "string", "mop_update_timestamp", "timestamp"), ("special_price", "long", "special_price", "decimal(10,2)"), ("promotion_identifier", "string", "promotion_identifier", "string"), ("is_percentage_promotion", "string", "is_percentage_promotion", "string"), ("promotion_value", "string", "promotion_value", "decimal(10,2)"), ("max_discount", "long", "max_discount", "decimal(10,2)"), ("promotion_start_date", "string", "promotion_start_date", "timestamp"), ("promotion_end_date", "string", "promotion_end_date", "timestamp")], transformation_ctx = "applymapping1")
## @type: ResolveChoice
## @args: [choice = "make_cols", transformation_ctx = "resolvechoice2"]
## @return: resolvechoice2
## @inputs: [frame = applymapping1]
resolvechoice2 = ResolveChoice.apply(frame = applymapping1, choice = "make_cols", transformation_ctx = "resolvechoice2")
## @type: DropNullFields
## @args: [transformation_ctx = "dropnullfields3"]
## @return: dropnullfields3
## @inputs: [frame = resolvechoice2]
dropnullfields3 = DropNullFields.apply(frame = resolvechoice2, transformation_ctx = "dropnullfields3")
## @type: DataSink
## @args: [catalog_connection = "", connection_options = {"dbtable": "", "database": ""}, redshift_tmp_dir = TempDir, transformation_ctx = "datasink4"]
## @return: datasink4
## @inputs: [frame = dropnullfields3]
datasink4 = glueContext.write_dynamic_frame.from_jdbc_conf(frame = dropnullfields3, catalog_connection = "", connection_options = {"dbtable": "", "database": ""}, redshift_tmp_dir = args["TempDir"], transformation_ctx = "datasink4")
job.commit()

Have you tried converting it to a DataFrame and then casting the column to a timestamp, since it is in the 'yyyy.mm.dd HH:mi:ss' format? Something like this:

## Add this in order to use DynamicFrame.fromDF
from awsglue.dynamicframe import DynamicFrame

## Make a dataframe 
df_datasource0 = datasource0.toDF()

## add a column mop_update_timestamp_ts where you cast mop_update_timestamp to a timestamp
df_datasource0 = df_datasource0.withColumn('mop_update_timestamp_ts',df_datasource0.mop_update_timestamp.cast('timestamp'))

## Transform the dataframe back to a dynamic frame again 
datasource0 = DynamicFrame.fromDF(df_datasource0, glueContext, "datasource0") 

## Use the mop_update_timestamp_ts column instead as below. 
applymapping1 = ApplyMapping.apply(frame = datasource0, mappings = [ ("mrp", "long", "mrp", "decimal(10,2)"), ("mop", "double", "mop", "decimal(10,2)"), ("mop_update_timestamp_ts", "timestamp", "mop_update_timestamp", "timestamp"), ("special_price", "long", "special_price", "decimal(10,2)"), ("promotion_identifier", "string", "promotion_identifier", "string"), ("is_percentage_promotion", "string", "is_percentage_promotion", "string"), ("promotion_value", "string", "promotion_value", "decimal(10,2)"), ("max_discount", "long", "max_discount", "decimal(10,2)"), ("promotion_start_date", "string", "promotion_start_date", "timestamp"), ("promotion_end_date", "string", "promotion_end_date", "timestamp")], transformation_ctx = "applymapping1")

Let me know if it works for you.
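One caveat: a plain cast('timestamp') only recognizes Spark's default formats (e.g. 'yyyy-MM-dd HH:mm:ss'), so for dotted values like '2019.03.07 14:25:00' it will usually produce null. A minimal alternative sketch, assuming the source column really is in 'yyyy.mm.dd HH:mi:ss' form, is to parse it with an explicit pattern via to_timestamp before going back to a DynamicFrame:

## Sketch (assumption): parse the dotted format with an explicit pattern
## instead of a plain cast, which returns null for non-default formats.
from pyspark.sql.functions import to_timestamp
from awsglue.dynamicframe import DynamicFrame

df_datasource0 = datasource0.toDF()

## 'yyyy.MM.dd HH:mm:ss' is the Spark pattern equivalent of 'yyyy.mm.dd HH:mi:ss'
df_datasource0 = df_datasource0.withColumn(
    'mop_update_timestamp_ts',
    to_timestamp(df_datasource0.mop_update_timestamp, 'yyyy.MM.dd HH:mm:ss'))

## back to a DynamicFrame so it can feed ApplyMapping as before
datasource0 = DynamicFrame.fromDF(df_datasource0, glueContext, "datasource0")

The same pattern can be applied to promotion_start_date and promotion_end_date before the ApplyMapping step.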


Can you try ("promotion_end_date", "promotion_end_date", "timestamp") for the timestamp columns in your script?

Hi, I will try that, but how will the program know the input timestamp format? I would have to specify the format somewhere.

It should handle that automatically. Let me know how it goes.

It did not work; the column values in the database are null.

Did you try converting to the timestamp format before writing to Redshift?