Python: How do I store a SQL query result with Airflow and use the result in an if-else condition?


I tried using XCom, but its value shows up as None.

class DWPostgresReturn(DWPostgresOperator):
    def execute(self, context):
        self.log.info('Executing: %s', self.sql)
        hook = DWPostgresHook(postgres_conn_id=self.postgres_conn_id, schema=self.database)
        return hook.get_records(
            self.sql,
            parameters=self.parameters)

with DWDAG(config) as dag:

    t1 = DWPostgresReturn(
        task_id='t1',
        postgres_conn_id='db_conn',
        sql="select output from table",
        config=config,
        start_date=dt.datetime(2020, 3, 9)
    )

    def get_records(**kwargs):
        ti = kwargs['ti']
        xcom = ti.xcom_pull(task_ids='t1')
        string_to_print = 'Value in xcom is: {}'.format(xcom)
        print(string_to_print)

    t2 = PythonOperator(
        task_id='records',
        provide_context=True,
        python_callable=get_records,
        config=config,
        start_date=dt.datetime(2020, 3, 9)
    )

t1 >> t2
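Since `DWPostgresReturn.execute` returns the records, they are pushed to XCom as the task's return value and can drive a branch. Below is a minimal sketch of turning the pulled value into an if-else decision; the branch names (`high_path`, `low_path`) and the `BranchPythonOperator` wiring shown in comments are illustrative assumptions, not part of the original DAG:

```python
# Decide which downstream task runs based on the records pulled from XCom.
# hook.get_records() returns a list of row tuples, e.g. [(42,)], so we
# inspect the first column of the first row. Task ids are hypothetical.
def choose_branch(records):
    if records and records[0][0] is not None:
        return 'high_path'   # a value came back: follow this branch
    return 'low_path'        # no rows or NULL: follow the other branch

# In the DAG this could be wired via BranchPythonOperator, e.g.:
#
# def branch_callable(**kwargs):
#     records = kwargs['ti'].xcom_pull(task_ids='t1')
#     return choose_branch(records)
#
# branch = BranchPythonOperator(task_id='branch',
#                               provide_context=True,
#                               python_callable=branch_callable)
# t1 >> branch
```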

You can use a hook to fetch the result of the SQL query and then use it however you need. For example, with Snowflake:

dwh_hook = SnowflakeHook(snowflake_conn_id=SNOWFLAKE_CONNECTION_ID)
query_result = dwh_hook.get_first("""<your query>""")[0]
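Since `get_first` returns the first row and `[0]` picks its first column, the hook hands you a plain scalar that you can feed straight into an if-else. A small sketch of that decision step (the threshold and task names are illustrative assumptions):

```python
# Sketch: branch on the scalar returned by dwh_hook.get_first(...)[0].
# route_on_count could then be used inside a BranchPythonOperator
# callable, or in any plain if-else inside a PythonOperator.
def route_on_count(row_count, threshold=0):
    if row_count is not None and row_count > threshold:
        return 'process_data'     # result exceeded the threshold
    return 'skip_processing'      # missing or too-small result
```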

Could you give more details about your environment? Is there a stack trace? Without those details, nobody can help you. Here is the log for task_id=records, execution_date=2020-03-05T11:14:46.675046+00:00, run_id=manual__2020-03-05T11:14:46.675046+00:00: [2020-03-09 13:16:28,681] {logging_mixin.py:95} INFO - Value in xcom is: None. Which task pushes the value for this key into XCom?