Python S3ToRedshiftOperator unable to get credentials

I am using Airflow 2.x and I am trying to use the S3ToRedshiftOperator to move a data file from S3 to Redshift:

    from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator

    s3_to_redshift = S3ToRedshiftOperator(
        task_id='Copy_from_S3_to_Redshift',
        aws_conn_id='S3_connection',
        redshift_conn_id='Redshift_conn',
        schema='public',
        table='department',
        s3_bucket='xxxxx',
        s3_key='department.csv'
    )
But it fails and throws the following error:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/airflow/models/taskinstance.py", line 1138, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/usr/local/lib/python3.8/dist-packages/airflow/models/taskinstance.py", line 1311, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/usr/local/lib/python3.8/dist-packages/airflow/models/taskinstance.py", line 1341, in _execute_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/python3.8/dist-packages/airflow/providers/amazon/aws/transfers/s3_to_redshift.py", line 111, in execute
    credentials = s3_hook.get_credentials()
  File "/usr/local/lib/python3.8/dist-packages/airflow/providers/amazon/aws/hooks/base_aws.py", line 478, in get_credentials
    return session.get_credentials().get_frozen_credentials()
AttributeError: 'NoneType' object has no attribute 'get_frozen_credentials'
I have already created both the S3 connection and the Redshift connection in the Connections tab of the Airflow UI.
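
To isolate the failure, the exact call from the traceback can be reproduced outside the operator (a minimal sketch, assuming the same 'S3_connection' id; `S3Hook.get_credentials()` is the same lookup the operator performs at s3_to_redshift.py line 111, so it should raise the same AttributeError if the connection resolves to no credentials):

    from airflow.providers.amazon.aws.hooks.s3 import S3Hook

    # Same credential lookup the operator performs: it builds a boto3
    # session from the 'S3_connection' Airflow connection.
    hook = S3Hook(aws_conn_id='S3_connection')

    # Raises AttributeError: 'NoneType' object has no attribute
    # 'get_frozen_credentials' when the boto3 session finds no credentials.
    credentials = hook.get_credentials()
    print(credentials.access_key)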

Note: my Redshift cluster runs on EC2. I forward the port locally in my .ssh/config file and then connect using the hostname localhost. This setup works in DBeaver and in other database clients.
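
For reference, the forwarding entry in .ssh/config looks roughly like this (host names, user, and endpoint are placeholders, not my real values):

    Host redshift-tunnel
        HostName <ec2-host>
        User <ssh-user>
        LocalForward 5439 <redshift-endpoint>:5439

so the Redshift connection in Airflow points at localhost:5439.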

Any help would be much appreciated.