Airflow SSH operator error: cryptography.fernet.InvalidToken

I installed Airflow and created a simple DAG that uses the SSH operator to run a command on a different server. I added an SSH connection slurm_core from the GUI (Admin > Connections > Create). However, I get the following exception:

[2020-12-03 11:01:51,385] {ssh_operator.py:91} INFO - ssh_hook is not provided or invalid. Trying ssh_conn_id to create SSHHook.
[2020-12-03 11:01:51,392] {taskinstance.py:1150} ERROR - SSH operator error: 
Traceback (most recent call last):
  File "/home/airflowuser/miniconda3/lib/python3.7/site-packages/airflow/contrib/operators/ssh_operator.py", line 94, in execute
    timeout=self.timeout)
  File "/home/airflowuser/miniconda3/lib/python3.7/site-packages/airflow/contrib/hooks/ssh_hook.py", line 92, in __init__
    conn = self.get_connection(self.ssh_conn_id)
  File "/home/airflowuser/miniconda3/lib/python3.7/site-packages/airflow/hooks/base_hook.py", line 89, in get_connection
    log.info("Using connection to: %s", conn.log_info())
  File "/home/airflowuser/miniconda3/lib/python3.7/site-packages/airflow/models/connection.py", line 322, in log_info
    "XXXXXXXX" if self.password else None,
  File "/home/airflowuser/miniconda3/lib/python3.7/site-packages/sqlalchemy/orm/attributes.py", line 358, in __get__
    retval = self.descriptor.__get__(instance, owner)
  File "/home/airflowuser/miniconda3/lib/python3.7/site-packages/airflow/models/connection.py", line 192, in get_password
    return fernet.decrypt(bytes(self._password, 'utf-8')).decode()
  File "/home/airflowuser/miniconda3/lib/python3.7/site-packages/cryptography/fernet.py", line 171, in decrypt
    raise InvalidToken
cryptography.fernet.InvalidToken
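
For context, Fernet raises InvalidToken whenever a ciphertext cannot be authenticated with the key used to decrypt it, which is exactly what happens when a value was encrypted under one key and read back under another. A minimal sketch of that failure mode (the keys here are freshly generated for illustration):

from cryptography.fernet import Fernet, InvalidToken

key_at_save_time = Fernet.generate_key()  # key in effect when the value was encrypted
key_at_read_time = Fernet.generate_key()  # a different key, used at decrypt time

token = Fernet(key_at_save_time).encrypt(b"my-ssh-password")

try:
    Fernet(key_at_read_time).decrypt(token)
except InvalidToken:
    print("InvalidToken: this token was not produced with this key")

In Airflow terms, this usually means the fernet_key that encrypted the slurm_core password when the connection was saved differs from the fernet_key the process reads when the task runs.
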
I tried airflow resetdb and airflow initdb, and also dropped and recreated the database in MySQL. I don't know what is going on here. Below is my DAG:

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator
from airflow.contrib.operators.ssh_operator import SSHOperator
from airflow.utils.dates import days_ago
from datetime import timedelta

default_args = {
    "owner" : "John Doe",
    "email" : ['JohnDoe@apple.com'],
    "email_on_failure": True,
    "email_on_retry": False,
    "retries": 6,
    "retry_delay": timedelta(minutes=10),
    'start_date': days_ago(10),
}

dag = DAG(
    "check_archives",
    catchup=False,
    default_args=default_args,
    orientation="TB",
    schedule_interval="00 02 * * 0-6",
    max_active_runs=2,
)

dag.doc_md = "<b>Check Archives</b>"

# compare_archive_dates is referenced below; its definition (presumably a
# shell command string) was not included in the post.
check_archive_ready = SSHOperator(
    task_id="check_archive_ready",
    ssh_conn_id="slurm_core",
    command=compare_archive_dates,
    dag=dag,
)

run_archive_check = SSHOperator(
    task_id="run_archive_check",
    ssh_conn_id="slurm_core",
    command="/home/myuser/miniconda3/bin/python /home/myuser/scripts/archive_check.py",
    dag=dag,
)

check_archive_ready.set_downstream(run_archive_check)
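
As the first log line shows, when no ssh_hook is supplied the operator falls back to building an SSHHook from ssh_conn_id, and fetching that connection from the metadata database is what triggers the password decryption. One way to sidestep the stored credentials entirely is to pass an explicit hook instead. A sketch re-creating the first task that way, where the host, username, and key file are placeholders rather than values from the original setup:

from airflow.contrib.hooks.ssh_hook import SSHHook
from airflow.contrib.operators.ssh_operator import SSHOperator

# Placeholder connection details; key-based auth avoids the encrypted password column.
slurm_hook = SSHHook(
    remote_host="slurm-login.example.com",
    username="myuser",
    key_file="/home/airflowuser/.ssh/id_rsa",
)

check_archive_ready = SSHOperator(
    task_id="check_archive_ready",
    ssh_hook=slurm_hook,
    command=compare_archive_dates,
    dag=dag,
)

For reference, the fernet_key that the configuration reports on this machine:
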
Python 3.7.7 (default, May  7 2020, 21:25:33) 
[GCC 7.3.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from airflow import configuration as conf
>>> conf.get('core','fernet_key')
'dz7xH0vsA0ZZz4NjtAY7IMm0LiUhFYZ1qipSd8QfZcw='
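
A way to check whether the stored slurm_core password actually decrypts under that key is to read the connection row's raw encrypted column and try it by hand. A diagnostic sketch, noting that _password is Airflow's internal column rather than public API:

from airflow import settings
from airflow import configuration as conf
from airflow.models import Connection
from cryptography.fernet import Fernet, InvalidToken

session = settings.Session()
conn = (
    session.query(Connection)
    .filter(Connection.conn_id == "slurm_core")
    .one()
)

# Try decrypting the raw encrypted password with the configured key.
fernet = Fernet(conf.get("core", "fernet_key").encode())
try:
    fernet.decrypt(conn._password.encode())
    print("stored password decrypts with the configured fernet_key")
except InvalidToken:
    print("stored password was encrypted with a different fernet_key")

If this raises InvalidToken, re-entering the connection password through the GUI, so that it is re-encrypted with the current key, is typically enough to clear the error.
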