Python Airflow: airflow.exceptions.AirflowException: Failed to create remote temp file with SSHExecuteOperator


I am trying to run a simple SSHExecuteOperator in Airflow.

Here is my .py file:

import airflow
from airflow.models import DAG
from airflow.contrib.hooks.ssh_hook import SSHHook
from airflow.contrib.operators.ssh_execute_operator import SSHExecuteOperator
from datetime import timedelta

default_args = {
    'owner': 'airflow',
    'start_date': airflow.utils.dates.days_ago(2),
    'retries': 3
}

dag = DAG('Nas_Hdfs', description='Simple tutorial DAG',
          schedule_interval=None, default_args=default_args,
          catchup=False)

sshHook = SSHHook(conn_id='101')
sshHook.no_host_key_check = True

t2 = SSHExecuteOperator(task_id="NAS_TO_HDFS_FILE_COPY",
                        bash_command="hostname ",
                        ssh_hook=sshHook,
                        dag=dag)

t2
Connection id 101 looks like this:

I get the following error:

ERROR - Failed to create remote temp file
Here is the full log:

INFO - Subtask: --------------------------------------------------------------------------------
INFO - Subtask: Starting attempt 1 of 4
INFO - Subtask: --------------------------------------------------------------------------------
INFO - Subtask: 
INFO - Subtask: [2018-05-28 08:54:22,812] {models.py:1342} INFO - Executing <Task(SSHExecuteOperator): NAS_TO_HDFS_FILE_COPY> on 2018-05-28 08:54:12.876538
INFO - Subtask: [2018-05-28 08:54:23,303] {models.py:1417} ERROR - Failed to create remote temp file
INFO - Subtask: Traceback (most recent call last):
INFO - Subtask:   File "/opt/miniconda3/lib/python2.7/site-packages/airflow/models.py", line 1374, in run
INFO - Subtask:     result = task_copy.execute(context=context)
INFO - Subtask:   File "/opt/miniconda3/lib/python2.7/site-packages/airflow/contrib/operators/ssh_execute_operator.py", line 128, in execute
INFO - Subtask:     self.task_id) as remote_file_path:
INFO - Subtask:   File "/opt/miniconda3/lib/python2.7/site-packages/airflow/contrib/operators/ssh_execute_operator.py", line 64, in __enter__
INFO - Subtask:     raise AirflowException("Failed to create remote temp file")
INFO - Subtask: AirflowException: Failed to create remote temp file
INFO - Subtask: [2018-05-28 08:54:23,304] {models.py:1433} INFO - Marking task as UP_FOR_RETRY
INFO - Subtask: [2018-05-28 08:54:23,342] {models.py:1462} ERROR - Failed to create remote temp file
INFO - Subtask: Traceback (most recent call last):
INFO - Subtask:   File "/opt/miniconda3/bin/airflow", line 28, in <module>
INFO - Subtask:     args.func(args)
INFO - Subtask:   File "/opt/miniconda3/lib/python2.7/site-packages/airflow/bin/cli.py", line 422, in run
INFO - Subtask:     pool=args.pool,
INFO - Subtask:   File "/opt/miniconda3/lib/python2.7/site-packages/airflow/utils/db.py", line 53, in wrapper
INFO - Subtask:     result = func(*args, **kwargs)
INFO - Subtask:   File "/opt/miniconda3/lib/python2.7/site-packages/airflow/models.py", line 1374, in run
INFO - Subtask:     result = task_copy.execute(context=context)
INFO - Subtask:   File "/opt/miniconda3/lib/python2.7/site-packages/airflow/contrib/operators/ssh_execute_operator.py", line 128, in execute
INFO - Subtask:     self.task_id) as remote_file_path:
INFO - Subtask:   File "/opt/miniconda3/lib/python2.7/site-packages/airflow/contrib/operators/ssh_execute_operator.py", line 64, in __enter__
INFO - Subtask:     raise AirflowException("Failed to create remote temp file")
INFO - Subtask: airflow.exceptions.AirflowException: Failed to create remote temp file
INFO - Task exited with return code 1

Make sure you follow these 3 steps:

  • Use an ssh key instead of a password
  • Use the id_rsa file, not id_rsa.pub, as the "Key file"
  • The id_rsa and id_rsa.pub files must be owned by the Airflow user and have permissions 0600
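The permissions step above can be sketched in Python. This is a minimal, hypothetical example: it creates a stand-in file so it is runnable anywhere; on a real host you would point `key_path` at the Airflow user's actual private key instead.

```python
import os
import stat
import tempfile

# Stand-in for ~/.ssh/id_rsa so the sketch is self-contained;
# replace with the real key path on your Airflow host.
fd, key_path = tempfile.mkstemp(prefix="id_rsa_demo_")
os.close(fd)

# 0600 = read/write for the owner only, which is what ssh
# requires before it will use a private key file.
os.chmod(key_path, stat.S_IRUSR | stat.S_IWUSR)

# Verify: the low nine permission bits should be exactly 0o600.
mode = stat.S_IMODE(os.stat(key_path).st_mode)
print(oct(mode))

os.remove(key_path)
```

Equivalently, `chmod 600 ~/.ssh/id_rsa` run as the Airflow user accomplishes the same thing from the shell.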

Comments:

  • This seems directly related:
  • @tobi6: I had already followed that link before posting this on Stack Overflow, and it did not work. I tried `chmod 600 id_rsa` and also the extra steps shown in the image in the question.
  • It always helps to include your research in the question, so we know what you have already tried.
  • Thanks @tobi6. Keep in mind this really is my last stop, since I have already gone through all the other links. Thanks a lot for your help.
  • Open a Python shell, impersonating your Airflow user and environment, run the following, and please paste the output:
  • Hi, I executed all of these steps... but I still get the same error "airflow.exceptions.AirflowException: Failed to create remote temp file SSHExecuteOperator"
from airflow.contrib.hooks.ssh_hook import SSHHook

sshHook = SSHHook(conn_id='101')
sshHook.no_host_key_check = True
sshHook.Popen(["-q", "mktemp", "--tmpdir", "tmp_XXXXXX"])
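If you cannot reach the remote host from a Python shell, the same check can be reproduced locally with the standard `subprocess` module (over SSH, the hook's `Popen` just runs the same command prefixed by the ssh invocation for connection '101'). This is a hedged local sketch, not the operator's actual code path; a non-zero return code here mirrors the condition that makes the operator raise "Failed to create remote temp file".

```python
import subprocess

# Locally run the same command SSHExecuteOperator issues remotely:
# ask mktemp to create a temp file named tmp_XXXXXX in the temp dir.
proc = subprocess.Popen(
    ["mktemp", "--tmpdir", "tmp_XXXXXX"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
out, err = proc.communicate()

# Return code 0 plus a printed path means temp-file creation works;
# anything else is the failure the operator reports over SSH.
print(proc.returncode, out.decode().strip())
```

If this succeeds locally but fails through the hook, the problem is on the remote side: the SSH login shell, the remote user's write access to its temp directory, or the ssh options the hook passes.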