
DAG does not fail when the Python script fails


I am using some Python scripts to download files from an FTP server and have created a DAG for this. Sometimes the Python script fails with a "Connection reset by peer" error while downloading, but the DAG does not fail; it marks the task as success instead of failed.

Below are the Airflow logs for more information:


[2019-01-03 19:04:40,085] {base_task_runner.py:98} INFO - Subtask: [2019-01-03 19:04:40,085] {ssh_execute_operator.py:146} INFO - [2019-01-03 19:09:14,276 - Download files from SFTP - ERROR] Total 1 file(s) ([u'R0000797-Manifest.xml']) are downloaded successfully. One error is found in downloading file xxxxxx.txt due to Server connection dropped:
[2019-01-03 19:04:40,091] {base_task_runner.py:98} INFO - Subtask: [2019-01-03 19:04:40,090] {ssh_execute_operator.py:146} INFO - [2019-01-03 19:09:14,282 - Download files from SFTP - ERROR] The whole process failed due to Server connection dropped: .
[2019-01-03 19:04:40,091] {base_task_runner.py:98} INFO - Subtask: [2019-01-03 19:04:40,091] {ssh_execute_operator.py:146} INFO - Total 1 file(s) ([u'R0000797-Manifest.xml']) are downloaded successfully.
[2019-01-03 19:04:40,092] {base_task_runner.py:98} INFO - Subtask: [2019-01-03 19:04:40,091] {ssh_execute_operator.py:146} INFO - Traceback (most recent call last):
[2019-01-03 19:04:40,092] {base_task_runner.py:98} INFO - Subtask: [2019-01-03 19:04:40,091] {ssh_execute_operator.py:146} INFO - main(args)
[2019-01-03 19:04:40,092] {base_task_runner.py:98} INFO - Subtask: [2019-01-03 19:04:40,091] {ssh_execute_operator.py:146} INFO - File "/TEST/GET_files.py", line 381, in main
[2019-01-03 19:04:40,093] {base_task_runner.py:98} INFO - Subtask: [2019-01-03 19:04:40,092] {ssh_execute_operator.py:146} INFO - sftp.get(source_file)


As you can see from the logs above, the Python script reported a proper error message to the Airflow handler, but the handler logs that message as INFO and the task does not fail. Can you suggest how to handle this scenario? I want the DAG task to fail whenever any Python error occurs.

************************************
Here is the DAG code:

get_files = SSHExecuteOperator(
    task_id='get_files',
    bash_command='<command to run the py script>',  # placeholder
    ssh_hook=sshHook,
    dag=dag)

************************************

Expected result: the Airflow DAG should fail when the Python script fails.
Thanks for your help in advance.

Add

set -e

to your

bash_command

. For example:

get_files = SSHExecuteOperator(
    task_id='get_files',
    bash_command='set -e; python example_script.py',
    ssh_hook=sshHook,
    dag=dag)
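Why `set -e` matters here can be demonstrated outside Airflow. A minimal sketch (plain Python with `subprocess`, not part of the original answer): without `set -e`, a shell returns the exit status of the *last* command only, so an earlier failure is masked and Airflow sees exit code 0, i.e. success.

```python
import subprocess

# Without `set -e`: the shell's exit status is that of the last command,
# so the failure of `false` is masked and the command "succeeds".
masked = subprocess.run(["bash", "-c", "false; echo done"])
print(masked.returncode)  # 0

# With `set -e`: the shell aborts at the first failing command and
# propagates its non-zero exit status, which Airflow treats as a failure.
strict = subprocess.run(["bash", "-c", "set -e; false; echo done"])
print(strict.returncode)  # 1
```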

Why don't you use the PythonOperator instead?

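A minimal sketch of that suggestion, assuming the Airflow 1.x import path and a `dag` object as in the OP's snippet; `download_files` is a hypothetical stand-in for the logic in GET_files.py:

```python
from airflow.operators.python_operator import PythonOperator

def download_files():
    # Hypothetical stand-in for the SFTP logic in GET_files.py.
    # Any uncaught exception raised here makes Airflow mark the
    # task as failed -- no shell exit-code plumbing needed.
    raise IOError("Server connection dropped")

get_files = PythonOperator(
    task_id='get_files',
    python_callable=download_files,
    dag=dag)
```

With a PythonOperator the failure signal is the exception itself, so there is no dependency on `set -e` or on how the remote shell reports exit codes.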

Can you post the Python code? It looks like you may be catching the failure and logging it. You may need to raise an exception so that the process exits with status 1 instead of 0.

I am facing the same problem as the OP. Your solution seems to work for him, but for me it does not. You can see my DAG:

python3

does not have the packages/modules the script needs, so it errors out when run in the terminal, yet the task still shows a

success

status. Can you check what I am doing wrong? @saadi You need

set -e

in your bash_command; you have

set-e

:)
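Following up on the comment about catching and logging the failure: the fix in the downloading script itself is to re-raise after logging, so the interpreter exits non-zero. A hypothetical sketch (the `download` function and the injected `fetch` callable are illustrative, not the OP's actual GET_files.py):

```python
def download(files, fetch):
    """Download each file via `fetch`; record failures but re-raise at the end."""
    errors = []
    for name in files:
        try:
            fetch(name)  # e.g. sftp.get(name) in the real script
        except IOError as exc:
            # Logging alone swallows the error and the process exits 0,
            # which is why Airflow marked the task as success.
            errors.append("%s: %s" % (name, exc))
    if errors:
        # Re-raise so Python exits with a non-zero status and a
        # `set -e` bash_command fails the Airflow task.
        raise RuntimeError("Download failed for: " + "; ".join(errors))

# Usage sketch: a fetch that always drops the connection.
def broken_fetch(name):
    raise IOError("Server connection dropped")
```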