Python Airflow error - got an unexpected keyword argument 'min'

Tags: python, google-cloud-platform, airflow, google-cloud-composer, papermill

I am trying to run a very simple test DAG to get to grips with the basics of GCP Cloud Composer, but every time I trigger the DAG a fatal error pops up, and I cannot find any information on how to solve it.

The error is:

2020-03-18 22:20:56,627] {taskinstance.py:1059} ERROR - __init__() got an unexpected keyword argument 'min'@-@{"workflow": "notebook-test", "task-id": "notebook-test", "execution-date": "2020-03-18T22:20:41.232043+00:00"}
Traceback (most recent call last):
  File "/usr/local/lib/airflow/airflow/models/taskinstance.py", line 930, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/airflow/airflow/operators/python_operator.py", line 113, in execute
    return_value = self.execute_callable()
  File "/usr/local/lib/airflow/airflow/operators/python_operator.py", line 118, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/home/airflow/gcs/dags/test.py", line 44, in execute_nb
    parameters=params
  File "/opt/python3.6/lib/python3.6/site-packages/papermill/execute.py", line 104, in execute_notebook
    **engine_kwargs
  File "/opt/python3.6/lib/python3.6/site-packages/papermill/engines.py", line 49, in execute_notebook_with_engine
    return self.get_engine(engine_name).execute_notebook(nb, kernel_name, **kwargs)
  File "/opt/python3.6/lib/python3.6/site-packages/papermill/engines.py", line 341, in execute_notebook
    nb_man.notebook_start()
  File "/opt/python3.6/lib/python3.6/site-packages/papermill/engines.py", line 69, in wrapper
    return func(self, *args, **kwargs)
  File "/opt/python3.6/lib/python3.6/site-packages/papermill/engines.py", line 198, in notebook_start
    self.save()
  File "/opt/python3.6/lib/python3.6/site-packages/papermill/engines.py", line 69, in wrapper
    return func(self, *args, **kwargs)
  File "/opt/python3.6/lib/python3.6/site-packages/papermill/engines.py", line 139, in save
    write_ipynb(self.nb, self.output_path)
  File "/opt/python3.6/lib/python3.6/site-packages/papermill/iorw.py", line 397, in write_ipynb
    papermill_io.write(nbformat.writes(nb), path)
  File "/opt/python3.6/lib/python3.6/site-packages/papermill/iorw.py", line 128, in write
    return self.get_handler(path).write(buf, path)
  File "/opt/python3.6/lib/python3.6/site-packages/papermill/iorw.py", line 316, in write
    multiplier=self.RETRY_MULTIPLIER, min=self.RETRY_DELAY, max=self.RETRY_MAX_DELAY
TypeError: __init__() got an unexpected keyword argument 'min'
My DAG code is:

import airflow
import papermill as pm
from datetime import timedelta
from airflow import DAG
from airflow.operators.python_operator import PythonOperator


default_args = {
    'owner': 'airflow',
    'start_date': airflow.utils.dates.days_ago(1),
    'end_date': None,
    'retries': 0,    
    'retry_delay': timedelta(minutes=5)
}

dag = DAG(
    dag_id="notebook-test",
    description="a test",
    default_args=default_args,
    catchup=True,
    schedule_interval=None,
    dagrun_timeout=(timedelta(seconds=30))
)

NB_PATH = "gs://BUCKET/data/"

params = {}


def execute_nb():
    input_nb = NB_PATH + "test.ipynb"
    output_nb = NB_PATH + "test_ran.ipynb"

    pm.execute_notebook(
        input_nb,
        output_nb,
        parameters=params
    )


op = PythonOperator(
    task_id="notebook-test",
    python_callable=execute_nb,
    dag=dag
)

op
One solution I have already tried is updating the version of tenacity, but adding it in the PyPI Packages tab of my Cloud Composer environment did not fix anything.
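Since the failing call in the traceback passes multiplier/min/max into a tenacity-style retry policy, one way to sanity-check the environment is to print the versions the workers actually resolve after editing the PyPI Packages tab. A minimal sketch, assuming it is run from a throwaway PythonOperator (or a scratch notebook) inside the same Composer environment; the package list is an assumption based on the traceback, not part of the original post:

# Hypothetical version check: print what the worker environment resolves
# for papermill and the packages it pulls in for retries and notebook I/O.
import pkg_resources

def print_versions():
    for pkg in ("papermill", "tenacity", "nbformat"):
        try:
            print(pkg, pkg_resources.get_distribution(pkg).version)
        except pkg_resources.DistributionNotFound:
            print(pkg, "not installed")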

Any help would be greatly appreciated, thanks.


Edit: the image version is composer-1.9.2-airflow.1.10.6

So the problem was related to the path I was providing.

I had to add

import os
from pathlib import Path

and then build my variable as

NB_PATH = str(Path(os.path.abspath(__file__)).parents[1]) + "/data"

This also required me to add jupyter as a PyPI dependency for papermill to work properly, but it now seems to be working.
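For reference, a minimal sketch of how this path setup might slot into the original execute_nb function. The directory layout (a data/ folder one level above the folder containing the DAG file) and the parents[1] index are assumptions carried over from the answer; adjust them to match where the notebooks actually live:

import os
from pathlib import Path

import papermill as pm

# Resolve the data directory relative to this DAG file instead of
# hard-coding a gs:// URL. parents[1] is one level above the folder
# containing this file; the exact index depends on your layout.
NB_PATH = str(Path(os.path.abspath(__file__)).parents[1]) + "/data"

params = {}


def execute_nb():
    # NB_PATH no longer ends with a slash, so join with an explicit "/".
    input_nb = NB_PATH + "/test.ipynb"
    output_nb = NB_PATH + "/test_ran.ipynb"

    pm.execute_notebook(
        input_nb,
        output_nb,
        parameters=params,
    )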

Which version of Airflow are you using? @muscat The image version is composer-1.9.2-airflow.1.10.6. @Emma Thanks for the suggestion, but I have already listed that as one of the solutions I tried.