
Mounting a directory with the Docker operator on Airflow does not work


I'm trying to use the DockerOperator to automate some scripts with Airflow.

Airflow version:
apache-airflow==1.10.12

What I want to do is "copy" all of my project's files (folders and files) into the container using this code.

The following file, ml-intermediate.py, is located at ~/airflow/dags/ml-intermediate.py:

"""
Template to convert a Ploomber DAG to Airflow
"""
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago

from ploomber.spec import DAGSpec
from soopervisor.script.ScriptConfig import ScriptConfig

script_cfg = ScriptConfig.from_path('/home/letyndr/airflow/dags/ml-intermediate')
# Replace the project root to reflect the new location - or maybe just
# write a soopervisor.yaml, then we can get rid of this line
script_cfg.paths.project = '/home/letyndr/airflow/dags/ml-intermediate'

# TODO: use lazy_import from script_cfg
dag_ploomber = DAGSpec('/home/letyndr/airflow/dags/ml-intermediate/pipeline.yaml',
                       lazy_import=True).to_dag()
dag_ploomber.name = "ML Intermediate"

default_args = {
    'start_date': days_ago(0),
}

dag_airflow = DAG(
    dag_ploomber.name.replace(' ', '-'),
    default_args=default_args,
    description='Ploomber dag',
    schedule_interval=None,
)

script_cfg.save_script()

from airflow.operators.docker_operator import DockerOperator
for task_name in dag_ploomber:
    DockerOperator(task_id=task_name,
        image="continuumio/miniconda3",
        api_version="auto",
        auto_remove=True,
        # command="sh /home/letyndr/airflow/dags/ml-intermediate/script.sh",
        command="sleep 600",
        docker_url="unix://var/run/docker.sock",
        volumes=[
            "/home/letyndr/airflow/dags/ml-intermediate:/home/letyndr/airflow/dags/ml-intermediate:rw",
            "/home/letyndr/airflow-data/ml-intermediate:/home/letyndr/airflow-data/ml-intermediate:rw"
        ],
        working_dir=script_cfg.paths.project,
        dag=dag_airflow,
        container_name=task_name,
    )



for task_name in dag_ploomber:
    task_ploomber = dag_ploomber[task_name]
    task_airflow = dag_airflow.get_task(task_name)

    for upstream in task_ploomber.upstream:
        task_airflow.set_upstream(dag_airflow.get_task(upstream))

dag = dag_airflow
When I execute this DAG with Airflow, I get an error that Docker cannot find the script
/home/letyndr/airflow/dags/ml-intermediate/script.sh
To debug, I changed the DockerOperator's command to sleep 600 so I could enter the container and check, using the correct paths, whether the files were inside it.

For example, once inside the container I can go to the path
/home/letyndr/airflow/dags/ml-intermediate/
but I don't see the files that are supposed to be there.
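
To double-check what was actually mounted, one option is to ask the Docker daemon directly. A minimal sketch with the low-level docker-py client ("some_task_name" is a placeholder for whichever container_name the operator actually created, i.e. the Ploomber task name):

import docker

client = docker.APIClient()

# Print the binds Docker actually applied to the (still running) container.
# "some_task_name" is a placeholder; use the real container name.
for mount in client.inspect_container("some_task_name")["Mounts"]:
    print(mount["Source"], "->", mount["Destination"], mount["Mode"])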

I tried to replicate what Airflow does by checking this part of the package, specifically the part where it creates the Docker container:

This is a reproduction of the Docker implementation:

import docker

client = docker.APIClient()

# binds = {
#         "/home/letyndr/airflow/dags": {
#             "bind": "/home/letyndr/airflow/dags",
#             "mode": "rw"
#         },
#         "/home/letyndr/airflow-data/ml-intermediate": {
#             "bind": "/home/letyndr/airflow-data/ml-intermediate",
#             "mode": "rw"
#         }
#     }

binds = ["/home/letyndr/airflow/dags:/home/letyndr/airflow/dags:rw",
"/home/letyndr/airflow-data/ml-intermediate:/home/letyndr/airflow-data/ml-intermediate:rw"]

container = client.create_container(
    image="continuumio/miniconda3",
    command="sleep 600",
    volumes=["/home/letyndr/airflow/dags", "/home/letyndr/airflow-data/ml-intermediate"],
    host_config=client.create_host_config(binds=binds),
    working_dir="/home/letyndr/airflow/dags",
    name="simple_example",
)

client.start(container=container.get("Id"))
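
To confirm that the bind is visible from inside the started container, one can also run a quick ls through the same client. A minimal sketch, reusing the client and the simple_example container from above:

# Run `ls` inside the running container and print its output.
exec_id = client.exec_create(container="simple_example",
                             cmd="ls /home/letyndr/airflow/dags")
print(client.exec_start(exec_id).decode())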

What I found is that mounting the volume only works when both
host_config
and
volumes
are set. The problem is that the implementation in Airflow only sets
host_config
but not
volumes
. I added the missing parameter to the
create_container
method and it worked.
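
For clarity, here is a minimal sketch of that contrast, reusing the client and binds from the reproduction above (the container names are illustrative):

# Binds passed only through host_config: in this setup the volume did not show up.
container = client.create_container(
    image="continuumio/miniconda3",
    command="sleep 600",
    host_config=client.create_host_config(binds=binds),
    name="binds_only",
)

# Also declaring the container-side paths via `volumes` made the mount work.
container = client.create_container(
    image="continuumio/miniconda3",
    command="sleep 600",
    volumes=[b.split(":")[1] for b in binds],  # container-side mount points
    host_config=client.create_host_config(binds=binds),
    name="binds_and_volumes",
)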

Do you know whether I'm using the DockerOperator correctly, or is this an issue with Airflow?