
Airflow DockerOperator can't find the .sock file on my local machine


I want to run a Docker container that contains a Python script, on a schedule. I'm having trouble running the DockerOperator task locally through the Airflow CLI:

--------------------------------------------------------------------------------
Starting attempt 1 of 4
--------------------------------------------------------------------------------

[2018-10-31 15:20:10,760] {models.py:1569} INFO - Executing <Task(DockerOperator): amplitude_to_s3_docker> on 2018-10-02T00:00:00+00:00
[2018-10-31 15:20:10,761] {base_task_runner.py:124} INFO - Running: ['bash', '-c', 'airflow run get_amplitude_docker_dag amplitude_to_s3_docker 2018-10-02T00:00:00+00:00 --job_id 19 --raw -sd DAGS_FOLDER/amplitude_to_s3_docker_dag.py --cfg_path /var/folders/ys/83xq3b3d1qv3zfx3dtkkp9tc0000gn/T/tmp_lu9mgzz']
[2018-10-31 15:20:12,501] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:12,501] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-10-31 15:20:13,465] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:13,464] {models.py:258} INFO - Filling up the DagBag from /Users/thisuser/Projects/GitRepos/DataWarehouse/dags/amplitude_to_s3_docker_dag.py
[2018-10-31 15:20:13,581] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:13,581] {example_kubernetes_operator.py:54} WARNING - Could not import KubernetesPodOperator: No module named 'kubernetes'
[2018-10-31 15:20:13,582] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:13,582] {example_kubernetes_operator.py:55} WARNING - Install kubernetes dependencies with:     pip install airflow['kubernetes']
[2018-10-31 15:20:13,770] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:13,770] {cli.py:492} INFO - Running <TaskInstance: get_amplitude_docker_dag.amplitude_to_s3_docker 2018-10-02T00:00:00+00:00 [running]> on host 254.1.168.192.in-addr.arpa
[2018-10-31 15:20:13,804] {docker_operator.py:169} INFO - Starting docker container from image amplitude
[2018-10-31 15:20:13,974] {models.py:1736} ERROR - create_container() got an unexpected keyword argument 'cpu_shares'
Traceback (most recent call last):
  File "/Users/thisuser/anaconda/lib/python3.5/site-packages/airflow/models.py", line 1633, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/Users/thisuser/anaconda/lib/python3.5/site-packages/airflow/operators/docker_operator.py", line 210, in execute
    working_dir=self.working_dir
TypeError: create_container() got an unexpected keyword argument 'cpu_shares'
After initializing the local Airflow db and starting the webserver + scheduler, I run the DAG task with the following command:

airflow run get_amplitude_docker_dag amplitude_to_s3_docker 2018-10-02

Also, if I configure the task as a BashOperator instead, it runs fine in Airflow:

templated_command = """
   docker run amplitude get_amplitude.py {{ ds }} {{ ds }} 
"""


t1 = BashOperator(
    task_id="amplitude_to_s3",
    bash_command=templated_command,
    params={},
    dag=dag,
)
I have read before that mounting the Docker daemon can be a problem, but my .sock file is at /var/run/docker.sock, the default location the
docker_url
parameter points to.
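For reference, a minimal sketch of the failing task, with the DAG and task names taken from the log above; the start date, schedule, and command are illustrative assumptions, not my exact DAG file:

```python
# Sketch of the DockerOperator task, assuming names from the log above.
# start_date/schedule_interval/command are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.docker_operator import DockerOperator

dag = DAG(
    "get_amplitude_docker_dag",
    start_date=datetime(2018, 10, 1),
    schedule_interval="@daily",
)

t1 = DockerOperator(
    task_id="amplitude_to_s3_docker",
    image="amplitude",
    command="python get_amplitude.py {{ ds }} {{ ds }}",
    docker_url="unix://var/run/docker.sock",  # the default value
    dag=dag,
)
```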


Can anyone help me get this job configured?

The actual error is TypeError: create_container() got an unexpected keyword argument 'cpu_shares', which means the
create_container
function does not expect
cpu_shares
as an argument.

I hit the same error using version 3.5.1 of the docker Python library, and downgrading to version 2.7.0 (which appears to be the latest version that accepts the
cpu_shares
argument to
create_container
) fixed it.
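The version boundary above can be expressed as a small sanity check. This is a hypothetical helper of my own (not part of the docker library): per the fix above, 2.7.0 works and 3.5.1 does not, so the cutoff is assumed to be the 3.0.0 release.

```python
# Hedged helper (the function name is mine): decide from a version string
# whether the installed docker library should still accept cpu_shares in
# create_container. Assumes the argument was dropped as of release 3.0.0.
def accepts_cpu_shares(version: str) -> bool:
    """Return True for docker library versions strictly below 3.0.0."""
    major = int(version.split(".")[0])
    return major < 3

print(accepts_cpu_shares("2.7.0"))   # True  -> downgrade target, works
print(accepts_cpu_shares("3.5.1"))   # False -> expect the TypeError above
```

You could run this against docker.__version__ before triggering the DAG to fail fast with a clearer message than the TypeError.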

Try running this to downgrade the docker library:

sudo pip3 install docker==2.7.0

You may also want to check that Docker is running on the host/container (wherever this Python code executes).