
Python: load balancing a Docker Swarm


I have a Docker swarm running in swarm mode, with one HAProxy container and 3 replicas of a Python web application. The HAProxy container exposes port 80 and should load-balance across my application's 3 containers (by leastconn).

Here is my docker-compose.yml file:

version: '3'

services:
  scraper-node:
    image: scraper
    ports:
      - 5000
    volumes:
      - /profiles:/profiles
    command: >
      bash -c "
        cd src;
        gunicorn src.interface:app \
          --bind=0.0.0.0:5000 \
          --workers=1 \
          --threads=1 \
          --timeout 500 \
          --log-level=debug \
      "
    environment:
      - SERVICE_PORTS=5000
    deploy:
      replicas: 3
      update_config:
        parallelism: 5
        delay: 10s
      restart_policy:
        condition: on-failure
        max_attempts: 3
        window: 120s
    networks:
      - web

  proxy:
    image: dockercloud/haproxy
    depends_on:
      - scraper-node
    environment:
      - BALANCE=leastconn
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    ports:
      - 80:80
    networks:
      - web

networks:
  web:
    driver: overlay
When I deploy this stack (docker stack deploy --compose-file=docker-compose.yml scraper), I get all of my containers:

CONTAINER ID   IMAGE                COMMAND                  CREATED        STATUS          PORTS                       NAMES
245f4bfd1299   scraper:latest       "/docker-entrypoin..."   21 hours ago   Up 19 minutes   80/tcp, 5000/tcp, 8000/tcp  scraper_scraper-node.3.iyi33hv9tikmf6m2wna0cypgp
995aefdb9346   scraper:latest       "/docker-entrypoin..."   21 hours ago   Up 19 minutes   80/tcp, 5000/tcp, 8000/tcp  scraper_scraper-node.2.wem9v2nug8wqos7d97zknuvqb
a51474322583   scraper:latest       "/docker-entrypoin..."   21 hours ago   Up 19 minutes   80/tcp, 5000/tcp, 8000/tcp  scraper_scraper-node.1.0u8q4zn432n7p5gl93ohqio8e
3f97f34678d1   dockercloud/haproxy  "/sbin/tini -- doc..."   21 hours ago   Up 19 minutes   80/tcp, 443/tcp, 1936/tcp   scraper_proxy.1.rng5ysn8v48cs4nxb1atkrz73
When I look at the haproxy container logs, it appears to recognize the 3 Python containers:

INFO:haproxy:dockercloud/haproxy 1.6.6 is running outside Docker Cloud
INFO:haproxy:Haproxy is running in SwarmMode, loading HAProxy definition through docker api
INFO:haproxy:dockercloud/haproxy PID: 6
INFO:haproxy:=> Add task: Initial start - Swarm Mode
INFO:haproxy:=> Executing task: Initial start - Swarm Mode
INFO:haproxy:==========BEGIN==========
INFO:haproxy:Linked service: scraper_scraper-node
INFO:haproxy:Linked container: scraper_scraper-node.1.0u8q4zn432n7p5gl93ohqio8e, scraper_scraper-node.2.wem9v2nug8wqos7d97zknuvqb, scraper_scraper-node.3.iyi33hv9tikmf6m2wna0cypgp
INFO:haproxy:HAProxy configuration:
global
  log 127.0.0.1 local0
  log 127.0.0.1 local1 notice
  log-send-hostname
  maxconn 4096
  pidfile /var/run/haproxy.pid
  user haproxy
  group haproxy
  daemon
  stats socket /var/run/haproxy.stats level admin
  ssl-default-bind-options no-sslv3
  ssl-default-bind-ciphers ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES256-SHA:ECDHE-ECDSA-AES256-SHA:AES128-GCM-SHA256:AES128-SHA256:AES128-SHA:AES256-GCM-SHA384:AES256-SHA256:AES256-SHA:DHE-DSS-AES128-SHA:DES-CBC3-SHA
defaults
  balance leastconn
  log global
  mode http
  option redispatch
  option httplog
  option dontlognull
  option forwardfor
  timeout connect 5000
  timeout client 50000
  timeout server 50000
listen stats
  bind :1936
  mode http
  stats enable
  timeout connect 10s
  timeout client 1m
  timeout server 1m
  stats hide-version
  stats realm Haproxy\ Statistics
  stats uri /
  stats auth stats:stats
frontend default_port_80
  bind :80
  reqadd X-Forwarded-Proto:\ http
  maxconn 4096
  default_backend default_service
backend default_service
  server scraper_scraper-node.1.0u8q4zn432n7p5gl93ohqio8e 10.0.0.5:5000 check inter 2000 rise 2 fall 3
  server scraper_scraper-node.2.wem9v2nug8wqos7d97zknuvqb 10.0.0.6:5000 check inter 2000 rise 2 fall 3
  server scraper_scraper-node.3.iyi33hv9tikmf6m2wna0cypgp 10.0.0.7:5000 check inter 2000 rise 2 fall 3
INFO:haproxy:Launching HAProxy
INFO:haproxy:HAProxy has been launched(PID: 12)
INFO:haproxy:===========END===========
But when I try to GET http://localhost, I receive an error message:

<html>
  <body>
    <h1>503 Service Unavailable</h1>
    No server is available to handle this request.
  </body>
</html>


A 503 error usually means that the health checks to your backend servers are failing.

This is where your stats page can help: if you hover over the LastChk column of one of your DOWN backend servers, HAProxy will give you a vague summary of why that server is down:

It does not look like you have configured health checks (option httpchk) for your default_service backend: can you reach any of the backend servers directly (e.g. curl --head 10.0.0.5:5000)? From the docs:

[R]esponses 2xx and 3xx are considered valid, while all other ones indicate a server going down, including the lack of any response.
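For reference, a hand-written version of the generated backend with an HTTP health check enabled would look something like this (a sketch only; the /health path is a hypothetical endpoint, not something the scraper app is known to serve):

```
backend default_service
  option httpchk GET /health
  server scraper_scraper-node.1.0u8q4zn432n7p5gl93ohqio8e 10.0.0.5:5000 check inter 2000 rise 2 fall 3
  server scraper_scraper-node.2.wem9v2nug8wqos7d97zknuvqb 10.0.0.6:5000 check inter 2000 rise 2 fall 3
  server scraper_scraper-node.3.iyi33hv9tikmf6m2wna0cypgp 10.0.0.7:5000 check inter 2000 rise 2 fall 3
```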


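With httpchk enabled, the check only passes if the app answers 2xx or 3xx on the probed path. A minimal stdlib sketch of such an endpoint (the /health route is hypothetical; the real app here is served by gunicorn from src.interface:app):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            # 200 response: HAProxy's httpchk counts this as the server being UP
            body = b"OK"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            # Any non-2xx/3xx response (or no response) marks the server DOWN
            self.send_error(404)

    def log_message(self, fmt, *args):
        pass  # keep the periodic health-check probes out of the access log

# In the container this would be bound to the checked port, e.g.:
# HTTPServer(("0.0.0.0", 5000), HealthHandler).serve_forever()
```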
There were two problems:

  • The command in the docker-compose.yml file should be one line.
  • The scraper image should expose port 5000 (in its Dockerfile).

Once I fixed those, I deployed the stack the same way (with docker stack deploy), and the proxy container recognized the Python containers and was able to load-balance between them.
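Concretely, the corrected pieces would look roughly like this (a sketch; the gunicorn flags are exactly those from the original file):

```
# docker-compose.yml: command collapsed to a single line
command: bash -c "cd src; gunicorn src.interface:app --bind=0.0.0.0:5000 --workers=1 --threads=1 --timeout 500 --log-level=debug"

# Dockerfile of the scraper image: expose the port HAProxy routes to
EXPOSE 5000
```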