
Consuming Kafka via a REST Proxy with Python

Tags: python, docker, rest, apache-kafka, kafka-consumer-api

I'm running a Kafka environment via Docker, and it comes up correctly.

However, I can't get my Python script to execute REST queries against it.

I'm trying to read all the messages received on the topic.

Any suggestions on how to fix this?

Sorry for the long output; I wanted to document the problem in detail to make debugging easier :)

consumer.py

#!/usr/bin/python
# -*- coding: utf-8 -*-
import requests
import base64
import json
import sys

REST_PROXY_URL = 'http://localhost:8082'
CONSUMER_INSTACE = 'zabbix_consumer'
NAME_TOPIC = 'zabbix'

def delete_consumer(BASE_URI):
    '''Delete the consumer'''

    headers = {'Accept': 'application/vnd.kafka.v2+json'}
    r = requests.delete(BASE_URI, headers=headers)

def create_consumer():
    '''Create the Consumer instance'''

    delete_consumer(f'{REST_PROXY_URL}/consumers/{CONSUMER_INSTACE}/instances/{CONSUMER_INSTACE}')
    PAYLOAD = {'format': 'json', 'name': f'{CONSUMER_INSTACE}', 'auto.offset.reset': 'earliest'}
    HEADERS = {'Content-Type': 'application/vnd.kafka.v2+json'}
    r = requests.post(f'{REST_PROXY_URL}/consumers/{CONSUMER_INSTACE}', data=json.dumps(PAYLOAD),
                      headers=HEADERS)
    if r.status_code != 200:
        print('Status Code: ' + str(r.status_code))
        print(r.text)
        sys.exit('Error thrown while creating consumer')
    return r.json()['base_uri']

def get_messages():
    '''Get the messages from the consumer'''
    
    BASE_URI = create_consumer()
    HEADERS = {'Accept': 'application/vnd.kafka.v2+json'}
    r = requests.get(BASE_URI + f'/topics/{NAME_TOPIC}', headers=HEADERS, timeout=30)
    if r.status_code != 200: 
        print('Status Code: ' + str(r.status_code))
        print(r.text)
        sys.exit('Error thrown while getting message')
    for message in r.json():
        if message['key'] is not None:
            print('Message Key:' + base64.b64decode(message['key']))
        print('Message Value:' + base64.b64decode(message['value']))

if __name__ == '__main__':
    get_messages()
Output

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 159, in _new_conn
    conn = connection.create_connection(
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 61, in create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
  File "/usr/lib/python3.8/socket.py", line 918, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 665, in urlopen
    httplib_response = self._make_request(
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 387, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib/python3.8/http/client.py", line 1255, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1301, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1250, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1010, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 950, in send
    self.connect()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 187, in connect
    conn = self._new_conn()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 171, in _new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f0650a9be80>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 439, in send
    resp = conn.urlopen(
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 719, in urlopen
    retries = retries.increment(
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 436, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='rest-proxy', port=8082): Max retries exceeded with url: /consumers/zabbix_consumer/instances/zabbix_consumer/topics/zabbix (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f0650a9be80>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "consumer.py", line 48, in <module>
    get_messages()
  File "consumer.py", line 37, in get_messages
    r = requests.get(BASE_URI + f'/topics/{NAME_TOPIC}', headers=HEADERS, timeout=30)
  File "/usr/lib/python3/dist-packages/requests/api.py", line 75, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/api.py", line 60, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 646, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 516, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='rest-proxy', port=8082): Max retries exceeded with url: /consumers/zabbix_consumer/instances/zabbix_consumer/topics/zabbix (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f0650a9be80>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
docker-compose.yml

---
version: '2'
services:
  zookeeper-1:
    image: confluentinc/cp-zookeeper:latest
    ports:
      - 2181:2181
    environment:
      ZOOKEEPER_SERVER_ID: 1
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
      ZOOKEEPER_INIT_LIMIT: 5
      ZOOKEEPER_SYNC_LIMIT: 2
      ZOOKEEPER_SERVERS: localhost:22888:23888
    network_mode: host

  kafka-1:
    image: confluentinc/cp-kafka:latest
    ports:
      - 9092:9092
    network_mode: host
    depends_on:
      - zookeeper-1
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: localhost:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092

  schema-registry:
    image: confluentinc/cp-schema-registry:latest
    network_mode: host
    hostname: schema-registry
    container_name: schema-registry
    depends_on:
      - zookeeper-1
      - kafka-1
    ports:
      - 8081:8081
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_CONNECTION_URL: localhost:2181
      SCHEMA_REGISTRY_ACCESS_CONTROL_ALLOW_ORIGIN: '*'
      SCHEMA_REGISTRY_ACCESS_CONTROL_ALLOW_METHODS: 'GET,POST,PUT,OPTIONS'

  rest-proxy:
    image: confluentinc/cp-kafka-rest:latest
    network_mode: host
    depends_on:
      - zookeeper-1
      - kafka-1
      - schema-registry
    ports:
      - 8082:8082
    hostname: rest-proxy
    container_name: rest-proxy
    environment:
      KAFKA_REST_HOST_NAME: rest-proxy
      KAFKA_REST_BOOTSTRAP_SERVERS: localhost:9092
      KAFKA_REST_LISTENERS: http://localhost:8082
      KAFKA_REST_SCHEMA_REGISTRY_URL: http://schema-registry:8081
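
A note on the compose file above: with network_mode: host the ports: mappings are ignored, and KAFKA_REST_HOST_NAME: rest-proxy is the host name the REST Proxy uses when it builds absolute URLs such as the base_uri returned by the create-consumer call. That is presumably why the traceback shows the script trying to resolve rest-proxy, a name that is not resolvable from the machine running the script. A minimal diagnostic sketch to confirm this, assuming the proxy is reachable on localhost:8082 (the group and instance names here are made up for illustration):

# Create a throw-away consumer instance and inspect the base_uri the proxy
# hands back. 'diag_group' / 'diag_consumer' are made-up names.
import json
import requests

r = requests.post(
    'http://localhost:8082/consumers/diag_group',
    headers={'Content-Type': 'application/vnd.kafka.v2+json'},
    data=json.dumps({'name': 'diag_consumer', 'format': 'json'}))
print(r.status_code)
print(r.json())  # expect a base_uri like http://rest-proxy:8082/consumers/diag_group/instances/diag_consumer

# Clean up the instance afterwards.
requests.delete(
    'http://localhost:8082/consumers/diag_group/instances/diag_consumer',
    headers={'Content-Type': 'application/vnd.kafka.v2+json'})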

Just use the kafka-python package:

pip install kafka-python
Then subscribe to it like this:

from kafka import KafkaConsumer
import json

consumer = KafkaConsumer(
    "your-topic",
    bootstrap_servers = "your-kafka-server",
    group_id = "your-consumer-group",
    auto_offset_reset='earliest',
    enable_auto_commit=True,
    value_deserializer=lambda x: json.loads(x.decode('utf-8')))

for record in consumer:
    data = record.value
Note that the number of members in the consumer group must not exceed the number of partitions of the topic; otherwise the extra consumers will hang without fetching any data.
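
If you are not sure how many partitions the topic has, kafka-python can tell you. A minimal sketch, using the same placeholder broker and topic names as above:

from kafka import KafkaConsumer

# Ask the cluster for the partition ids of the topic so you know the maximum
# useful number of consumers in the group.
consumer = KafkaConsumer(bootstrap_servers="your-kafka-server")
partitions = consumer.partitions_for_topic("your-topic")
if partitions is None:
    print("topic not found")
else:
    print(f"{len(partitions)} partition(s): {sorted(partitions)}")
consumer.close()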


Needless to say, the value_deserializer depends on how the data was serialized when it was written to the topic.
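
For example, the json.loads deserializer above matches a producer that wrote values as UTF-8-encoded JSON. A minimal sketch of such a producer, again with placeholder server and topic names:

from kafka import KafkaProducer
import json

# Serialize every value as UTF-8 JSON, mirroring the consumer's
# value_deserializer=lambda x: json.loads(x.decode('utf-8')).
producer = KafkaProducer(
    bootstrap_servers="your-kafka-server",
    value_serializer=lambda v: json.dumps(v).encode('utf-8'))

producer.send("your-topic", {"host": "example", "value": 42})
producer.flush()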

Comments:

What is your operating system? Ubuntu 20.04 LTS.

Since you're using host network mode, you could try using localhost instead of 'rest-proxy' as the host (just for testing). By the way, I don't quite understand what you're trying to achieve here. Are you trying to read the messages of a topic? If so, why not use the kafka-python package? Also, shouldn't your Kafka be attached to a local volume for persistence?

I'm trying to read messages from the topic. Isn't using Kafka's REST Proxy good practice?

No, you can use the KafkaConsumer from the kafka-python package directly; it's simpler and more convenient. I'll post some sample code for you.

Thank you very much for your help!
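
If you do want to keep going through the REST Proxy instead, a sketch of the v2 consume flow is below. It assumes the proxy is reachable on localhost:8082 (using localhost rather than the unresolvable rest-proxy host name, as suggested in the comments) and uses made-up group/instance names; the v2 API expects you to subscribe the consumer instance and then poll its records endpoint, rather than GETting /topics/{topic} as in the script above.

import json
import requests

PROXY = 'http://localhost:8082'
GROUP = 'zabbix_group'
INSTANCE = 'zabbix_consumer'
TOPIC = 'zabbix'

# 1) Create the consumer instance (format=json so record values come back as
#    JSON rather than base64-encoded, as they would with the binary format).
r = requests.post(
    f'{PROXY}/consumers/{GROUP}',
    headers={'Content-Type': 'application/vnd.kafka.v2+json'},
    data=json.dumps({'name': INSTANCE, 'format': 'json',
                     'auto.offset.reset': 'earliest'}))
r.raise_for_status()
# Ignore the host in base_uri (it may advertise 'rest-proxy') and build the
# instance URL against the address we know is reachable from this machine.
instance_url = f'{PROXY}/consumers/{GROUP}/instances/{INSTANCE}'

# 2) Subscribe the instance to the topic.
r = requests.post(
    f'{instance_url}/subscription',
    headers={'Content-Type': 'application/vnd.kafka.v2+json'},
    data=json.dumps({'topics': [TOPIC]}))
r.raise_for_status()

# 3) Poll for records (the first poll may come back empty while the consumer
#    joins the group, so poll more than once in practice).
r = requests.get(
    f'{instance_url}/records',
    headers={'Accept': 'application/vnd.kafka.json.v2+json'},
    timeout=30)
r.raise_for_status()
for record in r.json():
    print(record['key'], record['value'])

# 4) Clean up the instance when done.
requests.delete(instance_url,
                headers={'Content-Type': 'application/vnd.kafka.v2+json'})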