Apache Kafka: Specifying the replication factor in the Confluent Kafka Python client

Tags: apache-kafka, confluent-platform, confluent-kafka-python

I have set up a single-broker instance of Kafka, along with Zookeeper, Kafka tools, Schema Registry, and Control Center. The setup was done with docker-compose using the images provided by Confluent. The docker-compose file looks like this:

  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    ports:
      - 2181:2181

  broker:
    image: confluentinc/cp-server:latest
    depends_on:
      - zookeeper
    ports:
      - 9092:9092
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: "zookeeper:2181"
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_METRIC_REPORTERS: io.confluent.metrics.reporter.ConfluentMetricsReporter
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0
      KAFKA_CONFLUENT_LICENSE_TOPIC_REPLICATION_FACTOR: 1
      CONFLUENT_METRICS_REPORTER_BOOTSTRAP_SERVERS: broker:29092
      CONFLUENT_METRICS_REPORTER_ZOOKEEPER_CONNECT: zookeeper:2181
      CONFLUENT_METRICS_REPORTER_TOPIC_REPLICAS: 1
      CONFLUENT_METRICS_ENABLE: "true"
      CONFLUENT_SUPPORT_CUSTOMER_ID: "anonymous"

  kafka-tools:
    image: confluentinc/cp-kafka:latest
    hostname: kafka-tools
    container_name: kafka-tools
    command: ["tail", "-f", "/dev/null"]
    network_mode: "host"

  schema-registry:
    image: confluentinc/cp-schema-registry:5.5.0
    hostname: schema-registry
    container_name: schema-registry
    depends_on:
      - zookeeper
      - broker
    ports:
      - "8081:8081"
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_CONNECTION_URL: "zookeeper:2181"

  control-center:
    image: confluentinc/cp-enterprise-control-center:latest
    hostname: control-center
    container_name: control-center
    depends_on:
      - zookeeper
      - broker
      - schema-registry
    ports:
      - "9021:9021"
    environment:
      CONTROL_CENTER_BOOTSTRAP_SERVERS: 'broker:29092'
      CONTROL_CENTER_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      CONTROL_CENTER_SCHEMA_REGISTRY_URL: "http://schema-registry:8081"
      CONTROL_CENTER_REPLICATION_FACTOR: 1
      CONTROL_CENTER_INTERNAL_TOPICS_PARTITIONS: 1
      CONTROL_CENTER_MONITORING_INTERCEPTOR_TOPIC_PARTITIONS: 1
      CONFLUENT_METRICS_TOPIC_REPLICATION: 1
      PORT: 9021
I am trying to use the confluent-kafka-python client to create producer and consumer applications. Here is the code for my Kafka producer:

from datetime import datetime
import os
import json
from uuid import uuid4
from confluent_kafka import SerializingProducer
from confluent_kafka.serialization import StringSerializer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.json_schema import JSONSerializer

class BotExecutionProducer(object):
    """
    Class represents the 
    Bot execution Stats
    """

    def __init__(self,ticketId,accountId,executionTime,status):

        self.ticketId = ticketId
        self.accountId = accountId
        self.executionTime = executionTime
        self.timestamp = str(datetime.now())
        self.status = status


    def botexecution_to_dict(self,botexecution,ctx):
        """
        Returns a Dict representation of the 
        KafkaBotExecution instance
        botexecution : KafkaBotExecution instance
        ctx: SerializationContext

        """
        return dict(ticketId=self.ticketId,
                    accountId=self.accountId,
                    executionTime=self.executionTime,
                    timestamp=self.timestamp,
                    status=self.status
            )

    def delivery_report(self,err, msg):
        """
        Reports the failure or success of a message delivery.
        """
        if err is not None:
            print("Delivery failed for User record {}: {}".format(msg.key(), err))
            return
        print('User record {} successfully produced to {} [{}] at offset {}'.format(
            msg.key(), msg.topic(), msg.partition(), msg.offset()))

    def send(self):
        """
        Will connect to Kafka Broker
        validate and send the message
        """
        topic = "bots.execution"
        schema_str = """
            {
            "$schema": "http://json-schema.org/draft-07/schema#",
            "title": "BotExecutions",
            "description": "BotExecution Stats",
            "type": "object",
            "properties": {
                "ticketId": {
                "description": "Ticket ID",
                "type": "string"
                },
                "accountId": {
                "description": "Customer's AccountID",
                "type": "string"             
                },
                "executionTime": {
                "description": "Bot Execution time in seconds",
                "type": "number"
                },
                "timestamp": {
                "description": "Timestamp",
                "type": "string"
                },
                "status": {
                "description": "Execution Status",
                "type": "string"
                }
            },
            "required": [ "ticketId", "accountId", "executionTime", "timestamp", "status"]
            }
            """

        schema_registry_conf = {'url': 'http://localhost:8081'}

        schema_registry_client = SchemaRegistryClient(schema_registry_conf)

        json_serializer = JSONSerializer(schema_str,schema_registry_client,self.botexecution_to_dict)

        producer_conf = {
            'bootstrap.servers': "localhost:9092",
            'key.serializer': StringSerializer('utf_8'),
            'value.serializer': json_serializer,
            'acks': 0,
        }


        producer = SerializingProducer(producer_conf)
        print(f'Producing records to topic {topic}')

        producer.poll(0.0)

        try:
            print(self)
            producer.produce(topic=topic,key=str(uuid4()),partition=1,
                             value=self,on_delivery=self.delivery_report)

        except ValueError:
            print("Invalid input, discarding record...")
Now, when I execute the code, it should create a Kafka topic and push the JSON data to it, but that does not seem to work. It keeps showing an error saying that a replication factor of 3 was specified when there is only one broker. Is there a way to define the replication factor in the code above? The Kafka broker works perfectly when I do the same thing with the Kafka CLI.

Is there anything I'm missing?

I'm not sure of the exact difference between the cp-server and cp-kafka images, but you can add a variable for the default replication factor of auto-created topics:

KAFKA_DEFAULT_REPLICATION_FACTOR: 1

If that doesn't work, import and use AdminClient.

Comments:

- You'd be better off disabling automatic topic creation and creating the topic with AdminClient.
- I have disabled automatic topic creation in the docker-compose file and created the topic with AdminClient using a replication factor of 1. The topic gets created, and when I run the producer it executes, but no messages are pushed.
- You are missing producer.flush().
- Thanks, that worked for me.
- I'm getting this error: org.apache.kafka.common.errors.InvalidReplicationFactorException: Replication factor: 3 larger than available brokers: 1. broker_1 | [2020-11-03 05:18:27,327] ERROR Error checking or creating metrics topic (io.confluent.telemetry.exporter.kafka.KafkaExporter) broker_1 | org.apache.kafka.common.errors.InvalidReplicationFactorException: Replication factor: 3 larger than available brokers: 1.
- I assume that should be the replication factor, not replicas? See CONFLUENT_METRICS_REPORTER_TOPIC_REPLICAS... If you don't need Control Center, you can also disable metrics with CONFLUENT_METRICS_ENABLE.