
Apache Kafka: sending from Kafka to an Azure Event Hub


I have set up Kafka on my machine and I am trying to set up MirrorMaker to consume from a local topic and mirror it to an Azure Event Hub, but so far I have not been able to, and I get the following error:

ERROR Error when sending message to topic dev-eh-kafka-test with key: null, value: 5 bytes with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.
After a while I realized it had to be the producer side, so I tried pointing the kafka-console-producer tool directly at the Event Hub, and got the same error.

Here is my producer configuration file:

bootstrap.servers=dev-we-eh-feed.servicebus.windows.net:9093
compression.type=none
max.block.ms=0
# for event hub
sasl.mechanism=PLAIN
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://dev-we-eh-feed.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=*****";
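The sasl.jaas.config line is easy to get wrong: the username must stay the literal $ConnectionString, the password must be the full connection string in double quotes, and the line must end with ";. As a sketch (the SharedAccessKey below is a placeholder), the line can be generated like this:

```shell
# Sketch: build the sasl.jaas.config line from an Event Hubs connection
# string. REPLACE_ME is a placeholder key; "$ConnectionString" is kept
# as a literal, which is the username the Event Hubs Kafka endpoint expects.
CONN_STR='Endpoint=sb://dev-we-eh-feed.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=REPLACE_ME'
printf 'sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="%s";\n' "$CONN_STR"
```

The single-quoted format string keeps $ConnectionString from being expanded by the shell, so it lands in the file verbatim.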
Here is the command used to spin up the producer:

kafka-console-producer.bat --broker-list dev-we-eh-feed.servicebus.windows.net:9093 --topic dev-eh-kafka-test
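Note that as written, that command never loads the properties file above, so the producer attempts a plaintext connection to a TLS endpoint. The console producer accepts a --producer.config flag pointing at a properties file; a sketch, assuming the config above is saved as producer.properties:

```shell
# Pass the SASL_SSL settings to the console producer explicitly;
# producer.properties is assumed to be the config file shown above.
kafka-console-producer.bat --broker-list dev-we-eh-feed.servicebus.windows.net:9093 --topic dev-eh-kafka-test --producer.config producer.properties
```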
My Event Hubs namespace has an event hub named dev-eh-kafka-test.

Has anyone managed to do this? The end goal is to use certificates for SSL, but first I need to be able to connect at all.

I tried with Apache Kafka 1.1.1 and with Confluent Kafka 4.1.3 (since that is the version the client is using).

== Update 1

Someone pointed me to how to get more logs, and this seems to be the detailed version of the error:

[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Initialize connection to node dev-we-eh-feed.servicebus.windows.net:9093 (id: -1 rack: null) for sending metadata request (org.apache.kafka.clients.NetworkClient)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Initiating connection to node dev-we-eh-feed.servicebus.windows.net:9093 (id: -1 rack: null) (org.apache.kafka.clients.NetworkClient)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Created socket with SO_RCVBUF = 32768, SO_SNDBUF = 102400, SO_TIMEOUT = 0 to node -1 (org.apache.kafka.common.network.Selector)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Completed connection to node -1. Fetching API versions. (org.apache.kafka.clients.NetworkClient)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Initiating API versions fetch from node -1. (org.apache.kafka.clients.NetworkClient)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Connection with dev-we-eh-feed.servicebus.windows.net/51.144.238.23 disconnected (org.apache.kafka.common.network.Selector)
java.io.EOFException
        at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:124)
        at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:93)
        at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:235)
        at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:196)
        at org.apache.kafka.common.network.Selector.attemptRead(Selector.java:559)
        at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:495)
        at org.apache.kafka.common.network.Selector.poll(Selector.java:424)
        at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:460)
        at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:239)
        at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:163)
        at java.base/java.lang.Thread.run(Thread.java:830)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient)
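The immediate EOFException right after the connection is established is typical of a client speaking plaintext to a TLS-only listener (the server drops the connection during what it expects to be a handshake). One way to sanity-check the TLS side independently of Kafka is a quick openssl probe:

```shell
# Probe the Event Hubs Kafka endpoint's TLS listener directly; a
# successful handshake rules out network/port problems and leaves the
# Kafka client config (SASL_SSL, JAAS) as the main suspect.
openssl s_client -connect dev-we-eh-feed.servicebus.windows.net:9093 </dev/null
```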

Below is the configuration that works (it seems I was missing client.id).

Also, it seems you cannot choose the destination topic: it has to have the same name as the source.

bootstrap.servers=dev-we-eh-feed.servicebus.windows.net:9093
client.id=mirror_maker_producer
request.timeout.ms=60000
sasl.mechanism=PLAIN
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://dev-we-eh-feed.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=******";
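For completeness, a sketch of how classic MirrorMaker (the one-way tool shipped with Kafka) would be started with this producer config; the file names, and a consumer.properties pointing at the local cluster, are assumptions:

```shell
# Sketch: run classic MirrorMaker from the local cluster to Event Hubs.
# consumer.properties (local Kafka) and producer.properties (the file
# above) are assumed names; --whitelist takes a topic name or regex.
kafka-mirror-maker.bat --consumer.config consumer.properties --producer.config producer.properties --whitelist "dev-eh-kafka-test"
```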

By the way, why not MirrorMaker 2? Can you run nc -vz dev-we-eh-feed.servicebus.windows.net 9093? Are you sure the port is open and the address is correct?

@cricket_007 this is mandated by the client, there is no option here. I am sure it is reachable, because I can telnet to it.

If you use MirrorMaker 2, you can modify the destination topic.

Thanks for the info. Unfortunately I am stuck with this one :(

Why? You should be able to download a newer Kafka version and its scripts separately, without upgrading your Kafka cluster.

@FEST did you manage to send data from a local Kafka to an Azure Event Hub, and do you have some documentation to follow?

@OneCricketer is there a Kafka sink connector for Azure Event Hubs? I need to send data from Kafka to Azure Event Hubs.
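Since the comments suggest MirrorMaker 2 (which, unlike classic MirrorMaker, lets you rename destination topics via its replication policy), a minimal mm2.properties sketch for mirroring to Event Hubs might look like the following; the cluster aliases and the file name are assumptions:

```shell
# mm2.properties (sketch, assumed names) -- run with:
#   connect-mirror-maker.bat mm2.properties
clusters = local, eventhub
local.bootstrap.servers = localhost:9092
eventhub.bootstrap.servers = dev-we-eh-feed.servicebus.windows.net:9093
local->eventhub.enabled = true
local->eventhub.topics = dev-eh-kafka-test

# Event Hubs requires SASL_SSL on the target side
eventhub.security.protocol = SASL_SSL
eventhub.sasl.mechanism = PLAIN
eventhub.sasl.jaas.config = org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="<connection string>";
```

Note that by default MirrorMaker 2 prefixes mirrored topic names with the source cluster alias (here, local.dev-eh-kafka-test), so the replication policy may need adjusting to keep the original name.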