
Connecting a Kafka topic to Elasticsearch with fast-data-dev in Docker


I am using fast-data-dev together with elasticsearch:latest and kibana:latest to send data from Kafka to Elasticsearch, but I get the following error:

org.apache.kafka.connect.errors.ConnectException: Couldn't start ElasticsearchSinkTask due to connection error:
    at io.confluent.connect.elasticsearch.jest.JestElasticsearchClient.<init>(JestElasticsearchClient.java:160)
    at io.confluent.connect.elasticsearch.jest.JestElasticsearchClient.<init>(JestElasticsearchClient.java:144)
    at io.confluent.connect.elasticsearch.ElasticsearchSinkTask.start(ElasticsearchSinkTask.java:74)
    at io.confluent.connect.elasticsearch.ElasticsearchSinkTask.start(ElasticsearchSinkTask.java:48)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:304)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:195)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:184)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:234)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: io.searchbox.client.config.exception.CouldNotConnectException: Could not connect to http://127.0.0.1:9200
    at io.searchbox.client.http.JestHttpClient.execute(JestHttpClient.java:73)
    at io.searchbox.client.http.JestHttpClient.execute(JestHttpClient.java:63)
    at io.confluent.connect.elasticsearch.jest.JestElasticsearchClient.getServerVersion(JestElasticsearchClient.java:274)
    at io.confluent.connect.elasticsearch.jest.JestElasticsearchClient.<init>(JestElasticsearchClient.java:153)
    ... 12 more 
Caused by: org.apache.http.conn.HttpHostConnectException: Connect to 127.0.0.1:9200 [/127.0.0.1] failed: Connection refused (Connection refused)
    at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:159)
    at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:359)
    at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:381)
    at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
    at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
    at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
    at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:111)
    at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
    at io.searchbox.client.http.JestHttpClient.executeRequest(JestHttpClient.java:136)
    at io.searchbox.client.http.JestHttpClient.execute(JestHttpClient.java:70)
    ... 15 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:607)
    at org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:75)
    at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
    ... 26 more
Here is my sink connector configuration for Elasticsearch:

connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
type.name=kafka-connect
tasks.max=1
topics=elastic
topic.index.map="elastic:index1"
topic.key.ignore=true
value.converter=org.apache.kafka.connect.json.JsonConverter
connection.url=http://127.0.0.1:9200
key.converter=org.apache.kafka.connect.json.JsonConverter
topic.schema.ignore=true
I want to put all the containers on the same Docker network so that I don't have to deal with connection issues, but it seems I'm missing something: the task is not running.
Any suggestions would be helpful.
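For reference, a minimal docker-compose sketch of the kind of setup described above. The service names, image tags, and ports here are assumptions (the question only says fast-data-dev plus elasticsearch:latest and kibana:latest); what matters is that compose puts all three services on one default network, where each service is reachable by its service name:

```yaml
version: "3"
services:
  fast-data-dev:
    image: lensesio/fast-data-dev   # formerly landoop/fast-data-dev
    ports:
      - "3030:3030"                 # fast-data-dev web UI
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.8.0
    environment:
      - discovery.type=single-node  # no clustering for a dev setup
    ports:
      - "9200:9200"
  kibana:
    image: docker.elastic.co/kibana/kibana:6.8.0
    ports:
      - "5601:5601"
```

With this layout, the Connect worker inside the fast-data-dev container can reach Elasticsearch at http://elasticsearch:9200, but not at http://127.0.0.1:9200, because localhost inside a container refers to that container itself.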

Once the containers share a network, Elasticsearch is no longer reachable at localhost from inside the Connect worker. You need to use the service name in
connection.url
. Can you try
connection.url=http://elasticsearch:9200
or perhaps without the
http://
prefix?
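Concretely, assuming the compose service is named elasticsearch, the only line that needs to change in the posted sink configuration is connection.url:

```properties
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
type.name=kafka-connect
tasks.max=1
topics=elastic
topic.index.map="elastic:index1"
topic.key.ignore=true
value.converter=org.apache.kafka.connect.json.JsonConverter
# Use the compose service name, not 127.0.0.1 -- localhost inside the
# Connect container refers to the container itself.
connection.url=http://elasticsearch:9200
key.converter=org.apache.kafka.connect.json.JsonConverter
topic.schema.ignore=true
```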

Thanks @Mustafa Guler for the answer, it worked. Please accept the answer and close the question.
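If the connector was created through the Kafka Connect REST API rather than the fast-data-dev UI, the fix can also be applied by PUTting the corrected config to the worker. A sketch of building that payload (the connector name and the worker URL in the comment are assumptions, not from the question):

```python
import json

# Hypothetical connector name; list your actual connectors with
# GET http://<connect-host>:8083/connectors to find the right one.
CONNECTOR = "elasticsearch-sink"

# Same settings as in the question, with connection.url pointing at the
# compose service name instead of 127.0.0.1.
config = {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "type.name": "kafka-connect",
    "tasks.max": "1",
    "topics": "elastic",
    "topic.index.map": "elastic:index1",
    "topic.key.ignore": "true",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "connection.url": "http://elasticsearch:9200",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "topic.schema.ignore": "true",
}

payload = json.dumps(config, indent=2)
print(payload)
# To apply it, send the payload to the Connect REST API, e.g.:
#   curl -X PUT -H "Content-Type: application/json" \
#        --data @config.json \
#        http://localhost:8083/connectors/elasticsearch-sink/config
```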