How to import CSV or JSON data into Elasticsearch with deviantony/docker-elk

I just started learning Elasticsearch and Docker a few days ago, and I am running into problems importing data into Elasticsearch. I am using the deviantony/docker-elk stack and tried to follow a tutorial I found online, but when I load Kibana I cannot find any index.

Here is what I did. I downloaded some sample data and stored it in a folder named data under the repository root. In the docker-compose.yml file, I created a bind mount pointing to that external data folder:
    elasticsearch:
      build:
        context: elasticsearch/
        args:
          ELK_VERSION: $ELK_VERSION
      volumes:
        - type: bind
          source: ./elasticsearch/config/elasticsearch.yml
          target: /usr/share/elasticsearch/config/elasticsearch.yml
          read_only: true
        - type: bind
          source: ./data
          target: /usr/share/elasticsearch/data
      ports:
        - "9200:9200"
        - "9300:9300"
      environment:
        ES_JAVA_OPTS: "-Xmx256m -Xms256m"
        ELASTIC_PASSWORD: password
        # Use single node discovery in order to disable production mode and avoid bootstrap checks
        # see https://www.elastic.co/guide/en/elasticsearch/reference/current/bootstrap-checks.html
        discovery.type: single-node
      networks:
        - elk
In my logstash.conf file, this is what I changed:
    input {
      tcp {
        port => 5000
      }
      file {
        path => "/usr/share/elasticsearch/data/conn250K.csv"
        start_position => "beginning"
      }
    }

    filter {
      csv {
        columns => [ "record_id", "duration", "src_bytes", "dest_bytes" ]
      }
    }

    output {
      elasticsearch {
        hosts => "elasticsearch:9200"
        user => "elastic"
        password => "password"
        index => "network"
      }
    }
After running "docker-compose up" in the terminal, I cannot find any index pattern to create in Kibana, because no index was generated. I cannot figure out what is going wrong.

Try binding the external data folder to the Logstash container instead of to Elasticsearch:
    logstash:
      build:
        context: logstash/
        args:
          ELK_VERSION: $ELK_VERSION
      volumes:
        - type: bind
          source: ./logstash/config/logstash.yml
          target: /usr/share/logstash/config/logstash.yml
          read_only: true
        - type: bind
          source: ./logstash/pipeline
          target: /usr/share/logstash/pipeline
          read_only: true
        - type: bind
          source: ./data
          target: /usr/share/logstash/data
          read_only: true
      ports:
        - "5000:5000/tcp"
        - "5000:5000/udp"
        - "9600:9600"
      environment:
        LS_JAVA_OPTS: "-Xmx256m -Xms256m"
      networks:
        - elk
      depends_on:
        - elasticsearch
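With the bind mount above, the CSV now lives under /usr/share/logstash/data inside the Logstash container, so the file input path in logstash.conf must change to match. A sketch of the adjusted input block (sincedb_path => "/dev/null" is an addition: it makes Logstash re-read the file from the beginning on every restart, which is convenient while testing):

    input {
      file {
        path => "/usr/share/logstash/data/conn250K.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
      }
    }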
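As an alternative to Logstash, the same CSV can be pushed through Elasticsearch's _bulk API. The helper below is a minimal sketch, not part of the original setup: it only builds the bulk NDJSON payload from the column names used in the Logstash filter (assuming the CSV has no header row), and leaves the HTTP POST itself to something like `curl -u elastic:password --data-binary @bulk.ndjson -H 'Content-Type: application/x-ndjson' http://localhost:9200/_bulk` so the snippet runs standalone:

```python
import csv
import io
import json

def csv_to_bulk_ndjson(csv_text, index_name="network"):
    """Convert headerless CSV rows into an Elasticsearch _bulk NDJSON body."""
    # Column names taken from the csv filter in logstash.conf above.
    columns = ["record_id", "duration", "src_bytes", "dest_bytes"]
    lines = []
    for row in csv.reader(io.StringIO(csv_text)):
        doc = dict(zip(columns, row))
        # Each document is preceded by an action line naming the target index.
        lines.append(json.dumps({"index": {"_index": index_name}}))
        lines.append(json.dumps(doc))
    # The _bulk API requires the body to end with a newline.
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    sample = "1,3,120,240\n2,5,80,160\n"
    print(csv_to_bulk_ndjson(sample))
```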