Using Logstash with Filebeat and Kafka in the ELK stack

Contents

- Configuration file structure
- Input from stdin, output to stdout
- Input from a log file: output to stdout; output to Elasticsearch
- Input over TCP: Spring Boot configuration; Logstash configuration
- Input from Filebeat: Filebeat configuration; Logstash configuration
- Input from Kafka: Filebeat configuration; Logstash configuration

Configuration file structure
The main settings file is logstash.yml. Pipelines are defined in a separate .conf file that you create yourself, containing input, filter, and output sections. The bundled logstash-sample.conf has the following structure:
input { }
filter { }
output { }

Start Logstash with:

bin/logstash -f config/logstash.conf

Available input plugins are listed at https://www.elastic.co/guide/en/logstash/current/input-plugins.html
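Before starting a pipeline, the config file can be syntax-checked first. A minimal sketch, assuming Logstash is installed in the current directory and the pipeline file is config/logstash.conf:

```shell
# Validate the pipeline configuration and exit without starting the pipeline
bin/logstash -f config/logstash.conf --config.test_and_exit
```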
Input from stdin, output to stdout

input { stdin { } }
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { }
}

Input from a log file
Output to stdout:

input {
  # read log lines from a file
  file {
    path => "/xxx/demolog/logs/myapp-info.log"
    type => "ghn"
    start_position => "beginning"
  }
}
output {
  stdout { codec => rubydebug }
}

Output to Elasticsearch:

input {
  # read log lines from a file
  file {
    path => "/xxx/demolog/log/demolog-*.log"
    type => "ghn"
    start_position => "beginning"
  }
}
output {
  # send events to Elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "xxxxxx"
    ssl => true
    cacert => "/xxx/elk/logstash-8.9.1/config/certs/http_ca.crt"
    index => "ghn-%{+YYYY.MM.dd}"
  }
  stdout { }
}

Input over TCP (with Spring Boot / Spring Cloud)
Spring Boot configuration
Official GitHub repo: https://github.com/logfellow/logstash-logback-encoder

Add the dependency to pom.xml:

<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>7.4</version>
</dependency>

Then add an appender to logback-spring.xml:

<appender name="stash" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>127.0.0.1:4560</destination>
    <!-- encoder is required -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
</appender>
<root level="info">
    <appender-ref ref="CONSOLE" />
    <appender-ref ref="stash" />
</root>

Logstash configuration
input {
  # receive events over TCP as newline-delimited JSON
  tcp {
    host => "0.0.0.0"
    mode => "server"
    port => 4560
    codec => json_lines
  }
}
output {
  # send events to Elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "xxxxxx"
    ssl => true
    cacert => "xxx/logstash-8.9.1/config/certs/http_ca.crt"
    index => "ghn-%{+YYYY.MM.dd}"
  }
  stdout { }
  # stdout { codec => rubydebug }
}

The events appear in the Logstash terminal and can then be viewed in Kibana.
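The json_lines codec in the TCP input reads one JSON object per line, which is the wire format LogstashTcpSocketAppender produces. A minimal Python sketch of that format (the field names here are illustrative, not the exact set the encoder emits):

```python
import json

def encode_event(message, level="INFO", logger="demo"):
    """Build one newline-terminated JSON object, as the json_lines codec expects."""
    event = {"message": message, "level": level, "logger_name": logger}
    return json.dumps(event) + "\n"

# Each line written to the TCP socket becomes one event in Logstash.
line = encode_event("user logged in")
```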
Input from Filebeat

Filebeat configuration
The configuration file lives at filebeat-8.9.1-darwin-aarch64/filebeat.yml. Modify the following section so the log paths point at the Spring Boot log directory:
filebeat.inputs:
- type: filestream
  enabled: true
  paths:
    - /xxx/xxx/*.log

Start Filebeat with:

./filebeat -e -c filebeat.yml -d publish

Once it reports an established connection to Logstash, startup has succeeded.
Logstash configuration
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "xxxxxx"
    ssl => true
    cacert => "/xxxx/logstash-8.9.1/config/certs/http_ca.crt"
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

Start Logstash with:

bin/logstash -f config/logstash-filebeat.conf

It now receives the log events forwarded by Filebeat.
In Kibana, the logs now appear in the data view, where each document's details can be inspected.
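The index name in the beats example is built from event fields via Logstash's sprintf syntax, which resolves references like %{[@metadata][beat]} (note the @) against the event. A toy Python model of that field substitution (date math such as %{+YYYY.MM.dd} is deliberately not handled here):

```python
import re

def sprintf(pattern, event):
    """Resolve %{[a][b]} field references against a nested event dict
    (a toy model of Logstash's sprintf syntax; %{+...} date patterns pass through)."""
    def repl(match):
        value = event
        for key in re.findall(r"\[([^\]]+)\]", match.group(0)):
            value = value[key]
        return str(value)
    return re.sub(r"%\{\[[^}]+\}", repl, pattern)
```

For a Filebeat event this yields names like filebeat-8.9.1-%{+YYYY.MM.dd}, with the date part filled in by Logstash itself.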
Input from Kafka

Official docs: https://www.elastic.co/guide/en/logstash/current/use-filebeat-modules-kafka.html
Filebeat configuration
output.kafka:
  hosts: ["localhost:9092"]
  topic: "filebeat"
  codec.json:
    pretty: false

Start Filebeat with:

./filebeat -e -c filebeat.yml -d publish

Logstash configuration
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["filebeat"]
    codec => json
  }
}
output {
  # send events to Elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "xxxxxx"
    ssl => true
    cacert => "/xxx/elk/logstash-8.9.1/config/certs/http_ca.crt"
    index => "ghn-%{+YYYY.MM.dd}"
  }
  stdout { }
  # stdout { codec => rubydebug }
}

With Kafka running, events flow through to Logstash; once Kafka is stopped, they stop arriving.
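Index names like ghn-%{+YYYY.MM.dd} roll over once per day, because Logstash formats the event's @timestamp (UTC by default) into the index name. A small Python sketch of that daily naming scheme:

```python
from datetime import datetime, timezone

def daily_index(prefix, ts=None):
    """Mimic the %{+YYYY.MM.dd} sprintf date format: one index per UTC day."""
    ts = ts or datetime.now(timezone.utc)
    return f"{prefix}-{ts.strftime('%Y.%m.%d')}"
```

Writing by day keeps indices small and lets old ones be dropped or archived wholesale.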
Kafka input plugin reference: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-kafka.html