Analyzing and Storing Logs with ELK   2019-08-20


Preface

Previously, we installed and configured rsyslog, MySQL, and LogAnalyzer.
Now we use another set of tools to collect logs and get better statistics.

Objective

Deploy the ELK stack and use it to collect statistics and analyze logs.

System environment

Transport the iptables log file to the log server.
Same environment as in this note, with the ELK stack installed on the log server.

Install ELK

Install Java OpenJDK

yum install java
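
To confirm the JDK installed correctly (Elasticsearch 7.x needs Java 8 or newer unless its bundled JDK is used):

java -version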

Download ElasticSearch and extract it

wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-7.3.0-linux-x86_64.tar.gz
tar -zxvf elasticsearch-7.3.0-linux-x86_64.tar.gz
cd elasticsearch-7.3.0
bin/elasticsearch    # start Elasticsearch in the foreground
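
Running bin/elasticsearch keeps it in the foreground; to run it as a background daemon and record its PID instead (standard flags in 7.x):

bin/elasticsearch -d -p pid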

Modify the config file config/elasticsearch.yml

cluster.name: log-elasticsearch
network.host: $SERVER_IP
http.port: 9200
discovery.seed_hosts: ["127.0.0.1", "[::1]", "$SERVER_IP"]
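
After restarting Elasticsearch with the new settings, a quick sanity check that it is reachable on the configured address (assuming $SERVER_IP is the value set above); note that on a single-node setup Elasticsearch 7 may also need cluster.initial_master_nodes or discovery.type: single-node to form a cluster:

curl http://$SERVER_IP:9200
curl "http://$SERVER_IP:9200/_cluster/health?pretty"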

Download Kibana and extract it

wget https://artifacts.elastic.co/downloads/kibana/kibana-7.3.0-linux-x86_64.tar.gz
tar -zxvf kibana-7.3.0-linux-x86_64.tar.gz
cd kibana-7.3.0-linux-x86_64/config
vim kibana.yml

Modify the config file config/kibana.yml

server.port: 5601
server.host: $SERVER_NAME
elasticsearch.hosts: ["http://$elasticsearch_SERVER_IP:9200"]
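
The note does not show starting Kibana; a minimal sketch, assuming it is launched from the extracted directory:

cd kibana-7.3.0-linux-x86_64
bin/kibana    # runs in the foreground; use nohup bin/kibana & to keep it running

Kibana should then be reachable at http://$SERVER_NAME:5601.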

Download Logstash and extract it

wget https://artifacts.elastic.co/downloads/logstash/logstash-7.3.0.tar.gz
tar -zxvf logstash-7.3.0.tar.gz
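
Before wiring it to rsyslog, a quick smoke test with an inline pipeline (the -e flag) confirms Logstash starts correctly; a typed line should be echoed back as an event:

cd logstash-7.3.0
bin/logstash -e 'input { stdin {} } output { stdout {} }'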

Transport method

Transport rsyslog log files through Logstash

Logstash config file, saved as config/syslog.conf

input {
  syslog {
    port => "514"
  }
}
output {
  elasticsearch { hosts => ["$Elasticsearch_SERVER:9200"] }
  stdout {}
}
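
On the rsyslog client side (configured in the earlier note), logs can be forwarded to this listener by adding a rule to /etc/rsyslog.conf; $LOGSTASH_SERVER_IP below is a placeholder, not a value from the original setup, and since port 514 is privileged, Logstash may need to run as root or listen on a higher port:

*.* @$LOGSTASH_SERVER_IP:514    # forward everything over UDP
#*.* @@$LOGSTASH_SERVER_IP:514  # or use @@ for TCP

Restart rsyslog afterwards (systemctl restart rsyslog) so the rule takes effect.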

Start up Logstash

bin/logstash -f config/syslog.conf
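
Once events arrive, the elasticsearch output writes to daily logstash-* indices by default; to confirm they are being created:

curl "http://$Elasticsearch_SERVER:9200/_cat/indices?v"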

Transport log files through Filebeat

Download Filebeat and extract it

wget https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.3.0-linux-x86_64.tar.gz
tar -zxvf filebeat-7.3.0-linux-x86_64.tar.gz

Modify the Filebeat config ./filebeat.yml

#===== Filebeat inputs =====
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log

#===== Kibana =====
setup.kibana:
  host: "192.168.0.250:5601"

#===== Outputs =====

#----- Elasticsearch output -----
output.elasticsearch:
  hosts: ["$elasticsearch_SERVER_IP:9200"]
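
With the config in place, Filebeat still has to be started; a minimal sketch that validates the configuration and the Elasticsearch connection first, then runs Filebeat in the foreground:

./filebeat test config
./filebeat test output
./filebeat -e -c filebeat.yml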

Create a matching index pattern in Kibana, then browse to the Discover page to see the logs on screen.

Some Errors

Insufficient space for shared memory file

Clean up the disk: check what is using the space, then stop or remove the offending processes and files.

df -h                     # check overall disk usage
du -h -x --max-depth=1    # find which directory is taking the space
ps aux                    # find the process that is writing it
kill $PID                 # stop that process (replace $PID with its process id)

Create Kibana index pattern forbidden

This typically means Elasticsearch has marked the indices read-only because the flood-stage disk watermark was exceeded. After freeing disk space, clear the read-only block:

curl -XPUT -H "Content-Type: application/json" http://localhost:9200/_all/_settings -d '{"index.blocks.read_only_allow_delete": null}'

Screenshot

Log from an rsyslog client through Logstash

Log from local files through Filebeat

