Today we start setting up Logstash, the client-side log collection tool. Combined with the hands-on work from the previous post, this completes the ELK log collection system. Here I mainly collect Tomcat logs, so Logstash is installed on the same server as the Tomcat service.

1. Download

I'm using version 7.4.0; the download URL is:
https://artifacts.elastic.co/downloads/logstash/logstash-7.4.0.rpm

2. Install

Command:

rpm -ivh logstash-7.4.0.rpm
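
As a quick sanity check (my own addition, not part of the original steps), you can confirm the package installed and see where it put its files:

# Confirm the installed version
rpm -q logstash
# Binaries live under /usr/share/logstash, configuration under /etc/logstash
rpm -ql logstash | head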

3. Edit the configuration file

# Create the log and data directories
mkdir -p /data/logs/logstash
mkdir -p /data/logstash_data
# Be sure to give the logstash user ownership of both directories:
chown -R logstash:logstash /data/logs/logstash/
chown -R logstash:logstash /data/logstash_data/

# Edit the following settings to match your own paths
# (the chown commands above must have been run, otherwise Logstash cannot write to these directories)
[root@VMTest soft]# vi /etc/logstash/logstash.yml
path.data: /data/logstash_data
path.logs: /data/logs/logstash
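
A quick way to confirm the ownership change took effect (again my own sanity check, not from the original post):

# Both directories should now be owned by logstash:logstash
ls -ld /data/logstash_data /data/logs/logstash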

4. Add the log files to watch

# Pipeline config directory
/etc/logstash/conf.d
# Add one config file per application, for example:
-rw-r--r--. 1 root root 488 Aug 24 18:10 logstash-admin.conf
-rw-r--r--. 1 root root 492 Aug 24 18:12 logstash-eureka.conf

The key fields in each config file are:
path: the log file path to watch
start_position: where to start reading the file (e.g. beginning)
type: when there are several config files, this tells the logs apart; the type is what routes each application's logs into its own ES index (see the example config sketched below)

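The contents of those conf files aren't shown above, but based on the fields just described and the log paths that appear in the startup output below, logstash-eureka.conf would look roughly like the following sketch. The index name and the output conditional are my assumptions; adjust them to your own setup. logstash-admin.conf is the same idea with path => "/zlb/pf/8083-admin/admin.log" and type => "admin".

input {
  file {
    # Tomcat application log to watch (this path shows up in the startup log below)
    path => "/zlb/pf/8080-eureka/eureka.log"
    # Read the file from the beginning the first time it is discovered
    start_position => "beginning"
    # Label the events so they can be routed to their own ES index
    type => "eureka"
  }
}

output {
  # Route only this application's events (useful when conf.d holds several files)
  if [type] == "eureka" {
    elasticsearch {
      # The ES nodes from the cluster built in the previous post
      hosts => ["172.10.20.35:9201", "172.10.20.56:9201", "172.10.20.59:9201"]
      # One index per application, rolled daily (the index name is an assumption)
      index => "eureka-log-%{+YYYY.MM.dd}"
    }
  }
}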

5. Start the service

The correct way to start it:

[root@node1 logstash]# pwd
/usr/share/logstash
# -f /etc/logstash/conf.d/ means: load every config file under conf.d
bin/logstash -f /etc/logstash/conf.d/
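
If you only want to validate the pipeline syntax first, Logstash can parse the config files and exit without processing any events:

# Check the config files under conf.d and exit (nothing is started)
bin/logstash -f /etc/logstash/conf.d/ --config.test_and_exit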

Log output from a successful start:

[root@node1 logstash]# bin/logstash -f /etc/logstash/conf.d/
Thread.exclusive is deprecated, use Thread::Mutex
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2021-08-24 17:44:23.957 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2021-08-24 17:44:23.987 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"7.4.0"}
[INFO ] 2021-08-24 17:44:28.911 [Converge PipelineAction::Create<main>] Reflections - Reflections took 66 ms to scan 1 urls, producing 20 keys and 40 values
[INFO ] 2021-08-24 17:44:31.370 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://172.10.20.35:9201/, http://172.10.20.56:9201/, http://172.10.20.59:9201/]}}
[WARN ] 2021-08-24 17:44:31.817 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://172.10.20.35:9201/"}
[INFO ] 2021-08-24 17:44:32.148 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>7}
[WARN ] 2021-08-24 17:44:32.163 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[WARN ] 2021-08-24 17:44:32.709 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://172.10.20.56:9201/"}
[WARN ] 2021-08-24 17:44:32.936 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://172.10.20.59:9201/"}
[INFO ] 2021-08-24 17:44:33.052 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//172.10.20.35:9201", "//172.10.20.56:9201", "//172.10.20.59:9201"]}
[INFO ] 2021-08-24 17:44:33.157 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://172.10.20.35:9201/, http://172.10.20.56:9201/, http://172.10.20.59:9201/]}}
[WARN ] 2021-08-24 17:44:33.194 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://172.10.20.35:9201/"}
[INFO ] 2021-08-24 17:44:33.236 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>7}
[WARN ] 2021-08-24 17:44:33.246 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[INFO ] 2021-08-24 17:44:33.320 [Ruby-0-Thread-5: :1] elasticsearch - Using default mapping template
[WARN ] 2021-08-24 17:44:33.393 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://172.10.20.56:9201/"}
[INFO ] 2021-08-24 17:44:33.430 [Ruby-0-Thread-5: :1] elasticsearch - Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[WARN ] 2021-08-24 17:44:33.463 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://172.10.20.59:9201/"}
[INFO ] 2021-08-24 17:44:33.479 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//172.10.20.35:9201", "//172.10.20.56:9201", "//172.10.20.59:9201"]}
[INFO ] 2021-08-24 17:44:33.511 [Ruby-0-Thread-7: :1] elasticsearch - Using default mapping template
[WARN ] 2021-08-24 17:44:33.687 [[main]-pipeline-manager] LazyDelegatingGauge - A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[INFO ] 2021-08-24 17:44:33.709 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x477f34a3 run>"}
[INFO ] 2021-08-24 17:44:34.482 [[main]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_3f53c4419c0da8df2bf7fae30007a5ce", :path=>["/zlb/pf/8080-eureka/eureka.log"]}
[INFO ] 2021-08-24 17:44:34.568 [[main]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_073e1c290c495b18ebbcb26f522ecfe1", :path=>["/zlb/pf/8083-admin/admin.log"]}
[INFO ] 2021-08-24 17:44:34.606 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[INFO ] 2021-08-24 17:44:34.727 [[main]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2021-08-24 17:44:34.728 [[main]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2021-08-24 17:44:34.753 [Ruby-0-Thread-7: :1] elasticsearch - Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[INFO ] 2021-08-24 17:44:34.862 [Agent thread] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2021-08-24 17:44:35.935 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
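
The last line shows the Logstash monitoring API listening on port 9600. As an optional health check (not part of the original steps), you can query it from the same host:

# Basic node and pipeline info from the Logstash API
curl -XGET 'http://localhost:9600/?pretty'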

6. Results

Now let's look at the results in Kibana.
First, create an index pattern; you can see that the ES index has already been created automatically.
The Tomcat logs have already been written in.
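
If you prefer the command line to Kibana for this check, the same information is available from any ES node via the _cat API (the host below is one of the nodes from the startup log):

# List the indices; the ones created by Logstash should appear here
curl 'http://172.10.20.35:9201/_cat/indices?v'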

7. Run Logstash as a systemd service

In the steps above, Logstash was launched directly with the logstash command. You can instead run it as a service by adjusting the logstash.service unit file:

[root@192.168.118.14 ~]#vim /etc/systemd/system/logstash.service
…
ExecStart=/usr/share/logstash/bin/logstash "--path.settings" "/etc/logstash" "-f" "/etc/logstash/conf.d"
…

Start the service:

[root@192.168.118.14 ~]#systemctl daemon-reload
[root@192.168.118.14 ~]#systemctl start logstash
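
To make the service start on boot and to confirm it came up cleanly, the usual systemctl follow-ups apply:

[root@192.168.118.14 ~]#systemctl enable logstash
[root@192.168.118.14 ~]#systemctl status logstash
# Follow the service log if anything looks wrong
[root@192.168.118.14 ~]#journalctl -u logstash -f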

Reference: https://www.cnblogs.com/hukey/p/11586254.html