jueves, 7 de septiembre de 2017

Logstash Configuration for Weblogic

Probably the hardest part of configuring ELK (Elasticsearch, Logstash, Kibana) is parsing the logs and getting all the fields extracted correctly. I could not find a complete ELK configuration for all the WebLogic log types, so I'm sharing mine.

WebLogic has four main types of logs (server, out, access, diagnostic), each one with a different format. I'm going to explain briefly the configuration of Filebeat and Logstash (for Elasticsearch and Kibana, read the Getting Started guides in their documentation).

[update: 14-08-2018] Added garbage collection log patterns.

I'm sharing the Filebeat configuration (as a first filter of the logs) and the Logstash configuration (to parse the fields in the logs). GitHub repo: https://github.com/carlgira/weblogic-elk
  1. Use Filebeat as a first filter before Logstash. The idea is that you can have lots of these small agents running on every machine where you have your logs, and later feed them to a Logstash instance (or to Elasticsearch directly); see the Filebeat sketch after this list. My intention was to include:
    • A multiline pattern so that every event starts with one specific line (normally the date)
    • An exclusion of all the "Notification" or "Info" logs (don't want those in Logstash)
    • A tag so Logstash can identify what kind of log it is (server, out, access, diagnostic)
    • No stack traces, just error codes
  2. The information is fed to a Logstash instance that identifies the type of log and, using custom "grok" patterns, extracts all the fields from the message; see the Logstash sketch after this list.
    • Logstash identifies the type of log and applies the corresponding filter.
    • The grok pattern is configured to parse all the fields of every kind of event and to format every field.
    • An additional pattern file controls some other log structures.
    • Special care is taken to format log dates (some of them were in the es-ES locale and gave me some problems with timezones).
    • A different prefix is added to every kind of log so it can be easily identified in Elasticsearch/Kibana:
      • "wls" for server logs
      • "out" for out logs
      • "diag" for diagnostic logs
      • "acc" for access logs
      • "gcc" for garbage collection logs
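As a reference, this is a minimal sketch of the Filebeat side of that idea, covering only the server log. The log path, the multiline pattern, the excluded severities and the Logstash host are illustrative assumptions; the complete configuration is in weblogic-filebeat.yml in the repo.

filebeat.prospectors:
  # WebLogic server log: every event starts with "####<date>", so any line
  # that does not start with #### is appended to the previous event
  - input_type: log
    paths:
      - /u01/oracle/domains/mydomain/servers/*/logs/*.log   # assumed path, adjust to your domain
    multiline.pattern: '^####'
    multiline.negate: true
    multiline.match: after
    exclude_lines: ['<Notice>', '<Info>']   # drop informational events (illustrative severities)
    tags: ["wls"]                           # lets Logstash pick the right filter
  # similar prospectors are defined for the out, access, diagnostic and gc logs

output.logstash:
  hosts: ["localhost:5044"]                 # assumed local Logstash instance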
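And a simplified sketch of the matching Logstash pipeline, showing only the server-log branch. The grok expression, field names, date formats, patterns directory and timezone are illustrative; the full pipeline and the extra pattern file are in the repo (weblogic-logstash-pipeline.conf).

input {
  beats {
    port => 5044
  }
}

filter {
  if "wls" in [tags] {
    # parse the ####<...> <...> ... structure of the server log, prefixing
    # every field with "wls" so it is easy to spot in Elasticsearch/Kibana
    grok {
      patterns_dir => ["./patterns"]    # additional pattern file for other log structures
      match => { "message" => "####<%{DATA:wls_timestamp}> <%{DATA:wls_severity}> <%{DATA:wls_subsystem}> <%{DATA:wls_machine}> <%{DATA:wls_server}> <%{DATA:wls_thread}> <%{DATA:wls_user}> <%{DATA:wls_transaction}> <%{DATA:wls_context}> <%{DATA:wls_raw_time}> <%{DATA:wls_code}> <%{GREEDYDATA:wls_message}>" }
    }
    # server log dates may be written in the es-ES locale, so the locale
    # and timezone are set explicitly to avoid shifted timestamps
    date {
      match => ["wls_timestamp", "MMM d, yyyy h:mm:ss a z", "MMM dd, yyyy h:mm:ss a z"]
      locale => "es-ES"
      timezone => "Europe/Madrid"       # assumed timezone
    }
  }
  # similar branches handle the out, access, diagnostic and gc logs
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout { codec => rubydebug }         # console output to check the parsed events
}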

TEST

Clone the GitHub repo https://github.com/carlgira/weblogic-elk; you can test the configuration with the example logs.
  • Download and install Filebeat, Logstash and Elasticsearch
  • Install the plugin logstash-filter-translate
    •  logstash-plugin install logstash-filter-translate
  • Start Elasticsearch
  • Start Logstash
    • logstash -f weblogic-logstash-pipeline.conf --config.reload.automatic
  • Start FileBeat
    • filebeat -c weblogic-filebeat.yml

Check the output in the Logstash console, or connect Kibana to your Elasticsearch to see the results.

That's all.