Thursday, September 7, 2017

Logstash Configuration for WebLogic

Probably the hardest part of configuring ELK (Elasticsearch, Logstash, Kibana) is parsing the logs and getting all the fields extracted correctly. I could not find a complete configuration for all of WebLogic's log types for ELK, so I'm sharing mine.

WebLogic has four main types of logs (server, out, access, diagnostic), each with a different format. I'm going to briefly explain the configuration of Filebeat and Logstash (for Elasticsearch and Kibana, read the Getting Started section of their documentation).
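
For reference, a server log entry looks roughly like this (a made-up example, just to show the layout; the out, access and diagnostic logs each have their own format):

  ####<7-sep-2017 10:23:45 AM CEST> <Error> <HTTP> <myhost> <AdminServer> <[ACTIVE] ExecuteThread: '0'> <<WLS Kernel>> <> <> <1504772625123> <BEA-101020> <Servlet failed with an Exception>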

[Update 14-08-2018] Added garbage collection log patterns.

I'm sharing the Filebeat configuration (as a first filter for the logs) and the Logstash configuration (to parse the fields in the logs). GitHub repo: https://github.com/carlgira/weblogic-elk
  1. Use Filebeat as a first filter before Logstash. The idea is that you can have many of these small agents running on every machine that holds your logs, feeding a Logstash instance (or Elasticsearch directly); see the Filebeat sketch after this list. My intention was to:
    • Join multiline events on the line that starts with a specific pattern (normally the date)
    • Exclude all the "Notification" or "Info" entries (I don't want those in Logstash)
    • Add a tag so Logstash can identify what kind of log it is (server, out, access, diagnostic)
    • Send no stack traces, just error codes
  2. The information is fed to a Logstash instance that identifies the type of log and, using custom "grok" patterns, extracts every field from the message (see the Logstash sketch after this list):
  • Logstash identifies the type of log and applies the matching filter
  • The grok pattern is configured to parse all the fields of every kind of event and to format each field
  • An additional pattern file handles some other log structures
  • Special care is taken to parse the log dates (some of them were in the es-ES locale, and I had some problems with time zones)
  • A different prefix is added to each kind of log so it can be easily identified in Elasticsearch/Kibana:
    • "wls" for server logs
    • "out" for out logs
    • "diag" for diagnostic logs
    • "acc" for access logs
    • "gcc" for garbage collection logs

TEST

Clone the GitHub repo https://github.com/carlgira/weblogic-elk; you can test the configuration with the example logs.
  • Download and install Filebeat, Logstash and Elasticsearch
  • Install the plugin logstash-filter-translate
    •  logstash-plugin install logstash-filter-translate
  • Start ElasticSearch
  • Start Logstash
    • logstash -f weblogic-logstash-pipeline.conf --config.reload.automatic
  • Start Filebeat
    • filebeat -c weblogic-filebeat.yml

Check the output in the Logstash console, or connect a Kibana to your Elasticsearch to see the results.

That's all.

Comments:

Anonymous said...

Hi,

Thanks for this, it's a real help.
I've modified your code to work with English and STD format logs. Once I've done testing, do you mind if I upload the files to your GitHub? I'll prefix them with EN_.

Thanks

Dave

Jeferson Negrini said...

Hey guys!
Nice work!!
@daveshaw301, have you finished your work with the English format?
Thanks a lot!

Carlos Giraldo said...

@daveshaw301 Sure, we can upload those changes to my GitHub.

Unknown said...

I am getting the below error when I start Logstash with the same configuration. Any help much appreciated.
[2018-10-25T12:05:49,587][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-10-25T12:05:49,771][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#, @filter=[\"./patterns\"], match=>{\"message\"=>\"####<%{WLS_SERVERLOG_DATE:wls_timestamp}%{SPACE}%{DATA:wls_timezone}>%{SPACE}<%{LOGLEVEL:wls_level}>%{SPACE}<%{DATA:wls_subsystem}>%{SPACE}<%{DATA:wls_host}>%{SPACE}<%{DATA:wls_server}>%{SPACE}<%{DATA:wls_thread}>%{SPACE}<([<>a-zA-Z ]*)>%{SPACE}<%{DATA:wls_transactionid}>%{SPACE}<%{DATA:wls_diagcontid}>%{SPACE}<%{DATA:wls_rawtime}>%{SPACE}<%{DATA:wls_code}>%{SPACE}<%{GREEDYDATA:wls_message}\"}, id=>\"4f5f2bd9cdf0a0a1e22d07068c0299b6e3979ed0b8c38b53ea1baf82bad2d9a3\", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>\"*\", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>[\"_grokparsefailure\"], timeout_millis=>30000, tag_on_timeout=>\"_groktimeout\">>", :error=>"pattern %{WLS_SERVERLOG_DATE:wls_timestamp} not defined", :thread=>"#"}
[2018-10-25T12:05:49,777][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#, :backtrace=>["/u01/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1292:in `loop'", "/u01/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "/u01/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'",

Unknown said...

"/u01/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "/u01/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:270:in `register'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:242:in `register_plugin'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:253:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:253:in `register_plugins'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:595:in `maybe_setup_out_plugins'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:263:in `start_workers'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:200:in `run'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:160:in `block in start'"], :thread=>"#"}
[2018-10-25T12:05:49,797][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
[2018-10-25T12:05:50,129][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}