
Understanding Logstash Parsing Configurations and Options

In this tutorial we will learn to customize Logstash to parse any type of log file. Logstash helps us process logs and other event data from a variety of systems. It also supports sending parsed events into Elasticsearch and has 200+ plugins.

A Logstash configuration is divided into three sections:

input {
  # input config options
}
filter {
  # parsing options
}
output {
  # output options
}

In the input section we configure how log data is ingested; the most popular input plugins are lumberjack, file, elasticsearch, and graphite.
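As a sketch, a minimal file input that tails a syslog file might look like this (the path is illustrative; adjust it to your system):

input {
  file {
    path => "/var/log/syslog"       # illustrative path; change for your system
    start_position => "beginning"   # read the file from the start on first run
    type => "syslog"                # tag events so filters can match on type
  }
}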

In the output section we configure what happens to the events parsed in the filter section.
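A common setup, sketched below, ships parsed events to Elasticsearch and also prints them to the console for debugging (the host address and index name are assumptions; point them at your own Elasticsearch node):

output {
  elasticsearch {
    hosts => ["localhost:9200"]         # assumed local Elasticsearch node
    index => "syslog-%{+YYYY.MM.dd}"    # one index per day
  }
  stdout {
    codec => rubydebug                  # pretty-print events for debugging
  }
}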

In the filter section we parse the events. Here is a sample Logstash filter config to parse syslog events:

filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
  }
}

Analysis of the config:

Sample syslog event: May 18 11:24:30 Jagadeesh-PC /usr/lib/gdm3/gdm-x-session[8693]: Successfully activated service 'org.gnome.Terminal'

This event will be parsed as:

syslog_timestamp ===> May 18 11:24:30
syslog_hostname  ===> Jagadeesh-PC
syslog_program   ===> /usr/lib/gdm3/gdm-x-session
syslog_pid       ===> 8693
syslog_message   ===> Successfully activated service 'org.gnome.Terminal'

SYSLOGTIMESTAMP, SYSLOGHOST, POSINT, GREEDYDATA, and DATA are all pattern matchers available in grok.

Here match is used to match the log event; you can use add_field and add_tag to attach extra information while storing, or you can use the following snippet to overwrite the whole message before storing it.

  grok {
    match => { "message" => "%{SYSLOGBASE} %{DATA:message}" }
    overwrite => [ "message" ]
  }

You can use other grok patterns such as IPORHOST, HTTPDATE, USERNAME, INT, etc., to parse Apache/Nginx log files. If a pattern is not available in grok, you can build your own custom pattern matchers.
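For example, a custom pattern can be kept in a plain file and loaded with the grok filter's patterns_dir option. The sketch below assumes a ./patterns directory containing a file that defines a POSTFIX_QUEUEID pattern (both the directory and the pattern name are illustrative):

# contents of ./patterns/extra (illustrative custom pattern file):
#   POSTFIX_QUEUEID [0-9A-F]{10,11}

filter {
  grok {
    patterns_dir => ["./patterns"]   # directory holding custom pattern files
    match => { "message" => "%{SYSLOGBASE} %{POSTFIX_QUEUEID:queue_id}: %{GREEDYDATA:syslog_message}" }
  }
}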

Many other filter plugins, such as json, csv, kv, metrics, etc., are available for parsing Logstash events.
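For instance, if an application logs key=value pairs, the kv filter can split them into fields without any grok pattern. This is a sketch; the field_split separator is an assumption and depends on how your logs are formatted:

filter {
  kv {
    source => "message"   # parse key=value pairs out of the raw message
    field_split => "&"    # assumed separator between pairs; default is whitespace
  }
}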

 

Posted on 22 January 2016 by Micropyramid
