ELK Stack for Parsing Your Logs - Part 2

In the previous tutorial we set up an ELK server that can ingest your syslog files. In this post, you will learn how to ship your log files to the ELK server and display them in Kibana with interactive graphs.

Setting Up the Log Shipper:

To ship your logs we will use Elastic's Logstash Forwarder. Switch to the VPS or PC from which the logs have to be shipped.

echo 'deb http://packages.elasticsearch.org/logstashforwarder/debian stable main' | sudo tee /etc/apt/sources.list.d/logstashforwarder.list
wget -O - http://packages.elasticsearch.org/GPG-KEY-elasticsearch | sudo apt-key add -
sudo apt-get update
sudo apt-get install logstash-forwarder

Configure Logstash Forwarder:
Now copy the Logstash server's SSL certificate (which you generated in the previous tutorial) into the appropriate location (/etc/pki/tls/certs):

sudo mkdir -p /etc/pki/tls/certs
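With the directory in place, the certificate can be copied over from the Logstash server, for example with scp (a sketch; the user name and host below are placeholders for your own setup):

```shell
# Pull the certificate from the Logstash server, then move it into place.
scp user@logstash_server_private_IP:/etc/pki/tls/certs/logstash-forwarder.crt /tmp/
sudo mv /tmp/logstash-forwarder.crt /etc/pki/tls/certs/
```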

Copy the certificate generated on the server to the Logstash shipping client. On the client, create and edit the Logstash Forwarder configuration file, which is in JSON format:

sudo nano /etc/logstash-forwarder.conf

Under the network section, add the following lines to the file, substituting in your Logstash Server's private IP address for logstash_server_private_IP:

    "servers": [ "logstash_server_private_IP:5000" ],
    "timeout": 15,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"

Under the files section (between the square brackets), add the following lines:

    {
      "paths": [
        "/var/log/syslog",
        "/var/log/auth.log"
      ],
      "fields": { "type": "syslog" }
    }
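Putting both sections together, the complete configuration file looks like the sketch below (the server address is a placeholder). Since the file is JSON, a malformed config is the most common reason the forwarder fails to start, so it is worth validating before copying it into place; here the sketch writes a copy to /tmp and checks it with Python's json.tool:

```shell
# Assemble the full logstash-forwarder config; the server address is a
# placeholder for your Logstash server's private IP.
cat > /tmp/logstash-forwarder.conf <<'EOF'
{
  "network": {
    "servers": [ "logstash_server_private_IP:5000" ],
    "timeout": 15,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": [ "/var/log/syslog", "/var/log/auth.log" ],
      "fields": { "type": "syslog" }
    }
  ]
}
EOF
# Sanity-check the JSON before moving it to /etc/logstash-forwarder.conf.
python3 -m json.tool /tmp/logstash-forwarder.conf > /dev/null && echo "config is valid JSON"
```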

Save the file and restart logstash-forwarder; it will start shipping your log files to the Logstash server.
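The restart and a quick health check can be sketched as follows (the log file path is an assumption based on the Debian package layout; on some installs the forwarder logs to syslog instead):

```shell
sudo service logstash-forwarder restart
# Check for an established connection to the Logstash server on port 5000.
netstat -tn | grep ':5000'
# The forwarder's own log shows certificate or network errors first.
sudo tail /var/log/logstash-forwarder/logstash-forwarder.log
```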

Installing Kibana:

Now switch back to the server where you installed the ELK stack and download Kibana 4 to your home directory with the following command:

cd ~; wget https://download.elasticsearch.org/kibana/kibana/kibana-4.0.1-linux-x64.tar.gz

Extract the Kibana archive with tar:

tar xvf kibana-*.tar.gz

Open the Kibana configuration file for editing:

nano ~/kibana-4*/config/kibana.yml

In the Kibana configuration file, find the line that specifies host, and replace the IP address ("0.0.0.0" by default) with "localhost":
Save and exit. 

This setting makes Kibana accessible only from the localhost. This is fine because we will use an Nginx reverse proxy to allow external access.

Let's copy the Kibana files to a more appropriate location. Create the /opt/kibana directory with the following command:

sudo mkdir -p /opt/kibana

Now copy the Kibana files into your newly created directory:

sudo cp -R ~/kibana-4*/* /opt/kibana/

Install Supervisor with the command below:

sudo apt-get install supervisor

Open /etc/supervisor/supervisord.conf for editing (for example, with sudo nano /etc/supervisor/supervisord.conf) and add these lines:

[program:kibana]
command=/opt/kibana/bin/kibana
stdout_logfile=/var/log/supervisor/kibana.log
autostart=true
autorestart=true

Kibana is now set up to run under Supervisor. Next, configure Nginx as a reverse proxy by adding a server block like the following:
server {
    listen 80;
    server_name log.fcawitech.com;

    location / {
        proxy_pass http://localhost:5601;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

    }
}
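Assuming the server block above is saved as a site file (the path /etc/nginx/sites-available/kibana is an illustrative choice, not a requirement), enable it and test the configuration syntax before restarting:

```shell
# Enable the site and check Nginx's config syntax before reloading.
sudo ln -s /etc/nginx/sites-available/kibana /etc/nginx/sites-enabled/kibana
sudo nginx -t
```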

Restart Nginx with the command:

sudo service nginx restart

Visualizations in Kibana:

The Kibana interface is divided into four main sections:

  1. Discover
  2. Visualize
  3. Dashboard
  4. Settings

First, navigate to Kibana's Settings section.

In Settings, go to the Indices section. There you need to define an index pattern (for example, logstash-*), pick the time field to use, and save it.

After saving, you can go to Discover and navigate through your log lines, filtering them with queries, or switch to the Visualize menu to generate real-time graphs.
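If Discover shows no documents, you can confirm that logs are actually reaching Elasticsearch by querying the Logstash indices directly (a sketch, assuming the Elasticsearch instance from the previous tutorial is listening on localhost:9200):

```shell
# List the logstash-* indices and fetch one syslog document.
curl -s 'http://localhost:9200/_cat/indices?v'
curl -s 'http://localhost:9200/logstash-*/_search?q=type:syslog&size=1&pretty'
```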

Posted On 07 April 2015 By MicroPyramid

