ELK Stack for Parsing your Logs - Part 2


In the previous tutorial, we looked at setting up an ELK server that can ingest your syslog files. In this post, you will learn how to ship your log files to the ELK server and how to display them in Kibana with interactive graphs.

Setting Up Log Shipper:

To ship your logs, we will use Elastic's logstash-forwarder. Switch to the VPS or PC from which the logs have to be shipped.

echo 'deb http://packages.elasticsearch.org/logstashforwarder/debian stable main' | sudo tee /etc/apt/sources.list.d/logstashforwarder.list
wget -O - http://packages.elasticsearch.org/GPG-KEY-elasticsearch | sudo apt-key add -
sudo apt-get update
sudo apt-get install logstash-forwarder

Configure Logstash Forwarder:
Now copy the Logstash server's SSL certificate (which you generated in the previous tutorial) into the appropriate location (/etc/pki/tls/certs):

sudo mkdir -p /etc/pki/tls/certs
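One way to get the certificate from the Logstash server onto this client is scp. This is only a sketch: the user name and the certificate's location on the server are assumptions based on the previous tutorial's defaults, so adjust them to your setup.

```sh
# Fetch the certificate from the Logstash server (user and path are assumptions)
scp user@logstash_server_private_IP:/etc/pki/tls/certs/logstash-forwarder.crt /tmp/

# Move it into the location the forwarder config expects
sudo cp /tmp/logstash-forwarder.crt /etc/pki/tls/certs/
```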

Copy the certificate generated on the server to the Logstash shipping client. On the client server, create and edit the Logstash Forwarder configuration file, which is in JSON format:

sudo nano /etc/logstash-forwarder.conf

Under the network section, add the following lines to the file, substituting your Logstash server's private IP address for logstash_server_private_IP:

   "servers": [ "logstash_server_private_IP:5000" ],
    "timeout": 15,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"

Under the files section (between the square brackets), add the following lines:

   {
      "paths": [
        "/var/log/syslog",
        "/var/log/auth.log"
       ],
      "fields": { "type": "syslog" }
    }
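Putting the two sections together, the complete /etc/logstash-forwarder.conf should look roughly like this (logstash_server_private_IP is still a placeholder for your server's private IP):

```json
{
  "network": {
    "servers": [ "logstash_server_private_IP:5000" ],
    "timeout": 15,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": [
        "/var/log/syslog",
        "/var/log/auth.log"
      ],
      "fields": { "type": "syslog" }
    }
  ]
}
```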

Save the file and restart logstash-forwarder; it will start shipping your log files to the Logstash server.
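Since logstash-forwarder was installed from the apt repository above, it comes with an init script, so the restart is typically:

```sh
sudo service logstash-forwarder restart
```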

Installing Kibana:

Now switch back to the server where you installed the ELK stack and download Kibana 4 to your home directory with the following command:

cd ~; wget https://download.elasticsearch.org/kibana/kibana/kibana-4.0.1-linux-x64.tar.gz

Extract the Kibana archive with tar:

tar xvf kibana-*.tar.gz

Open the Kibana configuration file for editing:

nano ~/kibana-4*/config/kibana.yml

In the Kibana configuration file, find the line that specifies host, and replace the IP address ("0.0.0.0" by default) with "localhost". Save and exit.
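For Kibana 4.0.x, the edited line in kibana.yml should look roughly like this:

```yaml
# The host to bind the server to.
host: "localhost"
```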

This setting makes Kibana accessible only from localhost. This is fine because we will use an Nginx reverse proxy to allow external access.

Let's copy the Kibana files to a more appropriate location. Create the /opt/kibana directory with the following command:

sudo mkdir -p /opt/kibana

Now copy the Kibana files into your newly created directory:

sudo cp -R ~/kibana-4*/* /opt/kibana/

Install Supervisor with the following command:

sudo apt-get install supervisor

Open /etc/supervisor/supervisord.conf in an editor (for example, sudo nano /etc/supervisor/supervisord.conf) and add these lines. Note that Supervisor's per-program log option is stdout_logfile:

[program:kibana]
command=/opt/kibana/bin/kibana
stdout_logfile=/var/log/supervisor/kibana.log
autostart=true
autorestart=true

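After saving the Supervisor configuration, have Supervisor pick up the new program definition. With the Ubuntu package this is typically done through supervisorctl:

```sh
sudo supervisorctl reread          # detect the new [program:kibana] section
sudo supervisorctl update          # start the newly added program
sudo supervisorctl status kibana   # confirm it is running
```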
Kibana is now set up to run under Supervisor. Next, configure Nginx as a reverse proxy so Kibana can be reached from outside. Add a server block like the following (the server_name below is an example domain; substitute your own):
server {
    listen 80;
    server_name log.fcawitech.com;

    location / {
        proxy_pass http://localhost:5601;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

    }
}
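If you keep this server block in its own site file, the usual Debian/Ubuntu Nginx layout applies. The file name kibana here is an arbitrary choice:

```sh
sudo nano /etc/nginx/sites-available/kibana      # paste the server block into this file
sudo ln -s /etc/nginx/sites-available/kibana /etc/nginx/sites-enabled/kibana
sudo nginx -t                                    # validate the configuration before restarting
```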

Restart Nginx with:

sudo service nginx restart

Visualizations in Kibana:

The Kibana interface is divided into four main sections:

  1. Discover
  2. Visualize
  3. Dashboard
  4. Settings

First, navigate to Kibana's Settings section.

Under Settings, go to the Indices section, where you configure an index pattern and its time field, then save it.

After saving, you can go to Discover to browse your log lines and filter them with queries, or switch to the Visualize section to generate real-time graphs.
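The Discover search bar accepts Lucene query syntax. For example, to narrow the view to the shipped syslog events and match a word in the log text (sshd here is just illustrative free text to search for):

```
type:syslog AND sshd
```

The type field corresponds to the "fields": { "type": "syslog" } entry we set in the forwarder configuration, so this query selects only events shipped by that forwarder.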
