Friday, October 4, 2024

How to Create a Log Manager with Logstash, Kibana and Elasticsearch

Log management is a crucial aspect of any application, especially as it grows in complexity. In mature applications with significant workloads, tracking events within the system becomes essential to prevent catastrophic situations and system crashes. This is where log management comes into play, encompassing tasks like storing, rotating, filtering, visualizing, and analyzing logs.

One powerful and widely adopted approach to log management is the ELK stack (Elasticsearch, Logstash, and Kibana). In this article, we will explore the integration of the ELK stack with a Laravel application.

Step 1: Setting Up the Laravel Project and Configuration

First, create a new Laravel project using Composer:

composer create-project laravel/laravel elk-log-management
cd elk-log-management

Now, let's make some essential modifications to the Laravel application.

  • Custom Log Formatter: To ensure the logs are in a format suitable for storage in Elasticsearch, we need to create a custom log formatter. Create a new class within the app/Services/CustomFormatter.php file:

<?php

namespace App\Services;

use Monolog\Formatter\NormalizerFormatter;
use Monolog\LogRecord;

class CustomFormatter extends NormalizerFormatter
{
    public function format(LogRecord $record)
    {
        $result = parent::format($record);

        // Use config() rather than env(): env() returns null once the
        // configuration is cached (php artisan config:cache).
        $result['app_name'] = config('app.name');

        // Kibana expects an @timestamp field for its time filter.
        $result['@timestamp'] = $this->normalize($record->datetime);

        // Add any other custom properties you need.

        return $this->toJson($result) . "\n";
    }
}

This custom formatter extends the NormalizerFormatter and adds a few key properties, such as the application name (app_name) and the timestamp (@timestamp) formatted for Kibana.
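With this formatter in place, each log entry is written to the log file as a single JSON line. For an info-level message it looks roughly like this (the exact field set depends on your Monolog version, and the channel name defaults to the application environment):

{"message":"User logged in","context":{},"level":200,"level_name":"INFO","channel":"local","datetime":"2024-10-04T12:00:00+00:00","extra":{},"app_name":"ELK-log-manager","@timestamp":"2024-10-04T12:00:00+00:00"}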

  • Logging Configuration: Next, modify the config/logging.php file to configure the desired logging channel. In this example, we'll focus on the daily channel:

<?php

use App\Services\CustomFormatter;

return [
    // ... other channels ...

    'daily' => [
        'driver' => 'daily',
        'path' => storage_path('logs/laravel.log'),
        'level' => env('LOG_LEVEL', 'debug'),
        'days' => 14,
        'replace_placeholders' => true,
        'formatter' => CustomFormatter::class,
    ],
];
    

We've set the formatter property to use our custom CustomFormatter class.

  • Environment Variables: Finally, update the .env file with these configurations:

APP_NAME=ELK-log-manager
LOG_CHANNEL=daily
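Before touching the ELK stack, you can smoke-test the formatter locally. Assuming the laravel/tinker package that ships with Laravel, something like the following should append one JSON line to the daily log file:

php artisan tinker --execute="Log::info('formatter smoke test')"
tail -n 1 storage/logs/laravel-$(date +%F).log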

Step 2: ELK Infrastructure Setup and Configuration

Now, let's set up and configure the ELK infrastructure. We will use Docker to streamline the process.

  • Docker Compose: Create a directory named elk in the root of your Laravel application. Inside the elk directory, create a docker-compose.yml file with the following configuration:

version: "3.8"
name: laravel-log-manager

networks:
  elk-network:
    driver: bridge

volumes:
  elastic-data-vl:

services:
  elasticsearch:
    image: elasticsearch:8.11.1
    container_name: elasticsearch
    restart: always
    volumes:
      - elastic-data-vl:/usr/share/elasticsearch/data/
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
      bootstrap.memory_lock: true
      discovery.type: single-node
      xpack.license.self_generated.type: basic
      xpack.security.enabled: false
    ulimits:
      memlock: # required for bootstrap.memory_lock to take effect
        soft: -1
        hard: -1
    ports:
      - "9200:9200"
      - "9300:9300"
    networks:
      - elk-network

  logstash:
    image: logstash:8.11.1
    container_name: logstash
    restart: always
    volumes:
      - ./logstash/:/logstash_dir
    command: logstash -f /logstash_dir/logstash.conf
    depends_on:
      - elasticsearch
    ports:
      - "5044:5044"
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk-network

  kibana:
    image: kibana:8.11.1
    container_name: kibana
    restart: always
    ports:
      - "5601:5601"
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    depends_on:
      - elasticsearch
    networks:
      - elk-network

  filebeat:
    image: elastic/filebeat:8.11.1
    container_name: filebeat
    user: root
    platform: linux/amd64
    volumes:
      - ./filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml
      - ../storage/logs:/var/log/ELK-log-manager # Ensure this path matches the log file location
    environment:
      - monitoring.enabled=true
    depends_on:
      - logstash
      - elasticsearch
    command: ["--strict.perms=false"]
    stdin_open: true
    tty: true
    deploy:
      mode: global
    logging:
      driver: 'json-file'
      options:
        max-size: '12m'
        max-file: "100"

    networks:
      - elk-network
    

This docker-compose.yml file defines the services for Elasticsearch, Logstash, Kibana, and Filebeat. It configures ports, volumes, dependencies, and network connections.
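Before bringing anything up, you can validate the file from inside the elk directory; docker-compose config parses it and prints the resolved configuration, which catches indentation and key errors early:

cd elk
docker-compose config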

  • Filebeat Configuration: Inside the elk directory, create a subdirectory called filebeat. Inside filebeat, create a filebeat.yml file with the following configuration:

filebeat.inputs:
- type: log # deprecated in Filebeat 8 in favor of 'filestream', but still functional
  enabled: true
  paths:
    - /var/log/ELK-log-manager/*.log # Ensure this path matches the log file location

output.logstash:
  hosts: ["logstash:5044"]

logging.json: true
logging.metrics.enabled: false

logging:
  files:
    rotateeverybytes: 12582912 # rotate after ~12 MB

This filebeat.yml file configures Filebeat to read logs from the specified path (make sure it matches the filebeat volume in docker-compose.yml) and send them to Logstash.
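Once the containers are running (see Step 4), Filebeat's built-in test subcommands are a quick way to confirm that the configuration parses and that Logstash is reachable:

docker exec filebeat filebeat test config
docker exec filebeat filebeat test output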

  • Logstash Configuration: In the elk directory, create another subdirectory named logstash. Inside logstash, create a logstash.conf file with the following configuration:

input {
  beats {
    port => 5044
  }
}

filter {
  # Our Laravel formatter writes each log entry as a JSON object,
  # so parse the JSON payload instead of grokking a plain-text pattern.
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "laravel-log-%{+YYYY.MM.dd}"
  }
}

This logstash.conf file defines the pipeline for processing logs received from Filebeat. Because our custom formatter emits JSON lines, it uses a json filter to parse each event and then sends the result to Elasticsearch with a daily index pattern.
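While debugging the pipeline, you can temporarily add a stdout output next to the elasticsearch one and watch events flow through with docker logs -f logstash:

output {
  # Temporary, for debugging only: print each processed event to the container log
  stdout { codec => rubydebug }

  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "laravel-log-%{+YYYY.MM.dd}"
  }
}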

Step 3: Triggering Logs and Exceptions

Now, it's time to test the integration by sending logs and exceptions from your Laravel application.

  • Log Messages: In any part of your Laravel application where you want to log events, use the Log facade:

use Illuminate\Support\Facades\Log;

Log::info('This is an informational message');
Log::warning('This is a warning message');
Log::error('This is an error message');
  • Exceptions: You can also throw exceptions to generate logs:

throw new \Exception('Something went wrong!');
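For a quick end-to-end test, you can combine both in a throwaway route (the /log-test route below is purely illustrative):

<?php
// routes/web.php — illustrative test route, remove after verifying the pipeline

use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Route;

Route::get('/log-test', function () {
    Log::info('This is an informational message');
    Log::warning('This is a warning message');
    Log::error('This is an error message');

    return 'Three log entries written to the daily channel';
});

Visiting /log-test should append three JSON lines to storage/logs/laravel-YYYY-MM-DD.log, which Filebeat then picks up.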

Step 4: Visualizing Logs in Kibana

After starting all the containers (run docker-compose up -d from the elk directory), give the ELK stack a minute or two to establish connections and initialize.
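You can confirm the pipeline is alive before opening Kibana:

# Cluster info — verifies Elasticsearch is accepting connections
curl http://localhost:9200

# Once some logs have flowed through, a laravel-log-YYYY.MM.dd index should appear
curl "http://localhost:9200/_cat/indices?v"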

  • Index Pattern: Open Kibana at http://localhost:5601 and go to Stack Management → Data Views (Kibana 8 renamed index patterns to data views). Create a new data view that matches the laravel-log-* index defined in your logstash.conf file, and select @timestamp as the time field.

Once the data view is created, you can start exploring your logs in Kibana's Discover view, visualizing them with dashboards and using its search and analysis capabilities.
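The fields added by the custom formatter make filtering in Discover straightforward; for example, a KQL query like the following shows only this application's error-level entries:

app_name : "ELK-log-manager" and level_name : "ERROR"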

GitHub Repository

For detailed examples, additional configurations, and code snippets, please refer to the GitHub repository for this article.

Conclusion

This article demonstrated how to integrate the ELK stack into a Laravel application to effectively manage and analyze logs. By leveraging the power of Elasticsearch, Logstash, and Kibana, you gain a comprehensive platform for monitoring your application's health, performance, and security. The ELK stack empowers you to proactively identify issues, troubleshoot problems, and gain deeper insights into your application's behavior.
