Log management is a crucial aspect of any application, especially as it grows in complexity. In mature applications with significant workloads, tracking events within the system becomes essential to prevent catastrophic situations and system crashes. This is where log management comes into play, encompassing tasks like storing, rotating, filtering, visualizing, and analyzing logs.
Project Setup: Start by creating a fresh Laravel project and moving into its directory:

composer create-project laravel/laravel elk-log-management
cd elk-log-management
Custom Log Formatter: To ensure the logs are written in a format Elasticsearch can index, we need a custom log formatter. Create a new class in app/Services/CustomFormatter.php:
<?php

namespace App\Services;

use Monolog\Formatter\NormalizerFormatter;
use Monolog\LogRecord;

class CustomFormatter extends NormalizerFormatter
{
    public function format(LogRecord $record)
    {
        $result = parent::format($record);
        // Use config() rather than env() so the value survives config caching
        $result['app_name'] = config('app.name');
        $result['@timestamp'] = $this->normalize($record->datetime); // Required for Kibana
        // Add any other custom properties you need

        return $this->toJson($result) . "\n";
    }
}
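For reference, each formatted entry is a single JSON document per line. A line produced by this formatter would look roughly like the following (field values are illustrative):

```json
{"message":"This is an informational message","context":[],"level":200,"level_name":"INFO","channel":"production","datetime":"2024-01-15T10:30:00.000000+00:00","extra":[],"app_name":"ELK-log-manager","@timestamp":"2024-01-15T10:30:00.000000+00:00"}
```

The one-document-per-line shape is what lets Filebeat and Logstash treat each entry as a separate event.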
Logging Configuration: Next, modify the config/logging.php file to configure the desired logging channel. In this example, we'll focus on the daily channel:
<?php

use App\Services\CustomFormatter;

return [

    // ... other channels ...

    'daily' => [
        'driver' => 'daily',
        'path' => storage_path('logs/laravel.log'),
        'level' => env('LOG_LEVEL', 'debug'),
        'days' => 14,
        'replace_placeholders' => true,
        'formatter' => CustomFormatter::class,
    ],

];
Environment Variables: Finally, update the .env file with these configurations:
APP_NAME=ELK-log-manager
LOG_CHANNEL=daily
Docker Compose: Create a directory named elk in the root of your Laravel application. Inside the elk directory, create a docker-compose.yml file with the following configuration:
version: "3.8"
name: laravel-log-manager

networks:
  elk-network:
    driver: bridge

volumes:
  elastic-data-vl:

services:
  elasticsearch:
    image: elasticsearch:8.11.1
    container_name: elasticsearch
    restart: always
    volumes:
      - elastic-data-vl:/usr/share/elasticsearch/data/
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
      bootstrap.memory_lock: "true"
      discovery.type: single-node
      xpack.license.self_generated.type: basic
      xpack.security.enabled: "false"
    ports:
      - "9200:9200"
      - "9300:9300"
    networks:
      - elk-network

  logstash:
    image: logstash:8.11.1
    container_name: logstash
    restart: always
    volumes:
      - ./logstash/:/logstash_dir
    command: logstash -f /logstash_dir/logstash.conf
    depends_on:
      - elasticsearch
    ports:
      - "5044:5044"
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk-network

  kibana:
    image: kibana:8.11.1
    container_name: kibana
    restart: always
    ports:
      - "5601:5601"
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    depends_on:
      - elasticsearch
    networks:
      - elk-network

  filebeat:
    image: elastic/filebeat:8.11.1
    container_name: filebeat
    user: root
    platform: linux/amd64
    volumes:
      - ./filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml
      - ../storage/logs:/var/log/ELK-log-manager # Ensure this path matches the log file location
    environment:
      - monitoring.enabled=true
    depends_on:
      - logstash
      - elasticsearch
    command: ["--strict.perms=false"]
    stdin_open: true
    tty: true
    deploy:
      mode: global
    logging:
      driver: "json-file"
      options:
        max-size: "12m"
        max-file: "100"
    networks:
      - elk-network
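With docker-compose.yml in place, the stack can be started from the elk directory. The commands below assume Docker Compose v2 (the docker compose subcommand); older installs use the docker-compose binary instead:

```shell
cd elk
docker compose up -d
# Elasticsearch can take a minute to start; once ready, this returns cluster info as JSON
curl http://localhost:9200
```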
Filebeat Configuration: Inside the elk directory, create a subdirectory called filebeat. Inside filebeat, create a filebeat.yml file with the following configuration:
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/ELK-log-manager/*.log # Ensure this path matches the log file location

output.logstash:
  hosts: ["logstash:5044"]

logging.json: true
logging.metrics.enabled: false
logging.files.rotateeverybytes: 12582912
Logstash Configuration: In the elk directory, create another subdirectory named logstash. Inside logstash, create a logstash.conf file with the following configuration:
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:message}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "laravel-log-%{+YYYY.MM.dd}"
  }
}
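One caveat: the CustomFormatter above emits each log record as a JSON line, which the grok pattern shown here will not match. For JSON-formatted logs, Logstash's json filter plugin is usually a better fit, since it parses the JSON body directly into event fields. A minimal sketch, assuming the event payload arrives in the default message field:

```
filter {
  json {
    source => "message"
  }
}
```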
Log Messages: In any part of your Laravel application where you want to log events, use the Log facade:
Log::info('This is an informational message');
Log::warning('This is a warning message');
Log::error('This is an error message');
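For a quick end-to-end test, a temporary route can write entries on demand. The route below is hypothetical, added purely for illustration:

```php
// routes/web.php — hypothetical route for generating sample log entries
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Route;

Route::get('/log-test', function () {
    Log::info('Test log entry', ['path' => request()->path()]);

    return 'Log entry written';
});
```

Visiting /log-test appends a line to the daily log file under storage/logs, which Filebeat then ships to Logstash.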
Exceptions: Unhandled exceptions are also reported to the configured log channel automatically:
throw new \Exception('Something went wrong!');
Index Pattern: Open Kibana at http://localhost:5601 and go to Stack Management > Data Views (index patterns were renamed to data views in Kibana 8). Create a new data view matching the laravel-log-* index defined in your logstash.conf file, and select @timestamp as the time field.
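Before this step, you can confirm from the command line that Logstash has actually created the index (assuming the stack is running locally):

```shell
curl 'http://localhost:9200/_cat/indices/laravel-log-*?v'
```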