ELK In Spring Boot
1. Introduction
1.1. What is the ELK Stack?
The ELK Stack is a collection of three open-source tools designed for searching, analyzing, and visualizing large amounts of log data in real-time. These tools are:
- Elasticsearch: A search and analytics engine used for storing, searching, and analyzing large volumes of data quickly. It provides distributed search and analytics capabilities.
- Logstash: A server-side data processing pipeline that ingests, transforms, and sends data (logs, metrics, etc.) to Elasticsearch. It supports various input sources, transformations, and output destinations.
- Kibana: A data visualization and exploration tool that sits on top of Elasticsearch, providing users with the ability to create charts, graphs, dashboards, and reports based on their log and metric data.
1.2. Why Use the ELK Stack?
● Centralized logging solution.
● Powerful search and analytics capabilities.
● Real-time log monitoring with Kibana dashboards.
● Easy integration with microservices architectures.
2. Overview of Spring Boot Logging
2.1. Logging in Spring Boot
● Spring Boot uses SLF4J and Logback by default for logging.
● Logs are generated for application events such as startup, HTTP requests, exceptions, etc.
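As a minimal, dependency-free sketch of what such application logging looks like, the snippet below uses the JDK's built-in java.util.logging. With SLF4J the calls are nearly identical (LoggerFactory.getLogger(...), log.info(...)), but this version runs without any extra jars; the class name and messages are invented for illustration.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class LoggingDemo {
    private static final Logger log = Logger.getLogger(LoggingDemo.class.getName());

    public static void main(String[] args) {
        // Typical application events end up in the log: startup, errors, etc.
        log.info("Application started");
        try {
            Integer.parseInt("not-a-number");
        } catch (NumberFormatException e) {
            // Exceptions are logged with a severity level and the stack trace attached
            log.log(Level.SEVERE, "Failed to parse input", e);
        }
    }
}
```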
2.2. Common Logging Use Cases
● Debugging application behavior.
● Monitoring application performance.
● Detecting and analyzing errors or exceptions.
3. Integrating the ELK Stack in Spring Boot
3.1. Prerequisites
● Install Elasticsearch, Logstash, and Kibana on your system.
● A Spring Boot application with logging configured (using Logback, for example).
- Elasticsearch download: https://www.elastic.co/downloads/elasticsearch
- Logstash download: https://www.elastic.co/downloads/logstash
- Kibana download: https://www.elastic.co/downloads/kibana
4. Step-by-Step Implementation
4.1. Logback Configuration for Centralized Logging
● Create a logback-spring.xml file in your Spring Boot project's src/main/resources/ directory and add the code below:
Microservice
├── src
│ └── main
│ └── resources
│ └── logback-spring.xml
logback-spring.xml:
<configuration>
  <appender name="COMMON_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>C:/Users/syste/Desktop/LogFile/spring.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss} %-5level [%thread] %logger{36} - %msg%n</pattern>
    </encoder>
    <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
      <fileNamePattern>C:/Users/syste/Desktop/LogFile/spring-%d{yyyy-MM-dd}.%i.log</fileNamePattern>
      <maxFileSize>10MB</maxFileSize>
      <maxHistory>30</maxHistory>
    </rollingPolicy>
  </appender>
  <root level="INFO">
    <appender-ref ref="COMMON_FILE" />
  </root>
</configuration>
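To make the encoder pattern concrete, here is a small stand-alone sketch (plain Java, no Logback required) that formats one log line the way the configured %d{yyyy-MM-dd HH:mm:ss} %-5level [%thread] %logger{36} - %msg%n pattern would. The class and message names are made up for illustration, and the sketch ignores the {36} logger-name abbreviation that Logback applies.

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class PatternDemo {
    // Mimics the pattern: %d{yyyy-MM-dd HH:mm:ss} %-5level [%thread] %logger{36} - %msg%n
    static String format(LocalDateTime ts, String level, String thread, String logger, String msg) {
        String date = ts.format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss"));
        // %-5s left-pads the level to 5 characters, like %-5level in Logback
        return String.format("%s %-5s [%s] %s - %s%n", date, level, thread, logger, msg);
    }

    public static void main(String[] args) {
        // Produces: 2024-01-15 10:30:00 INFO  [main] com.example.DemoApplication - Started DemoApplication
        System.out.print(format(LocalDateTime.of(2024, 1, 15, 10, 30, 0),
                "INFO", "main", "com.example.DemoApplication", "Started DemoApplication"));
    }
}
```

Every line written to spring.log by the appender above has this shape, which is what Logstash will later read and parse.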
4.2. Configuring Elasticsearch
- Install Elasticsearch and run the elasticsearch batch file from the bin directory:
Elastic-Search
├── bin
│ └── elasticsearch(Batch File)
├── config
- After running the batch file, the generated password for the elastic user and an enrollment token are printed on the terminal; note both down for later.
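Once the node is up, you can talk to it over HTTPS as the elastic user with that password. As a sketch, the snippet below uses the JDK's java.net.http API (Java 11+) to build, but not send, a basic-auth request against the cluster health endpoint; the placeholder password is an assumption you would replace with the one from your terminal.

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.util.Base64;

public class EsRequestDemo {
    // Builds (does not send) a basic-auth request to the local Elasticsearch node.
    static HttpRequest buildHealthRequest(String user, String password) {
        String token = Base64.getEncoder().encodeToString((user + ":" + password).getBytes());
        return HttpRequest.newBuilder()
                .uri(URI.create("https://localhost:9200/_cluster/health"))
                .header("Authorization", "Basic " + token)
                .GET()
                .build();
    }

    public static void main(String[] args) {
        // Replace the placeholder with the password printed on the Elasticsearch terminal
        HttpRequest req = buildHealthRequest("elastic", "<password-from-terminal>");
        System.out.println(req.uri());
    }
}
```

Sending such a request with HttpClient would need the node's self-signed certificate trusted, which is why Logstash below disables certificate verification in its own output.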
4.3. Setting Up Logstash
- Install Logstash. Logstash needs a configuration file telling it where to read logs from and where to send them (Elasticsearch).
- In the Logstash folder, go to the config folder, edit the logstash-sample.conf file, and add the code below.
- Here we set the path of the log file and the name of the index that Logstash will create in Elasticsearch (and that you will later see in Kibana).
Logstash
├── bin
├── config
│ └── logstash-sample.conf
Example Logstash Configuration (logstash-sample.conf) :
# Sample Logstash configuration for a simple
# file -> Logstash -> Elasticsearch pipeline.
input {
  file {
    path => "C:/Users/syste/Desktop/LogFile/spring.log"
    start_position => "beginning"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["https://localhost:9200"]
    ssl => true
    ssl_certificate_verification => false
    index => "elkdemoindex"
    user => "elastic"
    password => "_6rEGUuL=cZjxSBzaajq"
  }
}
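For intuition about what Logstash does with each line before indexing it, here is a rough stand-alone sketch in plain Java: a regex that splits one line produced by the Logback pattern above into the kind of field map (timestamp, level, thread, logger, message) an event in the elkdemoindex index would carry. Real Logstash would typically do this with a grok filter; the regex here is only an illustrative approximation.

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LogLineParser {
    // Matches lines shaped by the logback-spring.xml pattern:
    // 2024-01-15 10:30:00 INFO  [main] com.example.DemoApplication - Started ...
    private static final Pattern LINE = Pattern.compile(
        "^(\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2}) (\\w+)\\s+\\[([^\\]]+)\\] (\\S+) - (.*)$");

    static Map<String, String> parse(String line) {
        Matcher m = LINE.matcher(line);
        if (!m.matches()) {
            return Map.of("message", line); // unparsed lines pass through as raw messages
        }
        return Map.of("timestamp", m.group(1), "level", m.group(2),
                      "thread", m.group(3), "logger", m.group(4), "message", m.group(5));
    }

    public static void main(String[] args) {
        System.out.println(parse(
            "2024-01-15 10:30:00 INFO  [main] com.example.DemoApplication - Started DemoApplication"));
    }
}
```

Each map produced this way corresponds to one searchable document in the index, which is why structured fields like level and logger can be filtered on in Kibana.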
4.4. Visualizing Logs in Kibana
- Install Kibana and run the kibana batch file from the bin directory.
- The terminal will then print the local Kibana URL (by default Kibana runs on port 5601).
- Open that URL in your browser.
Kibana
├── bin
│ └── kibana(Batch File)
├── config
- On that page, paste the enrollment token that Elasticsearch generated at startup, then log in with the default elastic user and the password shown on the Elasticsearch terminal.
- Navigate to Stack Management:
- From the Kibana dashboard, go to Stack Management.
- Go to Index Management:
- In the Stack Management panel, navigate to Index Management. Here, you will see the index created by Logstash (in our case, it should be something like elkdemoindex, as mentioned in the Logstash config file).
- Create a Data View:
- After confirming that the index has been created, go to Discover from the main menu.
- In Discover, create a new data view (previously called index pattern) by clicking Create Data View.
- In the Data View creation screen:
- Use elkdemoindex* as the pattern (it must match the index name set in the Logstash config).
- Select @timestamp as the time filter field.
- Visualizing Logs:
- Once the Data View is created, you can go to Discover to view the logs from your Spring Boot application.
- In Discover, you can filter and search through the logs to analyze them.
- You can also create visualizations such as bar charts or line graphs in the Visualize or Dashboard sections to monitor your application's performance and behavior.
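What Discover does with a query such as level:ERROR can be pictured with a plain-Java sketch: filtering a list of parsed log events down to the matching ones. The entries below are invented sample data, not output from a real index.

```java
import java.util.List;
import java.util.stream.Collectors;

public class DiscoverFilterDemo {
    record LogEntry(String timestamp, String level, String message) {}

    // Mimics a Kibana Discover filter like level:ERROR over indexed log events
    static List<LogEntry> filterByLevel(List<LogEntry> entries, String level) {
        return entries.stream()
                .filter(e -> e.level().equals(level))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<LogEntry> entries = List.of(
            new LogEntry("2024-01-15 10:30:00", "INFO", "Started DemoApplication"),
            new LogEntry("2024-01-15 10:31:12", "ERROR", "Failed to connect to database"),
            new LogEntry("2024-01-15 10:31:15", "INFO", "Retrying connection"));
        System.out.println(filterByLevel(entries, "ERROR"));
    }
}
```

Elasticsearch does this over an inverted index rather than a linear scan, which is what keeps such queries fast even across millions of log events.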
5. Advantages of Using ELK in Spring Boot Projects
- Centralized Log Management:
- Collect logs from multiple microservices into a single, searchable index.
- Powerful Search and Analytics:
- Elasticsearch allows fast, full-text searches across logs.
- Kibana provides powerful visualizations, helping with real-time monitoring.
- Scalability:
- ELK stack can scale with your microservices architecture.
- Error Detection and Debugging:
- Easy identification of errors, exceptions, and performance issues.
6. Conclusion
- The ELK Stack offers an effective solution for centralized logging in Spring Boot microservices.
- Through Elasticsearch, Logstash, and Kibana, you can implement a scalable, efficient logging infrastructure.
- With real-time log visualization and analysis, it’s easier to monitor application health and debug issues.