Monday, May 10, 2021

ELK Stack for Monitoring Microservice Logs at a Centralized Location

ELK stands for Elasticsearch, Logstash and Kibana. Whenever we work with a microservice architecture, we have a strong requirement to verify and observe logs in one central place, for quick diagnosis and a quick understanding of our application's behavior.

The ELK stack gives us exactly this functionality: we can centrally monitor all the logs of our microservices from the Kibana UI in a browser.

So before going in depth, let's try to understand what each component is.

- Elasticsearch:-
https://www.elastic.co/

Elasticsearch is a distributed, free and open search and analytics engine and NoSQL database built on Apache Lucene. It helps us store all types of data, including textual, numerical, geospatial, structured and unstructured data, and logs.
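
To make that concrete: by the end of this tutorial each log line will sit in Elasticsearch as a JSON document roughly like the one below. The exact field set depends on the Logstash configuration, and the host value is illustrative, so treat this as a sketch rather than the exact mapping you will see:

{
  "@timestamp": "2021-05-10T10:15:30.123Z",
  "message": "user found : User(id=1, name=user1)",
  "path": "C:/springboot-log/elk-log.log",
  "host": "my-windows-machine"
}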

- Logstash:-
https://www.elastic.co/logstash

Logstash is a free and open server-side data processing pipeline that ingests data from a multitude of sources, transforms it, and then sends it to your favorite “stash”. That is, it collects logs from one system, performs operations on them if needed (extraction, formatting, deleting, adding metadata) and passes them on in the format (e.g. JSON) that the next server needs. In short, it is a log pipeline tool that accepts input/logs from various sources and exports the data to various targets.
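
As an example of that transformation step, a Logstash filter such as grok can split a raw log line into structured fields. The block below is a minimal sketch using the standard grok plugin; the configuration we use later in this post does not use any filter, so this is purely illustrative:

filter {
  grok {
    # Split a typical log line into a timestamp, a log level and the rest of the message
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{LOGLEVEL:level}%{GREEDYDATA:logmessage}" }
  }
}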

- Kibana:-
https://www.elastic.co/kibana

Kibana is a free and open user interface that lets you visualize your Elasticsearch data and navigate the Elastic Stack. You can do anything from tracking query load to understanding the way requests flow through your apps. In short, Kibana is the visualization UI layer that helps developers monitor application logs.

Now let's understand how these three things, working together, help us achieve centralized logging for microservices.

The actual flow is:

Microservice –> produces logs –> Logstash collects these logs as input and processes them, applying formatting if required (including extraction, deletion and addition of metadata) –> Logstash sends the logs in JSON format to Elasticsearch –> Elasticsearch stores this data in its NoSQL (JSON document) format –> finally, Kibana is configured to read the data from Elasticsearch and display it on screen for centralized logging.

In short

Microservice produces logs –> Logstash processes the logs –> Elasticsearch stores the logs in JSON format –> Kibana visualizes them in the browser.

Now let's move on to hands-on practice.

In this example we are going to follow the steps below:

1- Create a Spring Boot microservice that writes logs. We will produce both normal logs and an exception log with a stack trace, so there is good output to look at in Kibana.
2- Set up Logstash and configure it with the log file created by our microservice application.
3- Set up Elasticsearch to take the Logstash data and store it in its NoSQL JSON format.
4- Finally, set up Kibana to take the data from Elasticsearch and display it on screen.

You can download all three items, Elasticsearch, Logstash and Kibana, from the location below. As we are using a Windows machine we are downloading the Windows version; you can choose the version for your own OS.
https://www.elastic.co/start
Extract it and add the bin folder to your PATH.

You can download Kibana from the site below:
https://www.elastic.co/downloads/kibana
Extract it and add the bin folder to your PATH.

You can download Logstash from the site below:
https://www.elastic.co/downloads/logstash
Extract it and add the bin folder to your PATH.

Now let's run all three things using the commands below.

Starting Elasticsearch

C:\elasticsearch-7.12.1\bin>elasticsearch
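
Elasticsearch listens on port 9200 by default. Once it is up, open http://localhost:9200 in a browser; you should get back a small JSON response containing the cluster name and version, which confirms the server is running.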

Before starting Kibana we need to update the kibana.yml file inside the folder
C:\kibana-7.12.1-windows-x86_64\config

Uncomment this line:

# The URLs of the Elasticsearch instances to use for all your queries.
elasticsearch.hosts: ["http://localhost:9200"]

This is because we want our Kibana to talk to Elasticsearch, and for that it needs to know where Elasticsearch is running.

Now let's run Kibana using the command below:
C:\Users\Siddhartha>kibana
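
Kibana serves its UI on port 5601 by default, so once it has started you can open http://localhost:5601 in the browser to confirm it is up and talking to Elasticsearch.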

Now let's create a simple Spring Boot application that will generate some logs and an exception, as given below.

Please follow the steps below:

1- ElkSpringbootExampleApplication

package com.siddhu;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

@SpringBootApplication
@RestController
public class ElkSpringbootExampleApplication {
	
	private final Logger logger = LoggerFactory.getLogger(ElkSpringbootExampleApplication.class);

    @GetMapping("/getUser/{id}")
    public User getUserById(@PathVariable int id) {
        List<User> users = getUsers();
        User user = users.stream()
                .filter(u -> u.getId() == id)
                .findAny()
                .orElse(null);
        if (user != null) {
            logger.info("user found : {}", user);
            return user;
        } else {
            try {
                // Deliberately throw so we get a stack trace to look at in Kibana
                throw new Exception("User Not Found with this ID : " + id);
            } catch (Exception e) {
                // Passing the exception as the last argument makes SLF4J write the
                // full stack trace to the log file; printStackTrace alone would
                // only reach the console, not the file set by logging.file.name
                logger.error("Error User Not Found with this ID : {}", id, e);
            }
            return new User();
        }
    }


    private List<User> getUsers() {
        return Stream.of(new User(1, "user1"),
				new User(2, "user2"),
				new User(3, "user3"),
				new User(4, "user4"))
				.collect(Collectors.toList());
    }

	public static void main(String[] args) {
		SpringApplication.run(ElkSpringbootExampleApplication.class, args);
	}

}

2- User

package com.siddhu;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

// Lombok generates the no-args and all-args constructors, getters, setters,
// equals/hashCode and toString for this class, so no hand-written
// boilerplate is needed.
@AllArgsConstructor
@NoArgsConstructor
@Data
public class User {

	private int id;
	private String name;
}

3- pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-parent</artifactId>
		<version>2.4.5</version>
		<relativePath/> <!-- lookup parent from repository -->
	</parent>
	<groupId>com.siddhu</groupId>
	<artifactId>elk-springboot-example</artifactId>
	<version>0.0.1-SNAPSHOT</version>
	<name>elk-springboot-example</name>
	<description>Demo project for Spring Boot integration with the ELK Stack</description>
	<properties>
		<java.version>11</java.version>
	</properties>
	<dependencies>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-web</artifactId>
		</dependency>

		<dependency>
			<groupId>org.projectlombok</groupId>
			<artifactId>lombok</artifactId>
			<optional>true</optional>
		</dependency>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-test</artifactId>
			<scope>test</scope>
		</dependency>
	</dependencies>

	<build>
		<plugins>
			<plugin>
				<groupId>org.springframework.boot</groupId>
				<artifactId>spring-boot-maven-plugin</artifactId>				
			</plugin>
		</plugins>
	</build>

</project>

4- Make the following changes in the application.properties file:
spring.application.name=ELK-Example
server.port=9898
logging.file.name=C:/springboot-log/elk-log.log
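
With logging.file.name set, Spring Boot writes its log output to C:/springboot-log/elk-log.log in addition to the console. A line in that file will look roughly like this (the process id and thread name will differ on your machine):

2021-05-10 10:15:30.123  INFO 1234 --- [nio-9898-exec-1] c.s.ElkSpringbootExampleApplication      : user found : User(id=1, name=user1)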

Now let's run our Spring Boot application.

Now create an exception and check that it landed in our log file, i.e. C:/springboot-log/elk-log.log.
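
Since the application runs on port 9898 (from application.properties), hitting http://localhost:9898/getUser/1 in a browser should write a "user found" line to the log, while an id that does not exist, e.g. http://localhost:9898/getUser/99, should write the error message together with the exception stack trace.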

Now let's feed this log file to Kibana. As we know, we are using Logstash to push the data into Elasticsearch, from where it is then read by Kibana. So we first need to tell Logstash where our log file is located, so that it can take the data from it and format it before inserting it into Elasticsearch.

For this refer to the https://www.elastic.co/downloads/logstash

Here it says to prepare a logstash.conf config file and run Logstash using the command bin/logstash -f logstash.conf.

Lets do this now

Add the following configuration to:

1- C:\logstash-7.12.1\config\logstash.conf

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  file {
    # Tail the log file written by our Spring Boot application
    path => "C:/springboot-log/elk-log.log"
    # Read the file from the top the first time Logstash sees it
    start_position => "beginning"
  }
}

output {
  # Print each event to the Logstash console, useful while debugging the pipeline
  stdout {
    codec => rubydebug
  }
  # Ship the events to the local Elasticsearch instance
  elasticsearch {
    hosts => ["http://localhost:9200"]
    #index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    #user => "elastic"
    #password => "changeme"
  }
}
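
One thing to keep in mind: the file input plugin records how far it has read in a "sincedb" file, so start_position => "beginning" only applies the first time Logstash sees the file. If you restart Logstash while testing and want it to re-read the whole file, on Windows you can point the sincedb at the null device by adding sincedb_path => "NUL" inside the file block. This is an optional testing convenience, not something the config above requires.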

Now start Logstash using the command:

C:\logstash-7.12.1\bin>logstash.bat -f ../config/logstash.conf
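
Because of the stdout { codec => rubydebug } output in our config, every log line that Logstash picks up is also printed on the Logstash console as a structured event, so you can see immediately whether the file is being read.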

Now let's check in the browser that Logstash is pushing data into Elasticsearch. Open http://localhost:9200/_cat/indices?v and look for the index that Logstash created internally.

Now that we know Logstash is reading the log properly from the log file we defined, let's go to Kibana and create an index pattern matching the name of the index created by Logstash, i.e. logstash-2021.05.10-000001.

Now click on Discover to see the log entries.
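
In Discover you can narrow the view down with the search bar; for example, entering message : "Error User Not Found" (KQL syntax) should show only the error events our application produced.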

Download:- https://github.com/shdhumale/elk-springboot-example

Note: you can also refer to the site below for more information.

https://www.elastic.co/what-is/elk-stack
