Monday, May 31, 2021

EFK FluentBit stack using springboot Microservice, Kubernetes and Docker

As I am using a Windows machine and Minikube was showing 100% CPU utilization, I used the built-in Kubernetes from Docker Desktop with the configuration below.

  • Configure EFK stack manually

Step 1: Create a Namespace
kubectl get namespaces
kubectl create -f siddhu-kube-logging.yaml
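The actual manifest can be obtained from the GitHub repository linked below; a minimal sketch of what siddhu-kube-logging.yaml defines (assuming it only creates the kube-logging namespace used throughout this post) would be:

apiVersion: v1
kind: Namespace
metadata:
  # all EFK components and the demo application will run in this namespace
  name: kube-logging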
Step 2: Setup Elasticsearch
kubectl create -f siddhu-elastic-stack.yaml
kubectl get pod -n kube-logging
kubectl describe pod -n kube-logging
kubectl get pod -n kube-logging -o wide
kubectl port-forward <elasticsearch-pod-name> 9200:9200 -n kube-logging
Check that this URL is working: http://localhost:9200
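siddhu-elastic-stack.yaml is in the repository; as a rough sketch only (assuming a single-node Elasticsearch 7.12.1 Deployment and a Service on port 9200, with illustrative names; the real file may differ), it would look something like:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: elasticsearch
  namespace: kube-logging
spec:
  selector:
    matchLabels:
      component: elasticsearch
  template:
    metadata:
      labels:
        component: elasticsearch
    spec:
      containers:
      - name: elasticsearch
        image: docker.elastic.co/elasticsearch/elasticsearch:7.12.1
        env:
        # single-node mode so no cluster discovery is required
        - name: discovery.type
          value: single-node
        ports:
        - containerPort: 9200
---
apiVersion: v1
kind: Service
metadata:
  name: elasticsearch
  namespace: kube-logging
spec:
  selector:
    component: elasticsearch
  ports:
  - port: 9200
    targetPort: 9200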
Step 3: Setup Kibana
kubectl create -f siddhu-kibana.yaml
kubectl port-forward <kibana-pod-name> 5601:5601 --namespace=kube-logging
Check that this URL is working: http://localhost:5601
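siddhu-kibana.yaml is also in the repository; a comparable sketch (assuming a Kibana 7.12.1 Deployment pointed at the elasticsearch Service above; names are illustrative) could be:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: kibana
  namespace: kube-logging
spec:
  selector:
    matchLabels:
      component: kibana
  template:
    metadata:
      labels:
        component: kibana
    spec:
      containers:
      - name: kibana
        image: docker.elastic.co/kibana/kibana:7.12.1
        env:
        # points Kibana at the Elasticsearch Service inside the cluster
        - name: ELASTICSEARCH_HOSTS
          value: http://elasticsearch:9200
        ports:
        - containerPort: 5601
---
apiVersion: v1
kind: Service
metadata:
  name: kibana
  namespace: kube-logging
spec:
  selector:
    component: kibana
  ports:
  - port: 5601
    targetPort: 5601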
Step 4: Fluent Bit Service
kubectl create -f siddhu-fluent-bit-service-account.yaml
kubectl create -f siddhu-fluent-bit-role.yaml
kubectl create -f siddhu-fluent-bit-role-binding.yaml
kubectl create -f siddhu-fluent-bit-configmap.yaml
kubectl create -f siddhu-fluent-bit-ds.yaml
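The Fluent Bit ServiceAccount, Role, RoleBinding, ConfigMap and DaemonSet definitions are all in the repository below. The key piece is the ConfigMap, which tails the container log files on each node and forwards them to Elasticsearch; a trimmed-down sketch (the names fluent-bit-config and elasticsearch are assumptions, and the real ConfigMap in the repo contains more sections) is:

apiVersion: v1
kind: ConfigMap
metadata:
  name: fluent-bit-config
  namespace: kube-logging
data:
  fluent-bit.conf: |
    [INPUT]
        Name    tail
        # picks up every container log on the node, including the Spring Boot pod
        Path    /var/log/containers/*.log
        Tag     kube.*

    [OUTPUT]
        Name    es
        Match   *
        # the Elasticsearch Service created in Step 2
        Host    elasticsearch
        Port    9200
        Index   fluent-bit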

Note: All the files can be obtained from the GitHub location below.
https://github.com/shdhumale/efk-springboot-docker-kubernetes-example.git

Finally, access the logs of Kubernetes and the Spring Boot application on the Kibana UI screen.

Now let's check the logs created by our Spring Boot application with the EFK stack.

We are using VS Code as the IDE for Spring Boot, Docker and Kubernetes.
1- Create a Spring Boot application.
Note: you can download the ready-made Spring Maven project from the GitHub location given below.

1- EfkSpringbootDockerKubernetesExampleApplication.java

package com.siddhu;

import java.io.PrintWriter;
import java.io.StringWriter;
import java.util.Date;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;


@SpringBootApplication
@RestController
public class EfkSpringbootDockerKubernetesExampleApplication {

	Logger logger = LoggerFactory.getLogger(EfkSpringbootDockerKubernetesExampleApplication.class);
	//@RequestMapping(value = "/siddhu")
	@GetMapping("/siddhu")
	public String helloWorld() {
		String response = "Simple data message showing success call :- " + new Date();
		logger.info("response found : {}", response);

		return response;
	}

	//@RequestMapping(value = "/exception")
	@GetMapping("/exception")
	public String exception() {
		String response = "";
		try {
			throw new Exception("Displaying Exception :- ");
		} catch (Exception e) {
			e.printStackTrace();
			logger.error("Exception Created:",e);

			StringWriter sw = new StringWriter();
			PrintWriter pw = new PrintWriter(sw);
			e.printStackTrace(pw);
			String stackTrace = sw.toString();
			logger.error("Exception stackTrace- " + stackTrace);
			response = stackTrace;
		}

		return response;
	}

	public static void main(String[] args) {
		SpringApplication.run(EfkSpringbootDockerKubernetesExampleApplication.class, args);
	}

}

2- application.properties

spring.application.name=EFLK-Docker-Kubernetes-Example
server.port=9898
logging.file.name=C:/springboot-log/spring-boot-eflk.log
#logging.file.name= /var/log/containers/spring-boot-eflk.log

Try to build (clean install) and run the Maven application, and check that it produces the output as needed; see the example commands below.
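A minimal way to do that from the project root (assuming Maven is installed and on the path):

mvn clean install
mvn spring-boot:run

Then hit http://localhost:9898/siddhu and confirm that a log entry is written to C:/springboot-log/spring-boot-eflk.log.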

2- Create a Dockerfile for the Spring Boot application

FROM java:8-jdk-alpine
COPY ./target/efk-springboot-docker-kubernetes-example-0.0.1-SNAPSHOT.jar /usr/app/
WORKDIR /usr/app
RUN sh -c 'touch efk-springboot-docker-kubernetes-example-0.0.1-SNAPSHOT.jar'
ENTRYPOINT ["java","-jar","efk-springboot-docker-kubernetes-example-0.0.1-SNAPSHOT.jar"]

3- Create a SpringBoot application Docker Image

Note: Before building the Docker image, change the log file line in application.properties as shown below.

logging.file.name= /var/log/containers/spring-boot-eflk.log

Now execute the command below:

docker build -t shdhumale/efk-springboot-docker-kubernetes .

Now let's run the Docker image just created using the command below:

docker run -p 9898:9898 shdhumale/efk-springboot-docker-kubernetes

Execute http://localhost:9898/siddhu and http://localhost:9898/exception and you should be able to see the output properly.

Check whether the log appears in the desired folder using the commands below:

docker ps

Take the Docker container ID and run the command below with it:

docker exec -it <container-id> /bin/sh
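Once inside the container shell, you can confirm that the log file exists at the path configured in application.properties, for example:

cat /var/log/containers/spring-boot-eflk.log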

4- Upload the Spring Boot application Docker image to Docker Hub using the command below:

docker image push shdhumale/efk-springboot-docker-kubernetes

Note:- Let's check that we are able to download our image and run it locally:

docker pull shdhumale/efk-springboot-docker-kubernetes

5- Finally, make the following changes

a:- Deploy our Spring Boot application using Kubernetes.
For that we first need to prepare our deployment YAML file, given below.

1- siddhu-springboot.yaml

apiVersion: apps/v1
kind: Deployment
metadata:
  name: siddhuspringboot
  namespace: kube-logging
spec:
  selector:
    matchLabels:
      component: siddhuspringboot
  template:
    metadata:
      labels:
        component: siddhuspringboot
    spec:
      containers:
      - name: siddhuspringboot
        image: shdhumale/efk-springboot-docker-kubernetes:latest
        env:
        - name: discovery.type
          value: single-node
        ports:
        - containerPort: 9898
          name: http
          protocol: TCP      

---

apiVersion: v1
kind: Service
metadata:
  name: siddhuspringboot
  namespace: kube-logging
  labels:
    service: siddhuspringboot
spec:
  type: NodePort
  selector:
    component: siddhuspringboot
  ports:
  - port: 9898
    targetPort: 9898
	

b:- Execute this file with the command below.

kubectl create -f siddhu-springboot.yaml

C:\STS-Workspace\efk-springboot-docker-kubernetes-example\ymlfile>kubectl create -f siddhu-springboot.yaml
deployment.apps/siddhuspringboot created
service/siddhuspringboot created

Check that the pod is deployed properly.
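For example, re-run the pod listing used earlier:

kubectl get pod -n kube-logging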

Now check that our application is working properly by port-forwarding to its service.

kubectl port-forward svc/siddhuspringboot 9898:9898 -n kube-logging

Hit the URLs and check that we get the output as expected.

http://localhost:9898/exception and http://localhost:9898/siddhu

Now check in Kibana that we are able to see the logs of our Spring Boot application.

Download :- https://github.com/shdhumale/efk-springboot-docker-kubernetes-example.git

Monday, May 17, 2021

EFK Stack using FluentBit

In this example we will use Fluent Bit to collect CPU stats and application logs, store them in the Elasticsearch DB, and finally view them on the Kibana UI for monitoring.

We are using a Windows machine; you will need to download the respective Fluent Bit package for your OS version.

Step 1:- Install Fluent Bit. You can download it from the location below.

https://docs.fluentbit.io/manual/installation/windows

Add the following directory to your user and system PATH:

C:\Program Files\td-agent-bit\bin

Now let's start Fluent Bit using the command below.

C:\Program Files\td-agent-bit\bin\fluent-bit.exe -i dummy -o stdout

You will be able to see the below output. This indicates that your Fluent Bit is running.

C:\Users\Siddhartha>fluent-bit.exe -i dummy -o stdout

Fluent Bit v1.7.4

* Copyright (C) 2019-2021 The Fluent Bit Authors

* Copyright (C) 2015-2018 Treasure Data

* Fluent Bit is a CNCF sub-project under the umbrella of Fluentd

* https://fluentbit.io

[2021/05/17 16:34:09] [ info] [engine] started (pid=8064)

[2021/05/17 16:34:09] [ info] [storage] version=1.1.1, initializing...

[2021/05/17 16:34:09] [ info] [storage] in-memory

[2021/05/17 16:34:09] [ info] [storage] normal synchronization mode, checksum disabled, max_chunks_up=128

[2021/05/17 16:34:09] [ info] [sp] stream processor started

[0] dummy.0: [1621249450.913351900, {"message"=>"dummy"}]

[1] dummy.0: [1621249451.914558700, {"message"=>"dummy"}]

[2] dummy.0: [1621249452.915630300, {"message"=>"dummy"}]

[3] dummy.0: [1621249453.916200700, {"message"=>"dummy"}]

[4] dummy.0: [1621249454.917336500, {"message"=>"dummy"}]

[0] dummy.0: [1621249455.905589500, {"message"=>"dummy"}]

[1] dummy.0: [1621249456.906112000, {"message"=>"dummy"}]

[2] dummy.0: [1621249457.906381400, {"message"=>"dummy"}]

[3] dummy.0: [1621249458.907409500, {"message"=>"dummy"}]

[4] dummy.0: [1621249459.907748200, {"message"=>"dummy"}]

On Windows, if you want to check which input plugins you have, use the command below:

C:\Users\Siddhartha>fluent-bit --help

Inputs

  tail                  Tail files

  dummy                 Generate dummy data

  statsd                StatsD input plugin

  winlog                Windows Event Log

  tcp                   TCP

  forward               Fluentd in-forward

  random                Random

Now let's try to tail the log file using Fluent Bit.

We have our log file at the location C:/springboot-log/spring-boot-eflk.log (as configured in application.properties). Execute this command from the prompt:

C:\Users\Siddhartha>fluent-bit -i tail -p path=C:/springboot-log/spring-boot-eflk.log -o stdout

Fluent Bit v1.7.4

* Copyright (C) 2019-2021 The Fluent Bit Authors

* Copyright (C) 2015-2018 Treasure Data

* Fluent Bit is a CNCF sub-project under the umbrella of Fluentd

* https://fluentbit.io

[2021/05/17 16:37:29] [ info] [engine] started (pid=9892)

[2021/05/17 16:37:29] [ info] [storage] version=1.1.1, initializing...

[2021/05/17 16:37:29] [ info] [storage] in-memory

[2021/05/17 16:37:29] [ info] [storage] normal synchronization mode, checksum disabled, max_chunks_up=128

[2021/05/17 16:37:29] [ info] [sp] stream processor started

Or you can also update fluent-bit.conf by adding the following lines.

[INPUT]
    Name        tail
    Path        C:/springboot-log/spring-boot-eflk.log

[OUTPUT]
    Name   stdout
    Match  *

and now execute this command.

C:\Users\Siddhartha>fluent-bit -c C:\fluent-bit-conf-files\fluent-bit.conf

Fluent Bit v1.7.4

* Copyright (C) 2019-2021 The Fluent Bit Authors

* Copyright (C) 2015-2018 Treasure Data

* Fluent Bit is a CNCF sub-project under the umbrella of Fluentd

* https://fluentbit.io

[2021/05/17 16:40:23] [ info] [engine] started (pid=1704)

[2021/05/17 16:40:23] [ info] [storage] version=1.1.1, initializing...

[2021/05/17 16:40:23] [ info] [storage] in-memory

[2021/05/17 16:40:23] [ info] [storage] normal synchronization mode, checksum disabled, max_chunks_up=128

[2021/05/17 16:40:23] [ info] [sp] stream processor started


Now let's write to the log using the application and check that we are able to see the output for the log on the console.

Hit the below URLs to generate log entries in C:/springboot-log/spring-boot-eflk.log:

http://localhost:9898/siddhu

http://localhost:9898/exception

You will be able to see the change in the prompt as shown below


for http://localhost:9898/siddhu

C:\Users\Siddhartha>fluent-bit  -c C:\fluent-bit-conf-files\fluent-bit.conf -R C:\fluent-bit-conf-files\parsers.conf

Fluent Bit v1.7.4

* Copyright (C) 2019-2021 The Fluent Bit Authors

* Copyright (C) 2015-2018 Treasure Data

* Fluent Bit is a CNCF sub-project under the umbrella of Fluentd

* https://fluentbit.io

[2021/05/17 17:21:31] [error] [parser] parser named 'apache' already exists, skip.

[2021/05/17 17:21:31] [ info] [engine] started (pid=5680)

[2021/05/17 17:21:31] [ info] [storage] version=1.1.1, initializing...

[2021/05/17 17:21:31] [ info] [storage] in-memory

[2021/05/17 17:21:31] [ info] [storage] normal synchronization mode, checksum disabled, max_chunks_up=128

[2021/05/17 17:21:31] [ info] [sp] stream processor started

[0] tail.0: [1621252302.456784000, {"log"=>"2021-05-17 17:21:42.438  INFO 7444 --- [http-nio-9898-exec-8] c.s.ElkBeatSpringbootExampleApplication  : response found : Simple data message showing success call :- Mon May 17 17:21:42 IST 2021"}]

For http://localhost:9898/exception a similar tail entry appears containing the exception stack trace.

Now hit the below URL to check the indices in Elasticsearch:

http://localhost:9200/_cat/indices?v&pretty

Note :- You must have the Elasticsearch DB running on your system.
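For log entries to show up in those indices, fluent-bit.conf also needs an Elasticsearch output section in addition to (or instead of) the stdout one; a minimal sketch, assuming Elasticsearch is listening locally on port 9200 and using an illustrative index name:

[OUTPUT]
    Name   es
    Match  *
    # local Elasticsearch instance running on this machine
    Host   127.0.0.1
    Port   9200
    Index  spring-boot-eflk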


Now go to Kibana, start it, open the URL, and configure an index pattern for these Fluent Bit indices.

Before starting Kibana we need to update the kibana.yml file inside the config folder below, which tells Kibana where your Elasticsearch DB is.

C:\kibana-7.12.1-windows-x86_64\config

Uncomment this line:

# The URLs of the Elasticsearch instances to use for all your queries.
elasticsearch.hosts: ["http://localhost:9200"]

C:\kibana-7.12.1-windows-x86_64\bin>kibana

Now generate the exception again and you will find the data inside Kibana.