Spring Boot - Consume Message Through Kafka, Save into ElasticSearch, and Plot into Grafana

Last Updated : 26 Mar, 2026

In modern applications, processing real-time data efficiently is key. This tutorial shows how to integrate Spring Boot with Kafka for message consumption, Elasticsearch for storing and indexing data, and Grafana for real-time visualization.

  • Consume messages from Kafka using Spring Boot.
  • Store and index consumed messages in Elasticsearch.
  • Visualize data in Grafana dashboards.

Why do we mention the version?

Version compatibility is critical when combining Elasticsearch with Spring Boot. If your Elasticsearch version does not match your Spring Boot version (or vice versa), you will run into problems configuring the two together. Below is the version compatibility matrix:

Spring Data Elasticsearch | Elasticsearch | Spring Framework | Spring Boot
------------------------- | ------------- | ---------------- | -----------
4.4.X                     | 7.17.3        | 5.3.X            | 2.7.X
4.3.X                     | 7.15.2        | 5.3.X            | 2.6.X
4.2.X                     | 7.12.0        | 5.3.X            | 2.5.X
4.1.X                     | 7.9.3         | 5.3.2            | 2.4.X
4.0.X                     | 7.6.2         | 5.2.12           | 2.3.X
3.2.X                     | 6.8.12        | 5.2.12           | 2.2.X
3.1.X                     | 6.2.2         | 5.1.19           | 2.1.X
3.0.X                     | 5.5.0         | 5.0.13           | 2.0.X
2.1.X                     | 2.4.0         | 4.3.25           | 1.5.X

Steps to Implement Kafka Producer, Consumer with Elasticsearch and Grafana

Step 1: Download Required Tools

Download the required software:

  • Elasticsearch
  • Apache Kafka

Download and extract both archives on your system.
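Once extracted, Kafka can be started with the scripts shipped in the distribution. The commands below are a sketch for a standard Kafka download on Linux/macOS (on Windows, use the .bat equivalents under bin\windows\); the "gfg" topic name matches the one used later in this tutorial:

```shell
# 1. Start ZooKeeper (required by classic Kafka distributions)
bin/zookeeper-server-start.sh config/zookeeper.properties

# 2. In a second terminal, start the Kafka broker on localhost:9092
bin/kafka-server-start.sh config/server.properties

# 3. Optionally pre-create the "gfg" topic used by this tutorial
bin/kafka-topics.sh --create --topic gfg --bootstrap-server localhost:9092 \
  --partitions 1 --replication-factor 1
```

If the topic is not pre-created, most local Kafka setups auto-create it on first use.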

pom.xml – Contains required dependencies.

XML
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.7.0</version>
        <relativePath/> 
    </parent>
    <groupId>com.example</groupId>
    <artifactId>Producer</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>Producer</name>
    <description>Demo project for Spring Boot</description>
    <properties>
        <java.version>1.8</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-devtools</artifactId>
            <scope>runtime</scope>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>

application.properties

server.port=1234
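The Kafka producer settings configured in Java later in this tutorial (KafkaProducerConfig) could alternatively be supplied through standard spring.kafka.* properties; a sketch:

```properties
# Alternative, properties-based producer configuration (equivalent to the
# Java-based KafkaProducerConfig in Step 3); standard Spring Boot keys
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
```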

Step 2: Create Kafka Producer Application

Create a Spring Boot project named Producer. ProducerApplication.java is the main class that runs the application.

ProducerApplication.class

Java
package com.example.demo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class ProducerApplication {
    public static void main(String[] args) {
        SpringApplication.run(ProducerApplication.class, args);
    }
  
}

Step 3: Create the Configuration class

This class provides the Kafka producer configuration: the bootstrap server address and the key/value serializers. KafkaProducerConfig.java contains this configuration.

KafkaProducerConfig.java 

Java
package com.example.demo.config;

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;
import com.example.demo.model.User;
@Configuration
public class KafkaProducerConfig {
  
    @Bean
    public ProducerFactory<String, User> userProducerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG,"localhost:9092");
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, 
                        StringSerializer.class.getName());
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, 
                        JsonSerializer.class.getName());
        return new DefaultKafkaProducerFactory<>(configProps);
    }
  
    @Bean
    public KafkaTemplate<String, User> userKafkaTemplate() {
        return new KafkaTemplate<>(userProducerFactory());
    }
  
}

Step 4: Create User class

Model class where we store the user information.

User.java

Java
package com.example.demo.model;

public class User {
    int id;
    String name;
    String pdate;
    public User() {
        super();
    }
    public User(int id, String name, String pdate) {
        super();
        this.id = id;
        this.name = name;
        this.pdate = pdate;
    }
    public int getId() {
        return id;
    }
    public void setId(int id) {
        this.id = id;
    }
    public String getName() {
        return name;
    }
    public void setName(String name) {
        this.name = name;
    }
    public String getPdate() {
        return pdate;
    }
    public void setPdate(String pdate) {
        this.pdate = pdate;
    }
}

Step 5: Create Service class

Create a Service class to hold the sending logic. The service uses KafkaTemplate to send User data to the "gfg" topic.

KafkaService.java

Java
package com.example.demo.service;

import java.util.List;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;
import com.example.demo.model.User;

@Service
public class KafkaService {
    private final Logger LOG = LoggerFactory.getLogger(KafkaService.class);
    @Autowired
    private KafkaTemplate<String, User> kafkaTemplate;
    String kafkaTopic = "gfg";
    public void send(User user) {
        LOG.info("Sending User Json Serializer : {}", user);
        kafkaTemplate.send(kafkaTopic, user);
    }
    public void sendList(List<User> userList) {
        LOG.info("Sending UserList Json Serializer : {}", userList);
        for (User user : userList) {
            kafkaTemplate.send(kafkaTopic, user);
        }
    }
}

Step 6: Create the controller class.

REST controller that exposes endpoints for sending messages to Kafka.

ProducerController.java

Java
package com.example.demo.controller;

import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

import com.example.demo.model.User;
import com.example.demo.service.KafkaService;

@RestController
public class ProducerController {
    @Autowired
    KafkaService kafkaProducer;
  
    @PostMapping("/producer")
    public String sendMessage(@RequestBody User user) {
        kafkaProducer.send(user);
        return "Message sent successfully to the Kafka topic gfg";
    }
  
    @PostMapping("/producerlist")
    public String sendMessage(@RequestBody List<User> user) {
        kafkaProducer.sendList(user);
        return "Message sent successfully to the Kafka topic gfg";
    }
}
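To exercise these endpoints, any payload matching the User model (id, name, pdate) can be posted. A sketch using curl, assuming the Producer app is running on port 1234 as configured above (the field values are sample data):

```shell
# Send a single user to the "gfg" topic
curl -X POST http://localhost:1234/producer \
  -H "Content-Type: application/json" \
  -d '{"id": 1, "name": "geek", "pdate": "2024-01-15"}'

# Send a list of users
curl -X POST http://localhost:1234/producerlist \
  -H "Content-Type: application/json" \
  -d '[{"id": 2, "name": "gfg", "pdate": "2024-01-16"},
       {"id": 3, "name": "kafka", "pdate": "2024-01-17"}]'
```

The same requests can be issued from Postman by pasting the JSON bodies into a POST request.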

Create Kafka Consumer and Configure the Elasticsearch Application

Create another spring boot application named ElasticConsumer.

Step 1. Create Application class.

Create the Application class, the main class that runs the consumer application.

ConsumerApplication.java

Java
package com.example.demo;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import com.example.demo.model.User;
import com.example.demo.service.KafkaUserService;

@SpringBootApplication
@RestController
public class ConsumerApplication {
    @Autowired
    KafkaUserService kafkaUserService;
  
    public static void main(String[] args) {
        SpringApplication.run(ConsumerApplication.class, args);
    }
  
    @KafkaListener(topics = "gfg", groupId = "gfg-group")
    public void listen(User user) {
        System.out.println("Received User information : " + user.toString());
        try {
            kafkaUserService.saveUser(user);
        } catch (Exception e) {
            e.printStackTrace();    
        }
    }
  
    @GetMapping("/getElasticUserFromKafka")
    public Iterable<User> findAllUser() {
        return kafkaUserService.findAllUsers();
    }
}

Step 2. Create Config class.

This class Contains Kafka consumer configuration.

KafkaConsumerConfig.java

Java
package com.example.demo.config;

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;
import com.example.demo.model.User;

@EnableKafka
@Configuration
public class KafkaConsumerConfig {
    @Bean
    public ConsumerFactory<String, User> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "gfg-group");
        // Deserialize keys as Strings and values as JSON into User objects
        return new DefaultKafkaConsumerFactory<>(props,
            new StringDeserializer(), new JsonDeserializer<>(User.class));
    }
  
    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, User>
      kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, User> factory =
          new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
  
}

Step 3: Create User class

Model class used for storing data in Elasticsearch. Note that pdate is declared here as a java.util.Date, while the producer's User sends it as a String.

Important annotations:

  • @Document : Defines the Elasticsearch index
  • @Id : Unique identifier for each document
  • @Field : Specifies the field type inside the document

User.java
Java
package com.example.demo.model;

import java.util.Date;
import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
import com.google.gson.Gson;

@Document(indexName = "kafkauser")
public class User {
    @Id
    int id;
    @Field(type = FieldType.Text, name = "name")
    String name;
    @Field(type = FieldType.Date, name = "pdate")
    Date pdate;
    public User() {
        super();
    }
    public User(int id, String name, Date pdate) {
        super();
        this.id = id;
        this.name = name;
        this.pdate = pdate;
    }
    public Date getPdate() {
        return pdate;
    }
    public void setPdate(Date pdate) {
        this.pdate = pdate;
    }
    public int getId() {
        return id;
    }
    public void setId(int id) {
        this.id = id;
    }
    public String getName() {
        return name;
    }
    public void setName(String name) {
        this.name = name;
    }
    @Override
    public String toString() {
        return new Gson().toJson(this);
    }
}
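Because the producer sends pdate as a String while this consumer model binds it to java.util.Date, the string format must be one the JSON deserializer can parse unambiguously; an ISO-style "yyyy-MM-dd" is assumed in this sketch. A minimal stdlib round-trip of that assumption:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class PdateFormatDemo {
    public static void main(String[] args) throws Exception {
        // The producer's User carries pdate as a String; for Elasticsearch's
        // date field type (and binding into java.util.Date) an unambiguous
        // pattern such as "yyyy-MM-dd" is assumed here.
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd");
        Date parsed = fmt.parse("2024-01-15");
        // Round-trip: formatting the parsed Date yields the original string
        System.out.println(fmt.format(parsed));
    }
}
```

If the producer sends a different pattern, adjust the pattern string (or configure Jackson's date format) accordingly.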

Step 4. Create Repository class

Repository interface used to perform operations with Elasticsearch.

KafkaUserRepository.java

Java
package com.example.demo.repository;

import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
import org.springframework.stereotype.Repository;
import com.example.demo.model.User;

@Repository
public interface KafkaUserRepository extends ElasticsearchRepository<User, Integer> {
  
}

Step 5. Create Service class

Service class that consumes Kafka messages and stores them in Elasticsearch.

  • This application receives messages from Kafka and saves them to Elasticsearch.

KafkaUserService.java

Java
package com.example.demo.service;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import com.example.demo.model.User;
import com.example.demo.repository.KafkaUserRepository;

@Service
public class KafkaUserService {
    @Autowired
    private KafkaUserRepository edao;
  
    public void saveUser(User user) {
        edao.save(user);
    }
  
    public Iterable<User> findAllUsers() {
        return edao.findAll();
    }
}

Step 6: Run Elasticsearch

Open Command Prompt and navigate to the Elasticsearch bin folder.

Run the following command (on Windows; on Linux/macOS use bin/elasticsearch):

elasticsearch.bat

After starting, open the browser and check:

http://localhost:9200

If Elasticsearch is running successfully, it will display cluster information in JSON format.

  • Elasticsearch must run before starting the consumer application.
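For reference, the root endpoint returns cluster information shaped roughly like the following (an illustrative sketch; the name, version, and other values depend on your installation):

```json
{
  "name" : "MY-MACHINE",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "7.17.3"
  },
  "tagline" : "You Know, for Search"
}
```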

Run Producer and ElasticConsumer Spring Application

Send JSON data using Postman. The data is sent by the Producer app to the ElasticConsumer. The ElasticConsumer console prints the data and saves it into Elasticsearch as JSON documents.

Producer App APIs:

  • Send single object -> http://localhost:1234/producer
  • Send list of objects -> http://localhost:1234/producerlist

ElasticConsumer app APIs

  • Fetch all records from Elasticsearch -> http://localhost:8080/getElasticUserFromKafka

Grafana Dashboard

The Grafana dashboard runs on http://localhost:3000 (Grafana's default port).
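To plot the indexed data, Grafana needs Elasticsearch added as a data source pointing at the kafkauser index, with pdate as the time field. A provisioning-file sketch (field names vary across Grafana versions; in older versions the index is set via a top-level database field instead of jsonData.index):

```yaml
# Sketch of provisioning/datasources/es.yaml for the local setup in this tutorial
apiVersion: 1
datasources:
  - name: Elasticsearch-KafkaUser
    type: elasticsearch
    url: http://localhost:9200
    jsonData:
      index: kafkauser      # index created by the consumer app (@Document)
      timeField: pdate      # date field from the User model
```

The same values can be entered manually under Configuration -> Data sources in the Grafana UI.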


Some Elasticsearch APIs

  • Show the records of an index -> GET http://localhost:9200/<index_name>/_search
  • Delete an index -> DELETE http://localhost:9200/<index_name>
  • List all indices -> GET http://localhost:9200/_cat/indices
  • Show the mapping (schema) of an index -> GET http://localhost:9200/<index_name>
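These APIs can be tried from the command line against the index this tutorial creates; a sketch with curl (requires Elasticsearch running on localhost:9200):

```shell
# Show documents indexed from Kafka
curl "http://localhost:9200/kafkauser/_search?pretty"

# Show the mapping (schema) of the index
curl "http://localhost:9200/kafkauser/_mapping?pretty"

# List all indices with headers
curl "http://localhost:9200/_cat/indices?v"

# Delete the index (irreversible)
curl -X DELETE "http://localhost:9200/kafkauser"
```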