Apache Kafka with Spring Boot Microservice – JavaDream
July 14, 2020 | Spring Boot complete tutorial with example
Hi all, in this article we will see how to integrate Apache Kafka with Spring Boot. As we all know, Kafka is an open-source messaging queue that is scalable, fault-tolerant, and offers high performance.
Before using Kafka, let's look at a real-world scenario that shows why we use Apache Kafka.
Suppose we buy a product from an online store and want to give feedback on that product.
1. If our application uses a synchronous call to save the feedback, the user has to wait until our API responds. This is bad practice: the user does not care about the background process, yet has to wait for the success message until our API call completes. And if an exception occurs during the API call, the user gets a message like "something went wrong, please try again" only after waiting 4-5 seconds (the time it takes to get a response from the API).
2. If our application uses an asynchronous call to update the feedback, the user gets a response message immediately and the API call runs in the background. But what if an exception occurs during that call? We have already shown the user a success response, so there is no good way to recover.
So neither the synchronous nor the asynchronous approach works well in this scenario. To overcome this we use a messaging queue: when we receive a feedback-update request, we put the request payload on a topic, and as soon as the queue acknowledges it we return a success message to the user.
Note: getting an acknowledgment from the message queue takes far less time than waiting for a synchronous API call to complete.
And if an exception occurs while processing the message, we can implement retry logic on the queue side.
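The decoupling described above can be sketched in plain Java with an in-memory `BlockingQueue` standing in for the Kafka topic (an illustration only; the class name and return strings are made up for this sketch, and real Kafka adds durability and retries on top of this idea):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of the queue-based flow: the request handler only enqueues the
// feedback payload and returns a success message immediately; a background
// worker does the slow processing later and can retry on failure.
public class FeedbackQueueDemo {

    private final BlockingQueue<String> topic = new LinkedBlockingQueue<>();

    // Called by the web layer: enqueue and acknowledge right away.
    public String submitFeedback(String payload) {
        topic.offer(payload);             // fast: just the queue's acknowledgment
        return "SUCCESS";                 // user sees success immediately
    }

    // Background consumer: the real, slow work happens here.
    public String processNext() {
        String payload = topic.poll();
        return "processed: " + payload;   // on failure we could re-enqueue (retry)
    }

    public static void main(String[] args) {
        FeedbackQueueDemo demo = new FeedbackQueueDemo();
        System.out.println(demo.submitFeedback("Great product!")); // SUCCESS
        System.out.println(demo.processNext()); // processed: Great product!
    }
}
```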
To integrate Apache Kafka with Spring Boot we first have to install it. It is open source, so you can download it easily. Here I am installing it on Ubuntu.
Below are the steps to install Apache Kafka on an Ubuntu machine.
- Go to the Apache Kafka website
- Choose any version from Binary Downloads and download the .tgz file.
- Extract this .tgz file.
- Always remember: you have to start ZooKeeper before starting your Kafka server.
- Go to the bin folder and start ZooKeeper.
- Stay in the bin folder and run your Kafka server.
Following the steps above:
1- Go to the Apache Kafka website and download the binary version.

2- Extract the .tgz file using the command below.
tar -xvf kafka_2.12-2.5.0.tgz
3- Now go to your Apache Kafka bin folder and run the command below to start ZooKeeper.
./zookeeper-server-start.sh ../config/zookeeper.properties
4- Now run the command below to start your Kafka server.
./kafka-server-start.sh ../config/server.properties
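Optionally, you can create the demo topic used later in this tutorial up front and verify the broker is reachable (the topic name `demo` matches the one used in the code below; partition and replication counts here are just minimal single-broker defaults):

```shell
# Create the "demo" topic on the local broker (Kafka 2.2+ supports --bootstrap-server)
./kafka-topics.sh --create --topic demo --bootstrap-server localhost:9092 \
  --partitions 1 --replication-factor 1

# List topics to confirm the broker is up and the topic exists
./kafka-topics.sh --list --bootstrap-server localhost:9092
```

If you skip this, the broker's default auto-topic-creation setting will usually create the topic on first use.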
Your Apache Kafka server is now running. Next we create a Spring Boot project and integrate it with this Kafka server. In this example we build a simple producer-consumer setup: a sender writes a message, and a consumer reads it. To create the Spring Boot application, follow these steps:
- Create a Spring Boot Project and add a required dependency in the pom.xml file.
- Define Kafka related properties in your application.yml or application.properties file.
- Create a Producer class that writes messages to a Kafka topic.
- Create a Consumer class that reads messages from the Kafka topic.
- Create a Controller class with an endpoint to send a message using Postman or your frontend application.
Following the steps above:
1- Create a Spring Boot project and add the required dependencies to the pom.xml file. You can also copy these dependencies from the Maven Repository.
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
2- Define the Kafka-related properties in your application.yml or application.properties file.
As you know, you can create either an application.yml or an application.properties file. Some developers prefer application.properties and others prefer application.yml, so I am sharing both; use whichever you like.
application.properties
server.port=8000
spring.kafka.producer.bootstrap-servers=localhost:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.consumer.bootstrap-servers=localhost:9092
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.group-id=group_id
application.yml
server:
  port: 8000
spring:
  kafka:
    producer:
      bootstrap-servers: localhost:9092
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
    consumer:
      bootstrap-servers: localhost:9092
      group-id: group_id
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
In the file above we set the server port to 8000, meaning the Spring Boot application runs on port 8000, and we configure Kafka as both producer and consumer. In the producer section, bootstrap-servers defines the Kafka broker address; since I installed Kafka on my local machine, I use localhost:9092. The producer section has two more keys, key-serializer and value-serializer. If you know Kafka, you know it sends messages as serialized key-value pairs; since we send a simple String message here, we use StringSerializer.
Now for the consumer section: bootstrap-servers is the same as for the producer and defines the Kafka broker address. In Kafka we define a topic to send and receive messages; the sender writes messages to this topic and the consumer reads them from it. Because many consumers may read from the same topic, we define a group-id and assign it to the consumer. key-deserializer and value-deserializer are used to deserialize the messages sent by the producer.
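Since the configuration above revolves around keys and consumer groups, it may help to see why message keys matter: Kafka routes each keyed message to a partition deterministically, and partitions are then divided among the consumers of a group. The following is a simplified sketch of that routing idea (an assumption for illustration: Kafka's real default partitioner uses murmur2 hashing, not `String.hashCode()`, and the class name is made up):

```java
// Simplified illustration of key-based partition routing.
public class PartitionSketch {

    // Stand-in for Kafka's partitioner: same key always maps to the same
    // partition, which is what preserves per-key message ordering.
    public static int partitionFor(String key, int numPartitions) {
        return Math.abs(key.hashCode() % numPartitions);
    }

    public static void main(String[] args) {
        int partitions = 3;
        // Messages with the same key land on the same partition.
        System.out.println(
            partitionFor("user-42", partitions) == partitionFor("user-42", partitions)); // true
    }
}
```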
3- Create a Producer class that writes messages to a Kafka topic.
package com.vasu.SpringBootKafka.service;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class Producer {

    private static final Logger logger = LoggerFactory.getLogger(Producer.class);
    private static final String TOPIC = "demo";

    @Autowired
    private KafkaTemplate<String, Object> kafkaTemplate;

    public void sendMessage(String message) {
        logger.info("Sending message: {}", message);
        kafkaTemplate.send(TOPIC, message);
    }
}
In the class above we define a topic named demo and autowire a KafkaTemplate. The class simply writes the message to the demo topic using KafkaTemplate.
4- Now create a Consumer class that reads messages from the Kafka topic.
package com.vasu.SpringBootKafka.service;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class Consumer {

    private static final Logger logger = LoggerFactory.getLogger(Consumer.class);

    @KafkaListener(topics = "demo", groupId = "group_id")
    public void consume(String message) {
        logger.info("Consumed message: {}", message);
    }
}
In the class above we simply consume messages from the demo topic and print them to the console. The @KafkaListener annotation is what subscribes the method to the given topic.
5- Now create a Controller class with an endpoint to send a message using Postman or your frontend application.
package com.vasu.SpringBootKafka.controller;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import com.vasu.SpringBootKafka.service.Producer;

@RestController
@RequestMapping("/kafka")
public class KafkaController {

    private final Producer producer;

    @Autowired
    KafkaController(Producer producer) {
        this.producer = producer;
    }

    @PostMapping(value = "/publish")
    public void sendMessageToKafkaTopic(@RequestParam("message") String message) {
        this.producer.sendMessage(message);
    }
}
Here we simply define a controller class with one endpoint. Now run your Spring Boot application, open Postman, and send a POST request with a message parameter; then check your Eclipse console and you will see the message.
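If you prefer the command line over Postman, the same request can be sent with curl (assuming the application is running locally on port 8000 as configured above):

```shell
# POST to the /kafka/publish endpoint with the message as a query parameter
curl -X POST "http://localhost:8000/kafka/publish?message=Hello%20Kafka"
```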


You can also see all the messages on this topic in your Kafka server console. Just go to your Kafka bin folder and run the command below.
./kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic demo --from-beginning
Github URL – https://github.com/vasurajput/Spring-Boot-Web/tree/master/SpringBootKafka