📌 Why Kafka?
Kafka is a distributed event streaming platform used to build real-time data pipelines and streaming apps. It’s designed to be fault-tolerant, scalable, and high-throughput.
Spring Boot integrates seamlessly with Kafka via Spring for Apache Kafka.
⚙️ 1. Add Dependencies
👉 Maven
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
No version is needed here; Spring Boot's dependency BOM manages the spring-kafka version for you.
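The REST endpoint added in step 5 also relies on spring-boot-starter-web; include it as well if your project doesn't already pull it in:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>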
🛠️ 2. Kafka Setup in application.yml
spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      group-id: demo-group
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
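The producer and listener below also read a custom topic.name.producer property. It is not one of Spring's spring.kafka keys, so define it yourself in the same file (the value demo-topic is just an assumed example name):

topic:
  name:
    producer: demo-topic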
✍️ 3. Create a Kafka Producer
@Service
public class KafkaProducerService {

    private final KafkaTemplate<String, String> kafkaTemplate;

    @Value("${topic.name.producer}")
    private String topicName;

    public KafkaProducerService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage(String message) {
        kafkaTemplate.send(topicName, message);
        System.out.println("Message sent: " + message);
    }
}
Description:
- Uses Spring's KafkaTemplate to send a message.
- @Value injects the topic name from application.yml.
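Note that kafkaTemplate.send() is asynchronous, so the println above only confirms the call was made, not that the broker accepted the record. If you want delivery confirmation, you can attach a callback to the future that send() returns. A minimal sketch for the same service class, assuming Spring for Apache Kafka 3.x (where send() returns a CompletableFuture<SendResult>); the method name is hypothetical:

public void sendMessageWithCallback(String message) {
    // In Spring Kafka 3.x, send() returns CompletableFuture<SendResult<String, String>>
    kafkaTemplate.send(topicName, message).whenComplete((result, ex) -> {
        if (ex == null) {
            // RecordMetadata tells us where the broker stored the record
            System.out.println("Sent to partition " + result.getRecordMetadata().partition()
                    + " at offset " + result.getRecordMetadata().offset());
        } else {
            System.out.println("Failed to send: " + ex.getMessage());
        }
    });
}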
📥 4. Create a Kafka Consumer
@Service
public class KafkaConsumerService {

    @KafkaListener(topics = "${topic.name.producer}", groupId = "demo-group")
    public void listen(String message) {
        System.out.println("Received message: " + message);
    }
}
Description:
- @KafkaListener listens to the specified topic.
- Automatically picks up messages from the broker.
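If the plain payload isn't enough, the listener method can instead accept the full ConsumerRecord (from org.apache.kafka.clients.consumer) to inspect the key, partition, and offset. A small sketch of such an alternative listener, using the same topic and group:

@KafkaListener(topics = "${topic.name.producer}", groupId = "demo-group")
public void listenWithMetadata(ConsumerRecord<String, String> record) {
    // key() may be null because the producer above sends only a value
    System.out.println("Received '" + record.value()
            + "' from partition " + record.partition()
            + " at offset " + record.offset());
}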
🌐 5. REST Controller to Publish Messages
@RestController
@RequestMapping("/api/kafka")
public class KafkaController {

    private final KafkaProducerService producer;

    public KafkaController(KafkaProducerService producer) {
        this.producer = producer;
    }

    @PostMapping("/publish")
    public ResponseEntity<String> sendMessage(@RequestParam("message") String message) {
        producer.sendMessage(message);
        return ResponseEntity.ok("Message published to Kafka");
    }
}
🧪 6. Testing the Kafka Flow
1. Start Kafka (via Docker or locally; one Docker option is sketched after this list).
2. Run the Spring Boot app.
3. Send a message:
curl -X POST "http://localhost:8080/api/kafka/publish?message=hello"
4. Check the console for both producer and consumer logs.
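For step 1, one convenient route (an assumption, not the only way to run Kafka) is the official apache/kafka Docker image, which starts a single-node KRaft broker listening on localhost:9092. With the broker default auto.create.topics.enable=true, the topic is created the first time a message is sent to it:

docker run -d --name kafka -p 9092:9092 apache/kafka:3.7.0

The version tag is only an example; any recent tag of the image should behave the same way.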
🧠 Concepts Recap
| Concept | Description |
|---|---|
| Producer | Sends data to a topic |
| Consumer | Subscribes to a topic and processes messages |
| Topic | Logical stream of data |
| KafkaTemplate | Spring abstraction to send messages |
| @KafkaListener | Method-level listener for messages |
📌 When to Use Kafka with Spring Boot?
- Event-Driven Microservices
- Decoupled Asynchronous Communication
- Data Streaming Pipelines
- Real-Time Analytics
- Audit and Logging Events
✅ Next Up: Part 23 – Scheduling and Asynchronous Tasks with Spring Boot