Java Spring Boot Mastery

📌 Why Kafka?

Kafka is a distributed event streaming platform used to build real-time data pipelines and streaming apps. It’s designed to be fault-tolerant, scalable, and high-throughput.

Spring Boot integrates seamlessly with Kafka via Spring for Apache Kafka.

⚙️ 1. Add Dependencies

👉 Maven

<dependency>
  <groupId>org.springframework.kafka</groupId>
  <artifactId>spring-kafka</artifactId>
</dependency>
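
If the project uses Gradle instead of Maven, the equivalent dependency would be the following (the version is managed by Spring Boot's dependency management in both cases):

implementation 'org.springframework.kafka:spring-kafka'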

🛠️ 2. Kafka Setup in application.yml

spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      group-id: demo-group
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
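
The producer and listener in the next steps also read a custom topic.name.producer property, which you would add to the same file. A minimal sketch (the property key matches the code below; the demo-topic value is just an assumed example):

topic:
  name:
    producer: demo-topic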

✍️ 3. Create a Kafka Producer

@Service
public class KafkaProducerService {

  private final KafkaTemplate<String, String> kafkaTemplate;

  @Value("${topic.name.producer}")
  private String topicName;

  public KafkaProducerService(KafkaTemplate<String, String> kafkaTemplate) {
    this.kafkaTemplate = kafkaTemplate;
  }

  public void sendMessage(String message) {
    kafkaTemplate.send(topicName, message);
    System.out.println("Message sent: " + message);
  }
}

Description:

  • Uses Spring’s KafkaTemplate to send a message.
  • @Value injects the topic name from application.yml (the topic.name.producer property).
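
KafkaTemplate.send() is asynchronous. If you want to log delivery results instead of firing and forgetting, here is a sketch of attaching a callback (assumes Spring for Apache Kafka 3.x, where send() returns a CompletableFuture<SendResult<K, V>>; the method name is illustrative):

public void sendMessageWithCallback(String message) {
  kafkaTemplate.send(topicName, message).whenComplete((result, ex) -> {
    if (ex == null) {
      // RecordMetadata tells us where the record landed
      System.out.println("Sent to partition " + result.getRecordMetadata().partition()
          + " at offset " + result.getRecordMetadata().offset());
    } else {
      System.err.println("Failed to send: " + ex.getMessage());
    }
  });
}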

📥 4. Create a Kafka Consumer

@Service
public class KafkaConsumerService {

  @KafkaListener(topics = "${topic.name.producer}", groupId = "demo-group")
  public void listen(String message) {
    System.out.println("Received message: " + message);
  }
}

Description:

  • @KafkaListener listens to the specified topic.
  • Automatically picks up messages from the broker.
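
If the listener needs message metadata as well as the payload, it can accept the full ConsumerRecord instead of a String. A small sketch (method name is illustrative; uses org.apache.kafka.clients.consumer.ConsumerRecord):

@KafkaListener(topics = "${topic.name.producer}", groupId = "demo-group")
public void listenWithMetadata(ConsumerRecord<String, String> record) {
  System.out.println("Received '" + record.value() + "' from partition "
      + record.partition() + " at offset " + record.offset());
}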

🌐 5. REST Controller to Publish Messages

@RestController
@RequestMapping("/api/kafka")
public class KafkaController {

  private final KafkaProducerService producer;

  public KafkaController(KafkaProducerService producer) {
    this.producer = producer;
  }

  @PostMapping("/publish")
  public ResponseEntity<String> sendMessage(@RequestParam("message") String message) {
    producer.sendMessage(message);
    return ResponseEntity.ok("Message published to Kafka");
  }
}
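
If you prefer sending the message in the request body rather than as a query parameter, a purely illustrative variant of the endpoint:

  @PostMapping("/publish-body")
  public ResponseEntity<String> sendMessageFromBody(@RequestBody String message) {
    producer.sendMessage(message);
    return ResponseEntity.ok("Message published to Kafka");
  }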

🧪 6. Testing the Kafka Flow

  1. Start Kafka (via Docker or locally; see the example command after this list).
  2. Run the Spring Boot app.
  3. Send a message:
     curl -X POST "http://localhost:8080/api/kafka/publish?message=hello"
  4. Check the console for both the producer and consumer logs.
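
If you don't already have a broker available, one quick option is the official apache/kafka Docker image, which starts a single-node KRaft broker on localhost:9092 (the tag and flags here are just an example):

docker run -d --name kafka -p 9092:9092 apache/kafka:latest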

🧠 Concepts Recap

Concept          Description
Producer         Sends data to a topic
Consumer         Subscribes to a topic and processes messages
Topic            Logical stream of data
KafkaTemplate    Spring abstraction to send messages
@KafkaListener   Method-level listener for messages

📌 When to Use Kafka with Spring Boot?

  • Event-Driven Microservices
  • Decoupled Asynchronous Communication
  • Data Streaming Pipelines
  • Real-Time Analytics
  • Audit and Logging Events

Next Up: Part 23 – Scheduling and Asynchronous Tasks with Spring Boot
