Apache Kafka Streams

Getting Started with Kafka Streams: Stream Processing Made Easy

1. Introduction

Kafka Streams is a powerful library in the Apache Kafka ecosystem that enables real-time stream processing of data. With Kafka Streams, you can build robust, scalable applications that process, transform, and analyze data streams in a distributed and fault-tolerant manner. In this article, we'll explore the basics of Kafka Streams and walk through some code samples to get you started on your stream processing journey.

2. Setting up the Project

To begin, let's set up a new Maven or Gradle project and add the necessary dependencies for Kafka Streams. You can use your preferred build tool and IDE for this.

  • Maven Dependency

    ```xml
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-streams</artifactId>
        <version>2.8.0</version>
    </dependency>
    ```

  • Gradle Dependency

    ```groovy
    dependencies {
        implementation 'org.apache.kafka:kafka-streams:2.8.0'
    }
    ```

3. Creating a Simple Kafka Streams Application

Now, let's dive into writing a basic Kafka Streams application. In this example, we'll create a stream that reads from an input topic, performs a simple transformation, and writes to an output topic.

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import java.util.Properties;

public class SimpleKafkaStreamsApp {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "simple-streams-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

        StreamsBuilder builder = new StreamsBuilder();
        // Read from the input topic, uppercase each value, and write the result out
        KStream<String, String> inputStream = builder.stream("input-topic");
        KStream<String, String> transformedStream = inputStream.mapValues(value -> value.toUpperCase());
        transformedStream.to("output-topic");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Gracefully shut down the application
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

In this code snippet, we configure the necessary properties for our Kafka Streams application. We specify the application ID, bootstrap servers, and default serializers/deserializers for key and value. We also set some additional properties like `auto.offset.reset` and `enable.idempotence`.

Next, we create a `StreamsBuilder` and define the stream processing logic. We start by creating a stream from the "input-topic" using `builder.stream()`. Then, we apply a simple transformation using `mapValues()` to convert the values to uppercase. Finally, we write the transformed stream to the "output-topic" using `to()`.
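Instead of relying on the default serde configuration, you can also pass serdes explicitly at each topic boundary with `Consumed.with()` and `Produced.with()`. The sketch below shows the same uppercase pipeline written that way, with an added `filter()` step to drop empty records (the filter is our own illustrative addition, not part of the original example):

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class ExplicitSerdesTopology {

    public static Topology build() {
        StreamsBuilder builder = new StreamsBuilder();

        // Explicit serdes at the source, overriding any configured defaults
        KStream<String, String> input = builder.stream(
                "input-topic", Consumed.with(Serdes.String(), Serdes.String()));

        input.filter((key, value) -> value != null && !value.isEmpty()) // drop null/empty values
             .mapValues(value -> value.toUpperCase())
             .to("output-topic", Produced.with(Serdes.String(), Serdes.String()));

        return builder.build();
    }
}
```

Explicit serdes make the topology self-describing and avoid surprises when different topics carry differently typed data.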

After that, we create a `KafkaStreams` instance with the builder's built topology and properties. We start the stream processing by invoking `streams.start()`. To ensure a graceful shutdown, we add a shutdown hook to close the streams when the application terminates.
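A common variant of this shutdown pattern, used in the official Kafka quickstart, blocks the main thread on a `CountDownLatch` so the JVM stays alive until the shutdown hook has closed the streams client. A minimal sketch (the application ID and pass-through topology here are illustrative):

```java
import java.util.Properties;
import java.util.concurrent.CountDownLatch;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;

public class LatchShutdownApp {

    public static void main(String[] args) throws InterruptedException {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "latch-shutdown-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        builder.<String, String>stream("input-topic").to("output-topic"); // pass-through topology

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        CountDownLatch latch = new CountDownLatch(1);

        // The hook closes the streams client, then releases the waiting main thread
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            streams.close();
            latch.countDown();
        }));

        streams.start();
        latch.await(); // block until the shutdown hook runs
    }
}
```

Without the latch, `main` would return immediately after `start()`, and only non-daemon stream threads would keep the process alive; the latch makes the lifecycle explicit.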

4. Conclusion

Kafka Streams simplifies the development of real-time stream processing applications by providing a high-level DSL and powerful abstractions for working with data streams. In this article, we covered the basics of setting up a Kafka Streams application and processing a simple stream. Now you have a foundation to explore more advanced concepts and unleash the full potential of Kafka Streams in your stream processing projects.
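One useful next step is verifying a topology without a running broker. The `kafka-streams-test-utils` artifact (an assumed additional test dependency, same version as `kafka-streams`) provides `TopologyTestDriver` for exactly this; a sketch checking our uppercase topology:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.TopologyTestDriver;

public class TopologySmokeTest {

    public static void main(String[] args) {
        // Rebuild the uppercase topology from the article
        StreamsBuilder builder = new StreamsBuilder();
        builder.<String, String>stream("input-topic")
               .mapValues(value -> value.toUpperCase())
               .to("output-topic");

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "topology-smoke-test");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234"); // never contacted
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        try (TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
            TestInputTopic<String, String> in = driver.createInputTopic(
                    "input-topic", new StringSerializer(), new StringSerializer());
            TestOutputTopic<String, String> out = driver.createOutputTopic(
                    "output-topic", new StringDeserializer(), new StringDeserializer());

            in.pipeInput("key", "hello");
            System.out.println(out.readValue()); // expect HELLO
        }
    }
}
```

The driver feeds records through the topology synchronously, so transformations can be asserted deterministically in a unit test.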

Remember to consult the official Kafka Streams documentation and resources for more in-depth knowledge and advanced topics. Happy stream processing with Kafka Streams!