
Introducing the Kafka-CDI Library

May 31, 2018
Matthias Wessendorf
Related topics: Event-Driven, Java, Kubernetes
Related products: Red Hat OpenShift Container Platform


Using Apache Kafka in modern event-driven applications is pretty popular. For a better cloud-native experience with Apache Kafka, it's highly recommended to check out Red Hat AMQ Streams, which offers easy installation and management of an Apache Kafka cluster on Red Hat OpenShift.

    This article shows how the Kafka-CDI library can handle difficult setup tasks and make creating Kafka-powered event-driven applications for MicroProfile and Jakarta EE very easy.

    Vanilla Apache Kafka Consumers

    While the concepts of Kafka producers and consumers are simple, writing the actual code can be quite cumbersome and requires some boilerplate code such as the following:

    final Properties props = new Properties();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "my-cluster-kafka:9092");
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    final KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
    ...
    consumer.subscribe(Collections.singleton("topic"));
    ...
    final ConsumerRecords<String, String> records = consumer.poll(500);
    for (final ConsumerRecord<String, String> record : records) {
        logger.info(record.value());
    }

    The above code shows a simple configuration of a KafkaConsumer, which leaves the developer with a few tasks, such as manually defining the actual types of the key and the value of the consumed records. Besides that, KafkaConsumer instances are not thread-safe, so the developer also has to manage threading using Java's concurrency APIs.
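    Handling this typically means confining the consumer to a single thread. For illustration, here is a minimal sketch of that extra boilerplate, reusing the consumer and logger from the snippet above; the single-threaded executor is one common way to keep all consumer interaction on one thread:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    ...
    final ExecutorService executor = Executors.newSingleThreadExecutor();
    // KafkaConsumer is not thread-safe: confine all interaction to this one thread.
    executor.submit(() -> {
        consumer.subscribe(Collections.singleton("topic"));
        while (!Thread.currentThread().isInterrupted()) {
            final ConsumerRecords<String, String> records = consumer.poll(500);
            for (final ConsumerRecord<String, String> record : records) {
                logger.info(record.value());
            }
        }
        consumer.close();
    });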

    CDI Extensions to the Rescue

    In various enterprise Java communities, such as Eclipse MicroProfile or Jakarta EE, CDI is the natural choice for managing dependencies and their configuration. The API also allows developers to create extensions that can leverage the entire "Java EE" lifecycle and its powerful platform APIs.
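    To illustrate the mechanism (this is a generic portable extension, not Kafka-CDI itself), an extension implements the Extension marker interface, observes container lifecycle events, and is registered in a META-INF/services/javax.enterprise.inject.spi.Extension file:

    import javax.enterprise.event.Observes;
    import javax.enterprise.inject.spi.Extension;
    import javax.enterprise.inject.spi.ProcessAnnotatedType;

    public class LoggingExtension implements Extension {

        // Invoked by the CDI container for every type discovered at bootstrap;
        // an extension like Kafka-CDI would inspect annotations at this point.
        <T> void processAnnotatedType(@Observes final ProcessAnnotatedType<T> pat) {
            System.out.println("Discovered type: " + pat.getAnnotatedType().getJavaClass().getName());
        }
    }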

    The Kafka-CDI library from the AeroGear project is such an extension, and it makes creating Kafka-powered applications for MicroProfile or Jakarta EE very easy.

    Kafka-CDI in Action

    The setup is simple and requires just a few lines of Maven coordinates:

    <dependency>
      <groupId>org.aerogear.kafka</groupId>
      <artifactId>kafka-cdi-extension</artifactId>
      <version>0.1.0</version>
    </dependency>

    Consumers with CDI

    Creating CDI-managed beans that act as message consumers is now quite easy and requires only a small bit of code:

    @KafkaConfig(bootstrapServers = "#{KAFKA_SERVICE_HOST}:#{KAFKA_SERVICE_PORT}")
    public class MyAwesomeConsumer {
    
        private static final Logger LOGGER = LoggerFactory.getLogger(MyAwesomeConsumer.class);
    
        @Consumer(topics = "topic", groupId = "demo-group")
        public void onMessage(final String key, final String value) {
            LOGGER.info("We got this value: " + value);
        }
    }

    With a single @Consumer annotation, the bean is turned into a message consumer. The Kafka-CDI extension handles all the configuration behind the scenes, such as deserializing the key and value of the Kafka record to the declared types, as well as the threading. Each application needs one @KafkaConfig annotation to identify the list of available bootstrap servers. The values for topics, groupId, and bootstrapServers, as in this example, can be resolved using environment variables or system properties.
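    For example, the topic and group could themselves come from the environment, assuming the same #{...} placeholder syntax shown above for bootstrapServers also applies to these attributes (the variable names here are illustrative):

    @Consumer(topics = "#{DEMO_TOPIC}", groupId = "#{DEMO_GROUP}")
    public void onMessage(final String key, final String value) {
        LOGGER.info("We got this value: " + value);
    }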

    Producers with CDI

    Writing producers is easy as well:

    @Producer
    private SimpleKafkaProducer<String, String> myproducer;
    ...
    myproducer.send("topic", myKey, myValue);
    

    Any bean can be injected with a convenient producer (SimpleKafkaProducer) that can be used in any method for sending messages to the Kafka cluster. Similar to the consumer example, the CDI extension handles the type serialization for the key and value.
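    For example, such a producer might be injected into a JAX-RS resource (a hypothetical sketch; the class, path, and topic names are illustrative, and a @KafkaConfig annotation is still needed somewhere in the application):

    @Path("/orders")
    public class OrderResource {

        @Producer
        private SimpleKafkaProducer<String, String> producer;

        @POST
        public Response submit(final String payload) {
            // The extension takes care of serializing key and value.
            producer.send("orders", UUID.randomUUID().toString(), payload);
            return Response.ok().build();
        }
    }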

    KStreams with CDI

    Last but not least, the library has some initial support for working with the Kafka Streams API:

    @KafkaStream(input = "input.topic", output = "output.topic")
    public KTable<String, Long> successfulMessagesPerJobTransformer(final KStream<String, String> source) {
      return source.filter((key, value) -> value.equals("Success"))
        .groupByKey()
        .count("successCounter");
    }

    When a method is annotated with @KafkaStream, the library executes the annotated method against the KStream object it passes in. The input and output attributes of the @KafkaStream annotation tell the library which topic to read from and which topic the result of the stream processing job should be written to. Type serialization and deserialization are again handled by the framework for the convenience of the developer.

    Supported Types

    The library supports several options for JSON serialization and deserialization. It comes with a Serializer and Deserializer for the javax.json.Json type, and it has a fallback mechanism that automatically uses the Jackson library for any unknown type. Support for Apache Avro is not yet implemented, but it is on the roadmap.
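    As an illustration of the Jackson fallback, a consumer could receive a plain POJO directly (the Order type and topic here are hypothetical; per the fallback described above, the extension would deserialize the JSON payload into it):

    public class Order {
        public String id;
        public int quantity;
    }

    @Consumer(topics = "orders", groupId = "demo-group")
    public void onOrder(final String key, final Order order) {
        LOGGER.info("Order " + order.id + " x " + order.quantity);
    }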

    Conclusion

    This article was an introduction to the Kafka-CDI library in action. Only a few lines of code are required to get started with the library. The focus is on user simplicity, while all the difficult setup tasks are handled by the CDI extension.

    Using the library, it's a no-brainer to write event-driven applications using MicroProfile implementations such as Thorntail.

    Thanks to Gunnar Morling for valuable input!

    Last updated: February 22, 2024
