Data Streaming with WildFly Swarm and Apache Kafka

 

October 24, 2017
Ken Finnigan
Related topics: Event-Driven, Microservices
Related products: Streams for Apache Kafka, Red Hat OpenShift


    At the beginning of October, I attended JavaOne in San Francisco to present on WildFly Swarm and Apache Kafka. For those of you who weren't able to attend the session, or for those who did and saw firsthand the issues with the demo, I will be covering all the details of how the demo should work!

    The presentation material from JavaOne can be found here, and all the code for the demos is in GitHub.

    MiniShift Setup

    To get started with the demo, we need to install MiniShift. We also need the oc binary on our path, which MiniShift provides.

    Once installed, we want to start it with a bit more than the minimum resources:

    minishift start --cpus 3 --memory 4GB

    Once MiniShift has started and OpenShift is running, open up the console. You can either create a new project or use the default one created for you.
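
    The console can also be opened straight from the terminal with MiniShift's built-in command:

    minishift console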

    Apache Kafka Setup

    The first task is to set up Apache Kafka. Lucky for us, the EnMasse project has a handy OpenShift template we can use. Select Add to Project and then Import YAML/JSON. Paste in the raw text from the OpenShift template and click Create. When asked what you want to do, select Process the template.
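
    If you'd rather work from the terminal (after logging in with oc login, covered next), the same template can be processed with the oc client. The file name kafka-template.yaml below is just a placeholder for wherever you saved the EnMasse template:

    oc process -f kafka-template.yaml | oc create -f -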

    Head over to a terminal window and run:

    oc login

    entering developer as the username and password as the password.

    Once logged into OpenShift from the terminal run:

    oc get services

    This provides the details of all running services within OpenShift. If you don't see anything the first time, go back to the console and check whether the Kafka and ZooKeeper pods have started successfully.

    The important service we need is zookeeper: its cluster IP address is required when creating a Kafka topic.
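
    If you want to pull out just that address, a jsonpath query works too. The service name zookeeper here is an assumption based on the EnMasse template, so match it to whatever oc get services reported:

    oc get service zookeeper -o jsonpath='{.spec.clusterIP}'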

    Kafka Topic

    To see all the running pods in the OpenShift console, select Applications and then Pods in the UI. Select a Kafka pod and then its Terminal tab. A terminal for that pod should now be visible in the web browser, and we can create a topic with the following command:

    ./bin/kafka-topics.sh --create --topic fruit_topic --replication-factor 2 --partitions 3 --zookeeper 172.30.123.92:2181

    The ZooKeeper address in the command above is the cluster IP we saw from oc get services; yours will differ.
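
    To confirm the topic was created, you can list all topics from the same pod terminal, substituting your own ZooKeeper address:

    ./bin/kafka-topics.sh --list --zookeeper 172.30.123.92:2181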

    We've now successfully configured Kafka for data streaming; next we need some services to interact with it.

    WildFly Swarm

    We won't cover all aspects of the services we're creating, as they're detailed on GitHub; we will focus on the integration with Kafka.

    Let's start with a simple RESTful service to store fruit names in a database. FruitResource is a simple JAX-RS resource class that provides RESTful endpoints for GET, POST, and PUT. Each method interacts only with the data in the database; a rough sketch follows.
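
    As a sketch of what FruitResource looks like (the full version is in the GitHub repository; the persistence unit name and named query below are illustrative):

    import java.util.List;
    import javax.enterprise.context.ApplicationScoped;
    import javax.persistence.EntityManager;
    import javax.persistence.PersistenceContext;
    import javax.transaction.Transactional;
    import javax.ws.rs.Consumes;
    import javax.ws.rs.GET;
    import javax.ws.rs.POST;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    @Path("/fruits")
    @ApplicationScoped
    public class FruitResource {

      // The unit name is illustrative; the real one lives in the repo's persistence.xml.
      @PersistenceContext(unitName = "FruitsPU")
      private EntityManager em;

      // GET /fruits - return all fruits currently stored in the database.
      @GET
      @Produces(MediaType.APPLICATION_JSON)
      public List<Fruit> getAll() {
        // "Fruits.findAll" is a hypothetical named query on the Fruit entity.
        return em.createNamedQuery("Fruits.findAll", Fruit.class).getResultList();
      }

      // POST /fruits - persist a new fruit and return it with its generated id.
      @POST
      @Consumes(MediaType.APPLICATION_JSON)
      @Produces(MediaType.APPLICATION_JSON)
      @Transactional
      public Fruit create(Fruit fruit) {
        em.persist(fruit);
        return fruit;
      }
    }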

    To make it more interesting, we want to send an Event to Kafka. In Kafka, an Event is a combination of key, value, and timestamp. Each Event is persisted and cannot be altered.

    We will use a CDI extension from Aerogear to help us integrate with Kafka. It's fairly new but is actively being enhanced. First, we need to make it available to our project with the following Maven dependency:

    <dependency>
      <groupId>net.wessendorf.kafka</groupId>
      <artifactId>kafka-cdi-extension</artifactId>
      <version>0.0.11</version>
    </dependency>

    Produce an Event

    For our JAX-RS Resource to be able to send an event to Kafka, we need to provide some configuration:

    @KafkaConfig(bootstrapServers = "#{KAFKA_SERVICE_HOST}:#{KAFKA_SERVICE_PORT}")

    Here we use the environment variables that OpenShift injects for each service to find where Kafka is located. With this approach, the configuration is super easy.

    With Kafka configured, we now need a Producer with which to send events:

    @Producer
    private SimpleKafkaProducer<Integer, Fruit> producer;

    Since we're dealing with Fruit instances, we want to send an event that has an Integer key and a Fruit instance as the value. As a result, whenever a fruit is created we can send an event like so:

    producer.send("fruit_topic", fruit.getId(), fruit);

    A point of note is that the topic name we use when calling send() must match the topic name we created in Kafka earlier.
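
    Putting the pieces together, the creation endpoint ends up looking something like this sketch, simplified from the repository code:

    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    @Produces(MediaType.APPLICATION_JSON)
    @Transactional
    public Fruit create(Fruit fruit) {
      // Persist first so the fruit has a generated id to use as the event key.
      em.persist(fruit);
      // Publish the new fruit as an event; the topic name must match fruit_topic.
      producer.send("fruit_topic", fruit.getId(), fruit);
      return fruit;
    }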

    Finally, let's run our service in MiniShift! Navigate to /rest-data and run:

    mvn clean fabric8:deploy -Popenshift

    We can then access the OpenShift console and open the route that was created for our service. On the web page, we will see a list of fruits, where we can add new fruits or update existing names.

    Consume an Event

    First, we define the configuration for our consumer with the same @KafkaConfig approach we used for the producer.

    We then need a way to consume the events we receive from Kafka:

    @Consumer(topics = "fruit_topic", keyType = Integer.class, groupId = "fruit_processor")
    public void processFruit(final Integer key, final Fruit fruitData) {
      // Log the details of each Fruit event we receive from the topic.
      logger.error("We received: " + fruitData);
    }

    The key points in our use of @Consumer are that we define the same topic name as our producer, so we can receive the correct events, and that we provide a unique consumer group for Kafka.

    Finally, let's run our service in MiniShift! Navigate to /log-consumer and run:

    mvn clean fabric8:deploy -Popenshift

    All we're doing is logging the details of each Fruit instance we receive; the output can be viewed in the logs of the log-consumer service within the OpenShift console.
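
    If you prefer the terminal, the same log output can be followed with the oc client; log-consumer here assumes the deployment config created by fabric8:deploy shares the service's name:

    oc logs -f dc/log-consumer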

    In addition, you will notice that you only see messages in the log if you make changes in rest-data once log-consumer is running. That's because our consumer defaults to reading only messages that arrive after its initialization.

    It's also possible to replay every event that exists on the topic by adding offset = "earliest" to the @Consumer annotation, as shown below.
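
    Applied to our earlier consumer, that looks like:

    @Consumer(topics = "fruit_topic", keyType = Integer.class, groupId = "fruit_processor", offset = "earliest")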

    Conclusion

    You've just experienced a whirlwind tour of integrating Apache Kafka into your WildFly Swarm microservices!

    I hope it's opened your eyes to what can be achieved by integrating services with event-driven systems.

    Please provide feedback on the CDI extension for Kafka, with ideas and suggestions on how to improve it.


    To build your Java EE microservice, visit WildFly Swarm and download the cheat sheet.

    Last updated: October 5, 2023
