Introduction to Kafka in Podman

Kafka is a powerful event streaming platform. In this learning path, you will learn about Kafka and how to run it in a container using Podman. This understanding is a good prerequisite to running Kafka in Red Hat OpenShift.

In this lesson, you'll manage messages using a consumer.

Launch a consumer

In this lesson, you will launch a consumer to fetch messages from the Kafka instance and the producer you started in lessons one and two, respectively. The consumer is a Linux container image running an application written in Node.js; Kafka clients are also available in several other supported programming languages.

Note

The topic (my-topic), consumer ID (my-consumer), and consumer group (my-consumer-group) are hard-coded into the application. This is not a good practice for a production environment, where such values should be supplied through configuration.
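One common way to avoid hard-coding such values is to pass them in as environment variables at run time. The sketch below is hypothetical: the variable names (KAFKA_TOPIC, and so on) are assumptions, and it would only work if the application inside the image were written to read them.

```shell
# Hypothetical sketch: supply topic and consumer settings as environment
# variables instead of hard-coding them. The application inside the image
# would need to read these variables (e.g., via process.env in Node.js).
podman run --rm -it --net kafkanet --name kafkaconsumer \
  -e KAFKA_TOPIC=my-topic \
  -e KAFKA_CONSUMER_ID=my-consumer \
  -e KAFKA_CONSUMER_GROUP=my-consumer-group \
  quay.io/rhdevelopers/node-kafka-consumer:latest
```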

With the containers from Lessons 1 and 2 up and running, it is time to consume messages:

podman run --rm -it --net kafkanet --name kafkaconsumer quay.io/rhdevelopers/node-kafka-consumer:latest

Any previously entered messages will appear. As you create new messages at the producer command line, they will appear here as well.
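If no messages appear, it can help to confirm that all three containers are running and attached to the same network. These podman subcommands are standard; the container and network names assume the ones used in this learning path.

```shell
# List running containers whose names contain "kafka"
podman ps --filter name=kafka

# Inspect the kafkanet network to see which containers are attached to it
podman network inspect kafkanet
```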

Clean up

You can end this application using the following command:

podman stop kafkaconsumer

You can end the producer and the Kafka instance using the following commands:

podman stop kafkaproducer
podman stop kafkaserver
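If you also want to remove the network created in the first lesson (assuming it was named kafkanet), you can delete it once no containers are attached to it:

```shell
# Remove the user-defined network; this fails if containers still use it
podman network rm kafkanet
```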

Summary

These three lessons have introduced you to the basic functionality of Kafka, with one example of how you can run Kafka in a container for development purposes. For a more robust production environment, use Streams for Apache Kafka on OpenShift. You can view the source code for this application in the Kafka In Podman GitHub repository. 
