Introduction to Kafka in Podman

Kafka is a powerful event streaming platform. In this learning path, you will learn about Kafka and how to run it in a container using Podman. This understanding is a good prerequisite for running Kafka in Red Hat OpenShift.


In this lesson, you will start a producer and manage messages at the command line. 

What is a producer?

Before you start using Podman and running Kafka, you need the following basic information. According to the official website, “Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications.”

Kafka lets one or more producers send messages, which are stored until they are retrieved by one or more consumers. When producing a message, you assign it to a topic; a consumer later uses that topic to retrieve only the messages relevant to its application. Each consumer also uses a unique, developer-assigned identifier, which lets Kafka track which messages that consumer has already read. As a result, a consumer can disconnect at any time and later reconnect to pick up right where it left off.
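
To make this concrete, here is a minimal Node.js sketch of a producer and a consumer, assuming the kafkajs client library and a hypothetical broker address (kafkaserver:9092) on the kafkanet network; the images used in this learning path may use a different library and configuration:

// Minimal kafkajs sketch: a producer publishes to a topic, a consumer reads it back.
// The broker address and group ID below are illustrative only.
const { Kafka } = require("kafkajs");

const kafka = new Kafka({
  clientId: "demo-app",
  brokers: ["kafkaserver:9092"], // hypothetical broker on the kafkanet network
});

async function produce() {
  const producer = kafka.producer();
  await producer.connect();
  // Every message is published to a topic; consumers use the topic to select messages.
  await producer.send({
    topic: "my-topic",
    messages: [{ value: "hello from a producer" }],
  });
  await producer.disconnect();
}

async function consume() {
  // groupId is the developer-assigned identifier Kafka uses to remember
  // which messages this consumer has already read.
  const consumer = kafka.consumer({ groupId: "my-consumer-group" });
  await consumer.connect();
  await consumer.subscribe({ topics: ["my-topic"], fromBeginning: true });
  await consumer.run({
    eachMessage: async ({ message }) => {
      console.log(`received: ${message.value.toString()}`);
    },
  });
}

produce().then(consume).catch(console.error);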

Using the Kafka instance started in the previous lesson, we will start a producer: a containerized Node.js application running on a Linux image. This producer lets you enter messages at the command line. In a later lesson, you will start a consumer to retrieve and display those messages.

Note

It is typical, but not required, for messages to be in the form of a JSON document.
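
For instance, building on the sketch above, a producer might serialize an application object as JSON before sending it (the order fields here are purely illustrative):

// Kafka itself only sees the serialized bytes; the JSON structure is up to your application.
async function produceJson(producer) {
  const order = { orderId: 1234, item: "coffee", quantity: 2 };
  await producer.send({
    topic: "my-topic",
    messages: [{ value: JSON.stringify(order) }],
  });
}
// A consumer recovers the object with JSON.parse(message.value.toString()).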

In our example, the topic (“my-topic”) is hard-coded into the producer. That is fine for learning, but it is not a good practice for production; a more configurable approach is sketched after the producer walkthrough below. At the command line, run the following command:

podman run --rm -it --net kafkanet --name kafkaproducer quay.io/rhdevelopers/node-kafka-producer:latest

You will be greeted with a prompt where you can enter messages:

Kafka producer connected.
Enter message (type "exit" to quit):

You can now enter messages as you wish. Enter a few messages and do not exit the program. For maximum visual impact, keep this window open and position an empty command line window next to it. When you create a consumer in the next lesson, you will see messages sent and delivered in real time.
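
As noted earlier, the topic is hard-coded in this producer image. If you were writing your own producer, one common approach (a sketch using assumed names, not a feature of the lesson's image) is to pass the topic as an environment variable using Podman's -e flag:

podman run --rm -it --net kafkanet --name myproducer -e KAFKA_TOPIC=my-topic my-own-producer:latest

Inside the producer, the application would then read the variable and fall back to a default, for example:

// KAFKA_TOPIC and my-own-producer are hypothetical names, not part of this lesson's images.
async function produceToConfiguredTopic(producer) {
  const topic = process.env.KAFKA_TOPIC || "my-topic";
  await producer.send({ topic, messages: [{ value: "configurable topic" }] });
}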

Summary

The Kafka instance and producer are now running. In the next lesson, you will start a consumer that retrieves messages from the Kafka server. You can view the source code for this application in the Kafka In Podman Git repository.
