
Connecting to your Managed Kafka instance from the Developer Sandbox for Red Hat OpenShift

April 23, 2021
Don Schenck
Related topics:
Kafka, Kubernetes
Related products:
Developer Sandbox

    Running Proof of Concept (PoC) code on your local machine is great and definitely worthwhile, but the real test comes when you must face the fallacies of distributed computing and run things in the cloud. This is where you make the leap from PoC to more of an emulation of real life.

    In this article, I will guide you through the process of using Red Hat OpenShift Streams for Apache Kafka from code running in a different cluster on the Developer Sandbox for Red Hat OpenShift. All of the code will be running in the cloud, and you will understand the attractions of distributed computing.

    Here's a broad overview of what this process entails:

    1. Create a Kafka instance in Managed Kafka (samurai-pizza-kafkas).
    2. Create a topic (prices).
    3. Confirm all this from the command line using the rhoas command-line interface (CLI).
    4. Create an application in Developer Sandbox using the image at quay.io/rhosak/quarkus-kafka-sb-quickstart:latest.
    5. Bind the service to your application from the command line.
    6. See the results.

    Prerequisites

    Here's what you'll need to follow along with this tutorial:

    • An OpenShift Streams for Apache Kafka account
    • A Developer Sandbox for Red Hat OpenShift account
    • The rhoas CLI tool
    • The OpenShift CLI oc

    What you won't need is a specific operating system. The beauty of working in the cloud is that the burden of operating systems, libraries, connections, etc., is "out there," and not at your machine. You're simply controlling it. You can literally control millions of dollars worth of computing power from an underpowered PC running a terminal session. Processing is now done in the cloud; your local PC is simply for issuing commands. That's the power of leverage. That's pretty cool.

    The event of the season

    Events are everywhere. An event has a time when it happened and information about itself. These two things give an event meaning. To be quite philosophical about all this: All of existence is a series of events.

    For this article, we have an imaginary scenario. Samurai Pizza has been struggling lately, and a large group of hedge fund managers decide to "short" the stock. Learning this—they read it on the internet—a grassroots organization resolves to buy up large swaths of Samurai Pizza stock in order to, in the vernacular, "teach those fund managers a lesson." This results in a very volatile stock price.

    (I'm totally making this up and it has no correlation to anything in real life. This is just a game; stop with the comparisons.)

    Our application produces the prices and sends them over to Kafka, which then forwards the event to our consumer. Finally, a web page is updated with the new price. Hilarity ensues as the hedge fund managers and grassroots individuals battle it out.
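    Before anything touches the cloud, the flow itself is easy to picture. Here is a purely local simulation of the produce-to-consume pipeline, with a temporary file standing in for the prices topic; no Kafka broker is involved, and the prices are made-up sample values:

```shell
# Purely local simulation of the produce -> topic -> consume flow; a temporary file
# stands in for the "prices" topic, so no Kafka broker is involved.
TOPIC_FILE=$(mktemp)
for price in 7.45 12.10 3.99; do
  echo "price=${price}" >> "$TOPIC_FILE"   # producer: append an event to the topic
done
EVENTS=$(cat "$TOPIC_FILE")                # consumer: read the events back in order
echo "$EVENTS"
rm -f "$TOPIC_FILE"
```

    Kafka adds durability, ordering guarantees, and multiple independent consumers on top of this basic shape, but the producer-topic-consumer relationship is the same.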

    Figure 1 shows an architectural overview of our subject.

    Managed Kafka architecture diagram
    Figure 1: Managed Kafka architecture diagram.

    Free Managed Kafka trial

    The first step is to get your free trial of Red Hat OpenShift Streams for Apache Kafka. It's a simple process, there's no charge or credit card number required, and it's a nice way to start experimenting without any installations needed. This is, truly, the cloud at its best. You can find it here.

    Creating the Kafka instance and topic

    Navigate your way to the Kafka Instances page shown in Figure 2. Click the big, blue Create Kafka instance button to start.

    Kafka instances page with no instances listed
    Figure 2: The Kafka instances page with no instances listed.

     

    You'll be prompted for information. You will need to supply the name of the Kafka instance: samurai-pizza-kafkas. You must also select a Cloud provider, a Cloud region, and the Availability zones. Click the Create instance button (see Figure 3) and you'll soon have an instance.

    Prompt to create kafka instance
    Figure 3: Creating a Kafka instance.

     

    Wait for the instance to reach Ready status (see Figure 4). Be patient; mine took about five minutes. You may need to refresh the screen to see the status change.

    panel showing kafka instance status as ready
    Figure 4: The panel that shows the Kafka instance is ready.

     

    At this point, we have an instance of Managed Kafka ready for our use. We can now create our topic, prices.

    Creating the topic

    The steps to creating our topic follow the familiar create-something-in-the-dashboard model:

    1. Open the parent: Click on the instance name, samurai-pizza-kafkas.
    2. Select to create the child: Click the Create Topic button.
    3. Create the child: Enter the topic name and accept the default values.

    But we're not going to do that. Instead, we'll use the command line. Open a terminal session and use the following command: rhoas login, as illustrated here:

    PS C:\Users\dschenck> rhoas login
    ⣷ Logging in...
    
     You are now logged in as "rhn-engineering-dschenck"

    Your browser will open to inform you that you are logged into your Kafka instance. And, of course, your username will be different.

    Now back to the command line, where we'll spin just a little more rhoas magic to create our topic. We need three commands, illustrated here:

    • rhoas kafka list: Gets the ID of our Kafka instance, which we'll use in the following command.
    • rhoas kafka use --id <<KAFKA_INSTANCE_ID>>: Selects our instance as the current instance, i.e., the one we want to use.
    • rhoas kafka topic create --name prices: The magic happens here. This command creates the topic using the default values.
    PS C:\Users\dschenck> rhoas kafka list
      ID (1)                 NAME                   OWNER                      STATUS   CLOUD PROVIDER   REGION
     ---------------------- ---------------------- -------------------------- -------- ---------------- -----------
      c6l32qnnd4k4as5jmvdg   samurai-pizza-kafkas   rhn-engineering-dschenck   ready    aws              us-east-1
    
    PS C:\Users\dschenck> rhoas kafka use --id c6l32qnnd4k4as5jmvdg
     Kafka instance "samurai-pizza-kafkas" has been set as the current instance.
    
    PS C:\Users\dschenck> rhoas kafka topic create --name prices
    Topic "prices" created in Kafka instance "samurai-pizza-kafkas":
    
    ...JSON removed to save space...
    

    That last command will spit out a JSON document, which simply defines the topic. You can see a more human-readable output by running the command rhoas kafka topic list (this is entirely optional).

    Meanwhile, in your Developer Sandbox...

    Let's get our application up and running. First, log into your sandbox cluster from the command line using the oc login command. For detailed instructions, refer to my article Access your Developer Sandbox for Red Hat OpenShift from the command line.

    We'll be running an image that's already been compiled from source code. The easiest way to do this is from your Sandbox cluster dashboard. Making sure you're in the Developer environment, select the +Add option shown in Figure 5.

    developer context add menu in openshift dashboard
    Figure 5: The developer context +Add menu in the OpenShift dashboard.

     

    This will present a list of options; we want to use the Container Image option, so simply click that panel (see Figure 6).

    OpenShift dashboard options for adding an application, with the Container Image panel highlighted.
    Figure 6: OpenShift dashboard options for adding an application.

     

    The next page lets us set the parameters for our application. In this case, the only value we need to supply is the image name. Enter quay.io/rhosak/quarkus-kafka-sb-quickstart:latest in the name field (see Figure 7) and click the Create button at the bottom of the page.

    OpenShift deploy image panel
    Figure 7: The OpenShift deploy image panel.

     

    OpenShift will do the rest: Import the image, start it in a container, create a Service, create a Route, and create a Deployment for the application.

    Just for fun, you can go into the pod and view the logs. You'll see that the application is throwing errors because it cannot connect to the expected Kafka instance. Time to fix that.

    Connecting the application to Kafka

    We have an application and an instance of Kafka with the topic (prices) our application expects. Now we need information in order to start using our Managed Kafka instance. Specifically, we need the API token from our Managed Kafka instance. We get that by going to https://cloud.redhat.com/openshift/token and copying the token to our local clipboard. You'll see a screen like the one shown in Figure 8.

    The OpenShift API token.
    Figure 8: The OpenShift API token.

     

    With the token available, we can run the following command at the command line to connect our Kafka instance to our application:

    rhoas cluster connect --token {your token pasted here}

    This, in turn, will return the YAML needed to create the service binding object that binds the Kafka instance to our application. The output will be similar to what's shown here:

    PS C:\Users\dschenck> rhoas cluster connect --token <<redacted>>
    ? Select type of service kafka
    This command will link your cluster with Cloud Services by creating custom resources and secrets.
    In case of problems please execute "rhoas cluster status" to check if your cluster is properly configured
    
    Connection Details:
    
    Service Type:                   kafka
    Service Name:                   samurai-pizza-kafkas
    Kubernetes Namespace:           rhn-engineering-dschenck-dev
    Service Account Secret:         rh-cloud-services-service-account
    ? Do you want to continue? Yes
     Token Secret "rh-cloud-services-accesstoken" created successfully
     Service Account Secret "rh-cloud-services-service-account" created successfully
    
    Client ID:     srvc-acct-b2f43cd6-da3f-41bd-9190-e1aade856103
    
    Make a copy of the client ID to store in a safe place. Credentials won't appear again after closing the terminal.
    
    You will need to assign permissions to service account in order to use it.
    For example for Kafka service you should execute the following command to grant access to the service account:
    
      $ rhoas kafka acl grant-access --producer --consumer --service-account srvc-acct-b2f43cd6-da3f-41bd-9190-e1aade856103 --topic all --group all
    
     kafka resource "samurai-pizza-kafkas" has been created
    Waiting for status from kafka resource.
    Created kafka can be already injected to your application.
    
    To bind you need to have Service Binding Operator installed:
    https://github.com/redhat-developer/service-binding-operator
    
    You can bind kafka to your application by executing "rhoas cluster bind"
    or directly in the OpenShift Console topology view.
    
     Connection to service successful.

    If you're using Developer Sandbox for Red Hat OpenShift, the Service Binding Operator is already installed.

    Important ACL rules

    You may notice in the middle of the output that you're instructed to update the Access Control List (ACL) for your Kafka instance. This allows your application to use the instance and topic. This command is necessary; here's an example (yours will differ slightly):

    rhoas kafka acl grant-access --producer --consumer --service-account srvc-acct-b2f43cd6-da3f-41bd-9190-e1aade856103 --topic all --group all

    Where are we?

    At this point we have a Kafka instance (samurai-pizza-kafkas) with a topic (prices), and we have an application running in OpenShift that wants to use that instance. We've connected our cluster to Kafka, but not the individual application. That's next.

    The bind that ties

    We need to run a command to bind our Kafka instance to our application.

    rhoas cluster bind

    Here's an example:

    PS C:\Users\dschenck> rhoas cluster bind
    Namespace not provided. Using rhn-engineering-dschenck-dev namespace
    Looking for Deployment resources. Use --deployment-config flag to look for deployment configs
    ? Please select application you want to connect with quarkus-kafka-sb-quickstart
    ? Select type of service kafka
    Binding "samurai-pizza-kafkas" with "quarkus-kafka-sb-quickstart" app
    ? Do you want to continue? Yes
    Using ServiceBinding Operator to perform binding
     Binding samurai-pizza-kafkas with quarkus-kafka-sb-quickstart app succeeded
    PS C:\Users\dschenck>

    Are we there yet?

    Yes, we have arrived. At this point, the application is both producing and consuming events, using the Kafka instance and topic we created.

    Viewing the results

    You can open the application's route from the dashboard or use the command oc get routes to see the URL to the application. Paste the URL into your browser and append /prices.html to it to view the results.

    Here is the example URL:

    quarkus-kafka-sb-quickstart-rhn-engineering-dschenck-dev.apps.sandbox-m2.ll9k.p1.openshiftapps.com/prices.html
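    If you prefer to stay at the command line, you can assemble the final URL yourself. The host below is this article's example route host; substitute the one `oc get routes` reports for your cluster:

```shell
# Build the prices page URL from the route host that `oc get routes` reports.
# This host is the article's example; yours will differ.
ROUTE_HOST=quarkus-kafka-sb-quickstart-rhn-engineering-dschenck-dev.apps.sandbox-m2.ll9k.p1.openshiftapps.com
PRICES_URL="https://${ROUTE_HOST}/prices.html"
echo "$PRICES_URL"
```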
    

    Now you can watch the prices fluctuate wildly as the hedge fund managers and grassroots folks battle it out. The prices will update every five seconds, all flowing through your Managed Kafka instance.

    What just happened?

    A ton of stuff happened and was created behind the scenes here; let's break it down. After all, you may want to undo all of this later, and for that you'll need to know exactly what was created.

    When you created a Kafka instance in Managed Kafka, it, well, created an instance of Kafka there. Pretty straightforward. That instance was also assigned a unique ID within the Red Hat Managed Services system, the key to which is the token, which we used later when we connected to it.

    When you ran rhoas kafka use and selected your Kafka instance, you set the context of the rhoas CLI (on your local PC) to that instance, so any subsequent commands would go against that instance. That's important, because later you'll be connecting to your own (Sandbox) cluster; this is where you designate what gets connected to your cluster.

    The rhoas kafka topic create --name prices command simply created a topic within Kafka. If you're not clear on this, there's some fantastic material here to get you up to speed. Instant Kafka expertise.

    Running oc login connects your local machine to your—in this case, Sandbox—cluster. Note at this point in the tutorial, your local machine is "connected" to both your cluster and the Kafka instance. This allows the magic to happen.

    When you created the application in your cluster, it did a lot. It pulled the image from the registry and put it into your cluster—it's in the image streams section (Hint: Run the command oc get imagestreams). It created a pod to run the application. It created a Deployment. It created a Service. It created a Route. Why is this important? Because if you want to completely remove the application and everything associated with it—well, there are several objects.
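    As a sketch of what a full teardown might involve, the commands below are echoed rather than executed so you can review them before running anything. The object kinds come from the list above, and the app name assumes this article's defaults; the KafkaConnection and ServiceBinding objects (covered below) are named for the Kafka instance, not the app, so check them with oc get before deleting:

```shell
# Hypothetical cleanup sketch: print (rather than run) an `oc delete` for each object
# kind mentioned above. The app name assumes this article's defaults.
APP=quarkus-kafka-sb-quickstart
CLEANUP=$(for kind in route service deployment imagestream; do
  echo "oc delete ${kind} ${APP}"
done)
echo "$CLEANUP"
```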

    Note that the application has hard-coded values for the topic (prices). The instance name doesn't matter; the bootstrap server host and port are injected when we do the service binding. Which, incidentally, triggers OpenShift to replace the running pod with a new pod—one with all the correct Kafka information.

    Now the real magic happens when you run the rhoas cluster connect command with your token. This tells the Red Hat OpenShift Application Services (RHOAS) Operator, which is running in the Sandbox cluster (and which you need to have in any cluster you will use with Red Hat Managed Services offerings) to connect to the managed service—which it knows because of your rhoas login command—using the token supplied. That token, if you recall, identifies your Kafka instance. This connects the two.

    "Wait: Does that mean I can log into another cluster and also use the same Kafka instance there?"

    Yes. One Kafka instance, multiple clusters. Distributed processing, cloud-native computing, microservices... all the buzzwords suddenly have real meaning.

    The rhoas cluster connect command also creates some objects in your cluster: in this case, a Custom Resource of type KafkaConnection. You can see it at your command line by running oc get kafkaconnection. The ability of Kubernetes (and, by extension, OpenShift) to handle Custom Resources is a huge benefit; this is just one small example.

    Finally, a ServiceBinding object is created that defines the binding between the Kafka instance and your application—remember when you had to alter that YAML to include the application name (quarkus-kafka-sb-quickstart)?

    "Why are the stock prices displayed in Euros?"

    Okay, I took a little poetic license here in order to make this demo a little more fun. Besides, hedge fund managers battling a grassroots organization to control a stock price? That would never happen.

    Last updated: May 30, 2024
