Red Hat Serverless Operator usage and troubleshooting in OpenShift 4

December 13, 2024
Francisco De Melo Junior
Related topics: Operators, Serverless
Related products: Red Hat OpenShift Container Platform, Red Hat OpenShift Serverless

    Red Hat Serverless Operator provides Knative features directly in Red Hat OpenShift.  

    Let's start by asking: What is serverless? Serverless refers to running back-end programs and processes in the cloud. Serverless works on an as-used basis, meaning that companies pay only for what they use. Knative is a platform-agnostic solution for running serverless deployments.

    Red Hat OpenShift Serverless is built on top of the Knative project, bringing serverless and FaaS capabilities to Kubernetes and Red Hat OpenShift environments.

    Serverless computing offerings typically fall into two groups:

    • Backend-as-a-Service (BaaS): BaaS gives developers access to a variety of third-party services and apps. For instance, a cloud-provider may offer authentication services, extra encryption, cloud-accessible databases, and high-fidelity usage data. 
    • Function-as-a-Service (FaaS): Developers still write custom server-side logic, but it’s run in containers fully managed by a cloud services provider.

    Figure 1 depicts the serverless components.

    Figure 1: Serverless components (diagram based on upstream documentation).

    Installation

    Select Red Hat OpenShift Serverless from OperatorHub, as shown in Figure 2. The Operator will be installed in a new namespace called openshift-serverless.

    Figure 2: Select Red Hat OpenShift Serverless.

    This Operator will provide three APIs: Knative Serving, Knative Eventing, and Knative Kafka as shown in Figure 3.

    Figure 3: Provided APIs in Red Hat OpenShift Serverless.

    After installation, the following happens:

    • A new namespace called openshift-serverless is created, containing three pods: knative-openshift, knative-openshift-ingress, and knative-operator-webhook.
    • A new namespace called knative-serving is created, initially without pods.
    • A new namespace called knative-eventing is created, initially without pods.

    As soon as a Serving or Eventing instance is created, the Serverless menu appears on the left side of the main menu (Figure 4).

    Figure 4: OpenShift Serving.

    Knative Serving

    This type of resource allows the user to define and control how the serverless workload behaves on the cluster.

    Knative Serving is a platform for streamlined application deployment, traffic-based auto-scaling from zero to N, and traffic-split rollouts, as the documentation explains.
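    The scale-to-zero and traffic-split behaviors can be sketched in a single Knative Service manifest. The names, namespace, image, and revision below are hypothetical; the annotations and traffic block follow the upstream Knative Serving schema:

```yaml
# Hypothetical Service: scales between 0 and 5 replicas and splits
# traffic 80/20 between a pinned revision and the latest revision.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: greeter            # hypothetical name
  namespace: demo          # hypothetical namespace
spec:
  template:
    metadata:
      annotations:
        autoscaling.knative.dev/min-scale: "0"  # allow scale to zero
        autoscaling.knative.dev/max-scale: "5"  # cap replicas at N=5
    spec:
      containers:
        - image: quay.io/example/greeter:v2     # hypothetical image
  traffic:
    - revisionName: greeter-00001               # pinned previous revision
      percent: 80
    - latestRevision: true                      # canary: newest ready revision
      percent: 20
```

    With this split, Knative reconciles the route so that roughly 80% of requests reach greeter-00001 and 20% reach the newest ready revision.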

    The Serving can only be created in the knative-serving namespace; otherwise, it will fail as below:

    - lastTransitionTime: '2024-08-01T18:26:56Z'
    message: 'Install failed with message: Knative Serving must be installed into the namespace "knative-serving"'

    Knative Eventing

    Knative Eventing is a collection of APIs that enable you to use an event-driven architecture with your applications. You can use these APIs to create components that route events from event producers (known as sources) to event consumers (known as sinks) that receive events. Sinks can also be configured to respond to HTTP requests by sending a response event.

    In short, it is an event-driven application platform that leverages CloudEvents with a simple HTTP interface.
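    As a sketch of the source-to-sink flow, the following hypothetical Broker and Trigger route CloudEvents of a single type to a consumer Service (all names and the event type are illustrative):

```yaml
# Hypothetical Broker plus a Trigger that filters on the CloudEvents
# "type" attribute and delivers matching events to a sink Service.
apiVersion: eventing.knative.dev/v1
kind: Broker
metadata:
  name: default
  namespace: demo                         # hypothetical namespace
---
apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: on-order-created                  # hypothetical name
  namespace: demo
spec:
  broker: default
  filter:
    attributes:
      type: com.example.order.created     # hypothetical event type
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: order-processor               # hypothetical sink Service
```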

    Knative Kafka

    Knative Kafka is an extension to Knative Eventing, merging HTTP accessibility with Apache Kafka's proven efficiency and reliability.
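    A KnativeKafka custom resource enables the Kafka-backed broker, channel, source, and sink. The manifest below is a sketch following the Red Hat documentation; the bootstrapServers address is a placeholder for your own Kafka cluster:

```yaml
# Sketch of a KnativeKafka CR; must live in the knative-eventing namespace.
apiVersion: operator.serverless.openshift.io/v1alpha1
kind: KnativeKafka
metadata:
  name: knative-kafka
  namespace: knative-eventing
spec:
  broker:
    enabled: true
    defaultConfig:
      # placeholder: point at your Kafka cluster's bootstrap address
      bootstrapServers: my-cluster-kafka-bootstrap.kafka:9092
  channel:
    enabled: true
    bootstrapServers: my-cluster-kafka-bootstrap.kafka:9092
  source:
    enabled: true
  sink:
    enabled: true
```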

    Other CustomResources/Definitions

    After the Serverless Operator is installed and the Knative Eventing and Knative Serving instances are created, additional APIs (CustomResourceDefinitions) are displayed. Below, one can see that the three main APIs were installed first, followed by the others, such as pingsources.sources.knative.dev:

    $ oc get crd  | grep knative
    apiserversources.sources.knative.dev                              2024-12-10T02:01:39Z
    brokers.eventing.knative.dev                                      2024-12-10T02:01:39Z
    certificates.networking.internal.knative.dev                      2024-12-10T02:01:33Z
    channels.messaging.knative.dev                                    2024-12-10T02:01:39Z
    clusterdomainclaims.networking.internal.knative.dev               2024-12-10T02:01:33Z
    configurations.serving.knative.dev                                2024-12-10T02:01:33Z
    containersources.sources.knative.dev                              2024-12-10T02:01:39Z
    domainmappings.serving.knative.dev                                2024-12-10T02:01:34Z
    eventtypes.eventing.knative.dev                                   2024-12-10T02:01:39Z
    images.caching.internal.knative.dev                               2024-12-10T02:01:33Z
    ingresses.networking.internal.knative.dev                         2024-12-10T02:01:34Z
    inmemorychannels.messaging.knative.dev                            2024-12-10T02:01:45Z
    knativeeventings.operator.knative.dev                             2024-12-10T02:00:22Z
    knativekafkas.operator.serverless.openshift.io                    2024-12-10T02:00:22Z
    knativeservings.operator.knative.dev                              2024-12-10T02:00:23Z
    metrics.autoscaling.internal.knative.dev                          2024-12-10T02:01:34Z
    parallels.flows.knative.dev                                       2024-12-10T02:01:39Z
    pingsources.sources.knative.dev                                   2024-12-10T02:01:39Z
    podautoscalers.autoscaling.internal.knative.dev                   2024-12-10T02:01:34Z
    revisions.serving.knative.dev                                     2024-12-10T02:01:34Z
    routes.serving.knative.dev                                        2024-12-10T02:01:34Z
    sequences.flows.knative.dev                                       2024-12-10T02:01:40Z
    serverlessservices.networking.internal.knative.dev                2024-12-10T02:01:34Z
    services.serving.knative.dev                                      2024-12-10T02:01:34Z
    sinkbindings.sources.knative.dev                                  2024-12-10T02:01:40Z
    subscriptions.messaging.knative.dev                               2024-12-10T02:01:40Z
    triggers.eventing.knative.dev                                     2024-12-10T02:01:40Z

    The new CustomResourceDefinitions (CRDs) build upon the three main custom resources (KnativeEventing, KnativeServing, and KnativeKafka) to provide advanced features such as PingSource, an event source that produces events with a fixed payload on a specified cron schedule.
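    For example, a PingSource that emits a fixed JSON payload every two minutes to a hypothetical event-display Service could look like this:

```yaml
# Hypothetical PingSource: fixed payload on a cron schedule,
# delivered to a Knative Service acting as the sink.
apiVersion: sources.knative.dev/v1
kind: PingSource
metadata:
  name: heartbeat                 # hypothetical name
  namespace: demo                 # hypothetical namespace
spec:
  schedule: "*/2 * * * *"         # cron: every two minutes
  contentType: "application/json"
  data: '{"message": "ping"}'
  sink:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: event-display         # hypothetical sink Service
```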

    Demo

    The steps below largely follow the learning path Create an OpenShift Serverless function by Don Schenck:

    1. Install the OpenShift Serverless Operator.
    2. Verify that the Serverless option appears in the menu (see above).
    3. Deploy the Photobooth application.
    4. Create a Serving (a KnativeServing custom resource), which can be minimal for this demo:

      apiVersion: operator.knative.dev/v1beta1
      kind: KnativeServing
      metadata:
        name: knative-serving
        namespace: knative-serving
      spec:
        controller-custom-certs:
          name: ''
          type: ''
        registry: {}

      After the Serving is created, several pods appear in the knative-serving namespace:

      $ oc project knative-serving
      $ oc get pod
      NAME                              READY   STATUS    RESTARTS   AGE
      activator-7b7d4557cc-4rfk9        2/2     Running   0          16m
      activator-7b7d4557cc-jsf5s        2/2     Running   0          17m
      autoscaler-f794648c6-h4p2x        2/2     Running   0          17m
      autoscaler-f794648c6-rknhg        2/2     Running   0          17m
      autoscaler-hpa-659f6f48b7-gm9jx   2/2     Running   0          17m
      autoscaler-hpa-659f6f48b7-z4ddb   2/2     Running   0          17m
      controller-85c7c98dfc-2bgbg       2/2     Running   0          16m
      controller-85c7c98dfc-gp5kp       2/2     Running   0          17m
      webhook-7c4994d865-fqdlr          2/2     Running   0          16m
      webhook-7c4994d865-l2mvt          2/2     Running   0          17m
    5. Create a Knative Service ($ oc get ksvc). This must be created after the Serving:

      apiVersion: serving.knative.dev/v1
      kind: Service
      metadata:
        name: imageoverlay
        namespace: knative-serving
      spec:
        template:
          spec:
            containers:
              - image: quay.io/rhdevelopers/imageoverlay:latest
    6. Get the route using $ oc get ksvc:

      $ oc get ksvc
      NAME           URL                                                                                        LATESTCREATED        LATESTREADY          READY   REASON
      imageoverlay   https://imageoverlay-knative-serving.apps.rosa.example.com   imageoverlay-00001   imageoverlay-00001   True    
    7. Paste the ksvc URL into the application's Service URL Setting field: https://imageoverlay-knative-serving.apps.rosa.example.com.
    8. Take the picture (Figure 5).
    Figure 5: Serverless application demo.

    Customization

    To customize settings, such as container limits and requests, follow the solution Setting Serverless Operator container limits and requests in OCP 4. For example:

    $ oc get kservice -o yaml
    apiVersion: serving.knative.dev/v1
    kind: Service
    metadata:
      name: showcase
      namespace: knative-serving
      annotations:
        serving.knative.dev/creator: cluster-admin
        serving.knative.dev/lastModifier: cluster-admin
    spec:
      template:
        metadata:
          annotations:
            queue.sidecar.serving.knative.dev/resourcePercentage: "40"  # queue-proxy is sized at 40% of the user container's resources
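    Per-workload requests and limits can also be set on the user container itself in the Knative Service spec; the queue-proxy sidecar is then sized relative to those values. The name, image, and values below are hypothetical:

```yaml
# Hypothetical Service with explicit requests/limits on the user container.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: showcase                  # hypothetical name
spec:
  template:
    spec:
      containers:
        - image: quay.io/example/showcase:latest   # hypothetical image
          resources:
            requests:
              cpu: 100m           # guaranteed scheduling request
              memory: 128Mi
            limits:
              cpu: 500m           # hard cap for the user container
              memory: 256Mi
```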

    Troubleshooting

    For troubleshooting, we usually verify:

    • The custom resources: Serving and Eventing ($ oc get knativeserving -n knative-serving, $ oc get knativeeventing -n knative-eventing), as well as the Knative Services ($ oc get kservice).
    • The inspect bundle, which contains the pod YAMLs and pod logs.
    • In most cases, but not all, we also request the Serverless must-gather.

    Learn more

    This article covered the introduction, installation, customization, and troubleshooting of Red Hat OpenShift Serverless.

    Don Schenck's learning path Create an OpenShift Serverless function is an excellent resource for learning more. 

    Additional resources

    For any other specific inquiries, please open a case with Red Hat support. Our global team of experts can help you with any issues.

    Special thanks to Jonathan Edwards for the excellent collaboration on Knative last year, to Alexander Barbosa for contributing the diagrams and reviews for this article, and to Josh Brandenburg for allocating time for my research on this topic.
