
Harness the power of serverless containers in hybrid scenarios

September 26, 2024
Shubhayu Chongder
Related topics: Containers, Serverless
Related products:
Red Hat OpenShift Serverless

    In today's fast-paced business environment, efficient and agile application development is critical for success. One tool that can help achieve this is Red Hat OpenShift Serverless. In this article, we will explore what OpenShift Serverless is and how it can simplify application development for microservices and functions. 

    All major hyperscalers (AWS, Azure, Google Cloud Platform, IBM) have their own serverless offerings in the form of Function-as-a-Service (FaaS), and they provide broadly similar functionality. These offerings have enabled a variety of use cases, but they are far from ideal for enterprise computing needs. Some shortcomings are:

    • Limited in terms of CPU, memory, and execution time.
    • Limited to no orchestration.
    • Cold start latency.
    • Vendor lock-in.

    Then comes the serverless container, which aims to solve most of these shortcomings. Serverless containers, as the name suggests, bring the best of both paradigms:

    • Serverless: Abstracts the application from the underlying infrastructure, helping enterprises innovate faster.
    • Containers: Applications are packaged as OCI-compliant containers that can run anywhere, removing vendor lock-in.

    Serverless computing continues to gain momentum: the global serverless computing market crossed USD 7.6 billion in 2020 and is anticipated to grow at a CAGR of 22.7% to reach USD 21.1 billion by the end of 2026.

    In the serverless container world, OpenShift Serverless plays a significant role by bringing together the benefits of serverless computing with the containerization technology provided by Kubernetes. It is an add-on to the OpenShift platform that takes development efficiency to the next level by providing a streamlined experience for creating, deploying, and scaling microservices, functions, and event-driven applications.

    OpenShift Serverless architecture

    OpenShift Serverless is based on the upstream Knative project that provides primitives to create, build, and deploy microservices and functions. Figure 1 shows the architecture of OpenShift Serverless.  

    A diagram depicting the OpenShift Serverless architecture.
    Figure 1: Architecture of OpenShift Serverless.

    The main components of OpenShift Serverless architecture are:

    • Knative Serving: Enables developers to create cloud-native applications using a serverless architecture. It provides custom resource definitions (CRDs) that developers can use to deploy serverless containers, scale the number of pods, and so on (a minimal deployment example follows this list).
    • Knative Eventing: Provides the infrastructure for building and deploying event-driven applications. It enables developers to define event sources and sinks, and provides a mechanism for routing events to functions, applications, or other event sinks.
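    For example, once Knative Serving is installed, an existing container image can be deployed as an autoscaling service with a single command. This is a minimal sketch; the service name and image reference below are placeholders, not part of this article's demo:

    $ kn service create hello --image <registry>/<image>:<tag>

    Knative Serving then creates the underlying Route, Configuration, and Revision resources and scales the pods up and down based on incoming traffic.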

    Execution

    The first step is to install the Red Hat OpenShift Serverless Operator on the cluster. From OperatorHub, select the OpenShift Serverless Operator and install it with the default options. The status of the installed Operator can be seen under the Operators → Installed Operators menu in the OpenShift console. See Figures 2 and 3.

    A view of searching for OpenShift Serverless within OperatorHub.
    Figure 2: In OperatorHub, search for "OpenShift Serverless".
    A view of the installed operators within OperatorHub.
    Figure 3: Installed Operators on OpenShift console.

    Next, we have to install Knative Serving and Knative Eventing by creating their custom resources (CRs).

    Procedure

    Below are the steps to install Knative Serving by using the default settings in the KnativeServing custom resource (CR):

    1. In the Administrator perspective of the OpenShift Container Platform web console, navigate to Operators → Installed Operators.
    2. Set the Project drop-down at the top of the page to Project: knative-serving.
    3. Click the Create Knative Serving option and, on the Create Knative Serving page, install Knative Serving with the default settings by clicking Create. An equivalent CR applied from the CLI is sketched after these steps.
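    As an alternative to the console, the same default KnativeServing CR can be applied from the CLI. This is a minimal sketch; verify the API version against the documentation for your installed Operator release:

    $ oc apply -f - <<EOF
    apiVersion: operator.knative.dev/v1beta1
    kind: KnativeServing
    metadata:
      name: knative-serving
      namespace: knative-serving
    EOF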

    We can install the Knative Eventing CR in the same way, using the default settings. The only changes are the project, which is set to Project: knative-eventing, and the provided API, where we select the Create Knative Eventing option to install Knative Eventing.

    The next step is to install the Knative CLI (kn) on the machine we are using to access the Red Hat OpenShift Container Platform cluster. The Knative CLI also supports interactions with OpenShift Container Platform. We can follow the official documentation to download and install the Knative CLI. To verify the installation, we can run:

    $ kn 

    Now we can create our first serverless container in Node.js using the Knative CLI, as shown below:

    $ kn func create -c 

    The Knative CLI will then ask for a few details about the function we are going to create:

    Function name: serverless-demo-func
    Runtime: node
    Trigger: http 

    Output:

    root@ip-XX-X-X-XX:/home/ubuntu/shubhayu/serverless-demo# kn func create -c
    ? Function Path: serverless-demo-func
    ? Language Runtime: node
    ? Template: http
    Command:
    kn -l node
    Created node function in /home/ubuntu/shubhayu/serverless-demo/serverless-demo-func

    After providing the above information, the CLI auto-generates the code for a serverless container project named serverless-demo-func in the given path.
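    The generated project contains an index.js that exports the HTTP handler invoked for each request. The exact template contents depend on the kn func version; a representative sketch looks roughly like this:

    // index.js (sketch of the generated Node.js HTTP function template)
    // The platform calls handle() once per incoming HTTP request.
    const handle = async (context, body) => {
      // context exposes the request details (method, headers, query) and a logger
      context.log.info(`Received ${context.method} request`);
      // The return value becomes the HTTP response; here we echo the request body
      return { body };
    };

    module.exports = { handle };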

    Once done, we'll first create a namespace in OpenShift Container Platform and then use the following commands to build and deploy the serverless application to the newly created namespace:

    $ oc new-project serverless-demo 
    $ kn func build
    $ kn func deploy -c 

    Output:

    root@ip-XX-X-X-XX:/home/ubuntu/shubhayu/serverless-demo# oc new-project serverless-demo
    Now using project "serverless-demo" on server "https://api.ocp-ai-cloud.ocp-ai-cloud.nextzlabs.com:6443".
    
    You can add applications to this project with the 'new-app' command. For example, try:
    
        oc new-app rails-postgresql-example
    
    to build a new example application in Ruby. Or use kubectl to deploy a simple Kubernetes application:
    
        kubectl create deployment hello-node --image=registry.k8s.io/e2e-test-images/agnhost:2.43 -- /agnhost serve-hostname
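    Note that kn func build produces a container image and needs a registry to tag and push it to; if none is configured yet, the build prompts for one. The registry below is a placeholder, so substitute one you can push to:

    $ kn func build --registry quay.io/<your-namespace>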

    After deployment, we can log in to the OpenShift console and switch to the serverless-demo project that we created earlier. Then, from the Developer → Topology view, we should see the serverless application, as shown in Figure 4.

    A view of the serverless applications in the OpenShift console.
    Figure 4: Serverless application with 0 pods running.

    If we hit the route URL, it triggers the autoscaler to instantiate the pod and serve the request. The pod will terminate after 30 seconds of inactivity. See Figure 5.

    A view of the serverless application in the OpenShift console showing one pod running after receiving a request.
    Figure 5: Serverless application with 1 pod running after receiving a request.
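    We can also exercise the route from the command line. The function is deployed as a Knative service named after the function, so the following sketch assumes a service called serverless-demo-func in the serverless-demo namespace; the actual hostname depends on the cluster's domain:

    $ kn service describe serverless-demo-func -n serverless-demo -o url
    $ curl "$(kn service describe serverless-demo-func -n serverless-demo -o url)"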

    This is how we can run a serverless application on the OpenShift Serverless platform. Beyond autoscaling for HTTP requests, we can trigger serverless containers with a variety of events, such as Kafka messages, file uploads to storage, timers for recurring jobs, and 100+ event sources like Salesforce, ServiceNow, email, and more; a sketch of wiring such a trigger follows.
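    As a sketch of the eventing side, a Knative broker and trigger can route events to the function deployed above. The event type filter here is illustrative only, not part of this article's demo:

    $ kn broker create default -n serverless-demo
    $ kn trigger create demo-trigger -n serverless-demo \
        --broker default \
        --filter type=com.example.demo \
        --sink ksvc:serverless-demo-func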

    Conclusion

    To conclude, below are the key features of the OpenShift Serverless platform:

    • Any programming language or runtime can be chosen, including Java, Python, Go, Node.js, etc. 
    • Immutable revisions: New features can be rolled out via canary, A/B, or blue-green testing with gradual traffic shifting, following best practices.
    • Automatic scaling: No need to configure the number of replicas. Applications scale to zero when not in use and autoscale to thousands of instances during peaks, with built-in reliability and fault tolerance.
    • Built for hybrid cloud: Portable serverless applications can run anywhere OpenShift runs, whether on-premises or on any public cloud.
    • Event-driven architectures: Build loosely coupled, distributed applications that connect to a variety of built-in event sources.
