Serverless: Code, not infrastructure

Serverless is a cloud computing model that lets you build and run applications without having to manage the underlying infrastructure.


What is serverless computing?

In simple terms, serverless takes care of scaling, resource allocation, and other operational tasks, so you can focus on your code. The open source Knative project bridges the gap between the power of Kubernetes and the simplicity of serverless computing and event-driven applications. On Knative, you can deploy any modern application workload, such as monolithic applications, microservices, or even tiny functions.

Power up AI and other applications with serverless

Why use serverless?

Serverless offers greater scalability, improved flexibility, and quicker production paths. Knative is a new way to look at services and functions. Previously, there were debates about when to use a service and when to use a function, but with Knative, they are the same—any service can scale to zero, a core tenet of "serverless" computing. You can build code without worrying about whether it's being treated as a service or a function; Knative manages that automatically. Serverless also eliminates the need for developers to provision or manage back-end servers.

Knative: What developers need to know

Accelerate AI workloads with serverless

Serverless can enhance AI/ML workloads by letting models run alongside applications, using demand-based scaling or event-driven orchestration to compose larger, domain-specific business applications.

Red Hat OpenShift Serverless, based on the Knative project, empowers AI applications with a flexible, scalable, and event-driven Kubernetes platform. It enables real-time data processing, seamless integration, and dynamic scaling to meet the evolving needs of modern AI workloads.

Learn more: AI/ML workloads

What is the difference between Knative and Kubernetes deployments?

Traditional Kubernetes deployment

In a traditional Kubernetes deployment, a container image is hosted somewhere, and a YAML file describes the deployment. When that YAML is applied to the cluster, the deployment creates a ReplicaSet, which in turn creates the specified number of pods. A service is then created that matches the labels of those pods.

Additional steps might be required in various situations, such as creating routes if the service can’t use a cloud load balancer. All in all, a traditional Kubernetes deployment involves quite a few steps, long YAML files, and many moving parts.

[Diagram: traditional Kubernetes deployment flow]
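
As a rough illustration, the two manifests for such a deployment might look like the following minimal sketch (the name hello and the image reference are placeholders):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello                    # placeholder name
spec:
  replicas: 3                    # desired number of pods
  selector:
    matchLabels:
      app: hello
  template:
    metadata:
      labels:
        app: hello
    spec:
      containers:
      - name: hello
        image: quay.io/example/hello:latest   # placeholder image
        ports:
        - containerPort: 8080
---
apiVersion: v1
kind: Service
metadata:
  name: hello
spec:
  selector:
    app: hello                   # matches the pod labels above
  ports:
  - port: 80
    targetPort: 8080

Both objects are applied with kubectl apply -f, and a route or ingress may still be needed to expose the service outside the cluster.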

Knative deployment

With serverless, you don’t have to worry about infrastructure at all. You provide a container image, run it on a cluster, and that’s it—everything else is taken care of for you.

With Knative, deploying is even simpler: you write a small resource file, called a Knative Service, that tells the system “I want to run this image.” Once it's applied to your cluster, Knative automatically creates all the necessary resources: the deployment, service, route, and a configuration resource. The configuration also manages revisions, making it easy to roll back to a previous version when needed.

[Diagram: deploying a Knative Service with kubectl]
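
For comparison, a minimal Knative Service manifest might look like this sketch (again, the name and image are placeholders); applying it with kubectl apply -f is enough for Knative to create the deployment, service, route, and configuration described above:

apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello                                 # placeholder name
spec:
  template:
    spec:
      containers:
      - image: quay.io/example/hello:latest   # placeholder image; Knative creates everything else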

What capabilities do serverless and Knative offer?

Knative provides a number of middleware components that you can use to extend Kubernetes. Knative uses the basic building blocks of Kubernetes and adds new blocks to it, so everything stays within the Kubernetes paradigm.

Scale to zero

If there’s no traffic to your pods, nothing runs: no memory or CPU is consumed, so you’re saving money and resources.

Scale from zero

If you have a traffic spike for whatever reason, Knative will scale everything up automatically.
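
Scale-to-zero and scale-from-zero behavior can be tuned per revision with autoscaling annotations. A minimal sketch, assuming a placeholder service and image (exact annotation names can vary by Knative version; min-scale and max-scale are the forms documented in recent releases):

apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello
spec:
  template:
    metadata:
      annotations:
        autoscaling.knative.dev/min-scale: "0"    # allow the service to scale to zero when idle
        autoscaling.knative.dev/max-scale: "10"   # cap scale-out during traffic spikes
    spec:
      containers:
      - image: quay.io/example/hello:latest       # placeholder image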

Progressive rollouts

If you want to do blue/green or canary deployments, you can do that with Knative and its revisions and traffic-splitting capabilities.
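
A canary rollout, for example, can be expressed directly in the Knative Service’s traffic block. A sketch under assumed revision names hello-00001 and hello-00002 (both names and the image are placeholders):

apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello
spec:
  template:
    spec:
      containers:
      - image: quay.io/example/hello:v2   # placeholder image for the new revision
  traffic:
  - revisionName: hello-00001             # previous revision keeps most of the traffic
    percent: 90
  - revisionName: hello-00002             # canary revision receives 10%
    percent: 10

Shifting the percentages toward the new revision completes the rollout; shifting them back rolls it back.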

Integration with CI/CD

Knative, through its compatibility with tools like Tekton, enables seamless integration into existing CI/CD pipelines, automating deployment and scaling of serverless applications.

Event integration

Knative empowers seamless event-driven architectures, allowing services to react to and process events from diverse sources with flexible routing and scaling.

Code-centric execution

Knative enables the deployment of code snippets as serverless functions, abstracting away container management and focusing solely on the execution of the function's logic.

How does Knative enable serverless applications?

Serverless containers

Build and deploy containerized applications without the complexities of managing infrastructure. By abstracting away the underlying infrastructure, Knative Serving allows you to focus on writing code, while the platform automatically scales your applications based on demand.

Watch: Knative Serving

Event mesh

Set up and manage event-driven architectures to make your applications more responsive, scalable, and resilient. By decoupling components and enabling asynchronous communication, Knative Eventing helps you build complex systems that can adapt to changing conditions and handle unexpected events gracefully.

Watch: Knative Eventing
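
As an illustration of how a service subscribes to events, a Knative Eventing Trigger might look like the sketch below (the broker name default is Knative’s conventional default; the event type and subscriber service are hypothetical):

apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: order-created-trigger              # hypothetical name
spec:
  broker: default
  filter:
    attributes:
      type: com.example.order.created      # only deliver CloudEvents of this type
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: order-processor                # hypothetical Knative Service that handles the events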

Serverless functions

Serverless functions are a powerful tool for building event-driven workloads on Kubernetes. Knative Functions simplifies the process of creating, building, and deploying these stateless, event-driven functions.

Watch: Knative Functions

Serverless workflows

Define and execute event-driven, serverless applications effortlessly. Knative enables you to automate tasks, integrate services, and scale workflows without the hassle of managing the underlying infrastructure.

Watch: Serverless logic

What can you do with serverless?

Run serverless workloads

Learn the basics of running serverless workloads on OpenShift Serverless in this self-paced tutorial.

Launch the lab

Process IoT telemetry data

This video introduces the basics of Knative Serving, Eventing, and Functions, then walks through an example use case where telemetry data is collected from simulated vehicles, processed with OpenShift Serverless, and used to train a machine learning model.

Start the demo

Build apps around the event mesh

This solution pattern shows a simple yet elegant, correct, and comprehensive way of building software for both greenfield and legacy projects. This solution leverages the Red Hat OpenShift Serverless Eventing component, which implements the event mesh pattern.

Try the solution pattern

Sentiment analysis

Using event-driven architecture (EDA), this solution pattern connects to and consumes data from a number of systems, services, and data sources by responding to triggering events.

Try the solution pattern

See serverless in action

Technical demos

Knative and serverless technology

Phil Knezevich presents an interactive overview of Knative and serverless technology. 

Start the demo

How to create a Node.js serverless function with OpenShift

See how to create a function project based on a Node.js framework and use a container platform with function serverless capability.

Start the demo

OpenShift Serverless functions with Golang

Explore how to create a simple function project using Go, leveraging serverless function capabilities.

Start the demo

Introduction to eventing with OpenShift Serverless

Learn how to deploy an event-driven application within OpenShift Serverless. This video also introduces Knative Eventing and how Knative relates to OpenShift Serverless.

Start the demo

Automate pull request workflows

Learn how to use OpenShift Serverless and OpenShift Pipelines (based on Tekton) to build an automated workflow for GitHub pull requests, generating preview URLs and automatically building serverless containers.

Start the demo

Featured serverless resources

Article: OpenShift Serverless Functions 2

Evolve an application into a serverless model using Red Hat OpenShift. Learn...

Learning path: Secure coding - simple

Learn how to implement a back-end function and a front-end web application,...

Interactive tutorial

Learn how to deploy and run applications that scale up, or scale to zero,...


Java is an ideal language to create serverless functions because of its...

Build and run serverless applications with Red Hat OpenShift Serverless


Serverless articles

What I learned about Kubernetes and Knative Serverless

Explore on-demand sessions on Kubernetes-native serverless.

Read now

The benefits of serverless for the banking and financial industries

Learn how serverless can help developers working in the banking and financial industries.

Read now

Installing the Serverless Operator, Knative CLI, Knative Serving, and Knative Eventing

Learn how to make serverless seamless with Knative.

Read now

How SVA used OpenShift Serverless to kickstart cloud-native adoption and patterns

Learn how SVA, a Red Hat partner, turned to OpenShift Serverless to kickstart cloud-native adoption and patterns in a regulated environment.

Read now

Serverless communities

Knative project

Knative is an open source, enterprise-level solution to build serverless and event-driven applications.

Learn more

Serverless Workflow 

Serverless Workflow presents an open source, community-driven ecosystem tailored for defining and executing DSL-based workflows.

Learn more

SonataFlow

SonataFlow is an open source project for building cloud-native workflow applications. SonataFlow is an implementation of the CNCF Serverless Workflow specification.

Learn more