
Red Hat Summit: Functions as a Service with OpenWhisk and OpenShift

May 16, 2018
Doug Tidwell
Related topics:
Kubernetes, Serverless
Related products:
Red Hat OpenShift Container Platform, Red Hat OpenShift

    Serverless computing (often called Functions-as-a-Service, or FaaS) is one of the hottest emerging technologies today. The OpenWhisk project, currently in incubation at Apache, is an open-source implementation of FaaS that lets you create functions that are invoked in response to events. Our own Brendan McAdams gave a presentation and demo that explained the basics of serverless, how the OpenWhisk project works, and how to run OpenWhisk in OpenShift.

    Brendan outlined the three properties of a serverless / FaaS platform:

    1. It responds to events by invoking functions
    2. Functions are loaded and executed on demand
    3. Functions can be chained together, triggered by events from outside the FaaS platform itself

    Before we go on, a terminology note. "Functions as a Service" and "Serverless" are normally used interchangeably. Lately, however, people are also using the word "serverless" to mean "anything that doesn't require you to fire up and manage a VM."¹ To be clear, what we're talking about here is FaaS.

    It's useful to quote the official definition of OpenWhisk from the project's website:

    ...[A] serverless, open source cloud platform that executes functions in response to events at any scale.

    The event-driven nature of OpenWhisk is powerful, but its ability to scale makes certain applications economically feasible for the first time. Without a FaaS platform, you'd have to provision the necessary resources to handle whatever loads the world might throw at your application. With FaaS, you don't provision anything. You simply tell the system, "Here are some things that might happen, and here's the code you should run when they do." The architecture of your app is defined declaratively, and the FaaS provider has to deliver the resources to handle the events.

    Another part of the power of FaaS is that the events can come from anywhere, including sources outside the FaaS platform. For example, a change to a database might generate an event. OpenWhisk allows you to create your own events as well. Red Hat is working on an AMQP event provider, and others have built code that generates events from things like git commits or posts to a Slack channel. Finally, because all functions in OpenWhisk have a REST API, they can also be invoked directly by another piece of code or from the wsk command-line tool. Brendan's demos used the command line extensively.
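As a sketch of what such a direct REST invocation involves, the helper below assembles the pieces of a blocking call to an action (the API host, namespace, action name, and credentials are placeholders, not real endpoints). The `blocking=true&result=true` query parameters ask the platform to wait for the activation and return just the result:

```python
import base64
import json

def build_invoke_request(apihost, namespace, action, auth, params):
    """Assemble URL, headers, and body for a blocking OpenWhisk
    action invocation over the REST API."""
    url = (f"https://{apihost}/api/v1/namespaces/"
           f"{namespace}/actions/{action}?blocking=true&result=true")
    headers = {
        "Content-Type": "application/json",
        # wsk stores credentials as "uuid:key"; the REST API
        # expects them as HTTP Basic auth.
        "Authorization": "Basic " + base64.b64encode(auth.encode()).decode(),
    }
    body = json.dumps(params)
    return url, headers, body
```

The returned pieces can be handed to any HTTP client; this is the same endpoint the wsk CLI calls on your behalf.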

    A trigger is a class of events, such as all of the changes that are made to a database. Once a trigger is defined, you can create rules to determine which function(s) should be invoked when an event occurs. In OpenWhisk terminology, a function is called an action. All of the terminology is explained in a discussion of the high-level programming model on the project's website. For simplicity's sake, we'll continue to call a function a function.
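As a toy illustration of that model (this is not OpenWhisk's implementation, just the shape of the idea), you can picture rules as a mapping from trigger names to the actions they should fire:

```python
# Rules map a named trigger to the actions it should invoke.
# In this sketch an "action" is just a function that takes and
# returns a JSON-style dict.
rules = {}  # trigger name -> list of actions

def create_rule(trigger, action):
    """Associate an action with a trigger."""
    rules.setdefault(trigger, []).append(action)

def fire(trigger, event):
    """Firing a trigger invokes every action its rules point at."""
    return [action(event) for action in rules.get(trigger, [])]

# A hypothetical action responding to database changes:
def log_change(event):
    return {"logged": event["table"]}

create_rule("db-change", log_change)
```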

    The universal data format in the world of OpenWhisk is JSON, and the most commonly used languages are Java, Node.js, and Python. In those three languages you'll work with JSON via the GSON library, native JavaScript objects, and Python dictionaries, respectively. As you would expect, support for other languages, including Swift and PHP, is actively being developed.
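In Python, for instance, an action is a file whose entry point is a function named main: it accepts a dictionary (the decoded JSON event) and returns one (the JSON result). A minimal greeting action looks like this:

```python
# hello.py -- a minimal OpenWhisk Python action.
# OpenWhisk passes the event's JSON payload in as a dict and
# serializes the returned dict back to JSON.
def main(params):
    name = params.get("name", "stranger")
    return {"greeting": "Hello, " + name}
```

You'd register it with something like `wsk action create hello hello.py` and invoke it with `wsk action invoke hello --result --param name World`.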

    It's important to remember that your functions are stateless. They are given some data generated by an event, and then they process that data and return the results. If you call the function again, it has no knowledge of what happened before. If many copies of the same function are running at the same time, they have no knowledge of each other.

    If you're familiar with Unix pipes, you'll grasp the significance of sequences right away. You can create a new function by composing a sequence of functions that should be invoked in response to an event. The output of the first function becomes the input to the second, the output of the second becomes the input to the third, and so on. In Brendan's example, the first function took a name ({"name": "Brandon McAdams"}) and returned a reversed version of that name ({"name": "McAdams, Brandon"}). That JSON was then passed to a Hello World function that returned a greeting with the reversed name returned by the first function ({"greeting": "Hello, McAdams, Brandon"}). Keep in mind that each function in a sequence needs to understand the JSON from the previous one. If you added a third function to the example sequence, it would need to look for a parameter named greeting.
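Brendan's two-step example can be sketched in Python like this. The action names are hypothetical, and the sequence loop at the bottom stands in for the plumbing OpenWhisk provides when you create a sequence (roughly `wsk action create reversedHello --sequence reverseName,hello`):

```python
def reverse_name(params):
    # {"name": "Brandon McAdams"} -> {"name": "McAdams, Brandon"}
    first, last = params["name"].split(" ", 1)
    return {"name": f"{last}, {first}"}

def hello(params):
    # {"name": ...} -> {"greeting": "Hello, ..."}
    return {"greeting": "Hello, " + params["name"]}

def run_sequence(params, actions):
    # Stand-in for OpenWhisk's sequencing: each action's output
    # dict becomes the next action's input dict.
    for action in actions:
        params = action(params)
    return params
```

A third action appended to this sequence would receive the final dict, so it would need to look for the greeting parameter rather than name.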

    The entire programming model is very flexible. In response to an event, the system can invoke a single function, a sequence of functions, or multiple functions.

    As you'd expect from the title of the session, Brendan discussed how to run OpenWhisk inside OpenShift. We have a GitHub repo that contains the templates and Docker images you need to deploy OpenWhisk to your OpenShift project. Follow the instructions and you should be up and running with your very own FaaS platform.

    Learning how to build useful applications with a set of serverless functions is a crucial skill for any modern developer. Take a look at our GitHub repo and get started today!


    ¹ To cite a specific example from the world of containers, imagine a Kubernetes environment in which one of the nodes is in fact a cloud that provisions and deprovisions VMs automatically to host pods in the cluster. You can argue the words "serverless computing" describe that scenario because you're not working with the VMs, but that's not what we're talking about. We're firmly and fundamentally focused on functions here.

    Last updated: September 3, 2019
