
Deploying Microservices on OpenShift using Kubernetes

August 16, 2016
Christopher Tozzi
Related topics:
Kubernetes, Containers, Microservices
Related products:
Red Hat OpenShift Container Platform

    You’ve heard of microservices. You’ve heard of OpenShift. You’ve heard of Kubernetes. Actually, you may already have considerable experience with each of these three concepts and tools.

    But do you know how to combine all of them to deploy microservices effectively? If not, this article is for you. Below, I’ll explain how microservices, OpenShift, and Kubernetes fit together, and provide an overview of how you can leverage the orchestration tools provided by OpenShift and Kubernetes to build and deploy microservices at scale.

    What are microservices?

    First things first: Let’s define what "microservice" means for our purposes.

    Different people use this term to refer to different things. Some, especially in the media, seem to treat microservices as merely a synonym for containers. But that’s a simplistic interpretation.

    Or, if you’re old school, you may think of the monolithic kernel vs. microkernel debate (which was made famous by the Torvalds-Tanenbaum flame war of 1992) when you hear someone mention microservices. Microkernels are a similar concept, but they’re not what microservices are all about today.

    Personally, I think Martin Fowler’s definition of microservices sums them up best. In his telling, a microservice is a type of service that has five main features:

    • Each instance of the service is independently deployable.
    • Each part of the service is scalable.
    • The service is composed of modular parts.
    • Each part of the service can be implemented in different ways (using different programming languages, for instance), but all parts are compatible with one another.
    • Each service can be managed by a different team.

    These characteristics distinguish microservices from monolithic architectures. In a monolith, multiple software functions are fused into a single process, which is neither modular, nor scalable, nor easy to deploy across multiple hosts.

    How does OpenShift handle services?

    To understand how microservices fit into OpenShift, you have to understand the basics of the OpenShift architecture, and how OpenShift manages the apps running on it.

    In essence, an OpenShift cluster is composed of the following parts, which all interact with one another in various ways:

    • Nodes: These are the physical or virtual servers that host OpenShift. You can (and should, if you want scalability and high reliability) have multiple nodes in your OpenShift cluster.
    • Pods: Just like on Kubernetes, OpenShift pods are groups of containers (or a single container, in some cases) that work together. Each pod has a unique internal IP address, and can access multiple ports on that address.
    • Containers: These are the things that run inside pods. You can have multiple containers of different types inside a single pod; for example, you could have a web server container running alongside a data container to store information for the web server. Each container can be assigned to a different port on its pod’s IP address.
    • Services: These are what you get when you put multiple pods together to deliver a single app, such as a web server complete with both logic and data functions.
    • Routes: Each OpenShift service receives an internal IP address and port number. If you want to expose a service to the external network or the public Internet, you use routes. Routes translate between the internal and external networking configuration of your service. Routes are kind of like good old NAT servers, if you want to think of them that way.

    These are the essentials that you need to understand about the OpenShift architecture to get started with microservices. For the longer version of all of the above, check out the OpenShift documentation.
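    To make those pieces concrete, here is a minimal sketch of a pod, a service that selects it, and a route that exposes it to the outside world. All names, labels, and the image reference are hypothetical placeholders, and in practice you would usually create pods indirectly through a replication controller or deployment rather than by hand:

    ```yaml
    # Hypothetical example: names, labels, and image are placeholders.
    apiVersion: v1
    kind: Pod
    metadata:
      name: hello-web
      labels:
        app: hello-web
    spec:
      containers:
        - name: web
          image: registry.example.com/hello-web:latest
          ports:
            - containerPort: 8080   # container port on the pod's internal IP
    ---
    apiVersion: v1
    kind: Service
    metadata:
      name: hello-web
    spec:
      selector:
        app: hello-web             # groups all pods carrying this label
      ports:
        - port: 80                 # service port
          targetPort: 8080         # forwarded to the container port
    ---
    apiVersion: route.openshift.io/v1
    kind: Route
    metadata:
      name: hello-web
    spec:
      to:
        kind: Service
        name: hello-web            # route exposes this service externally
    ```

    Note how the service finds its pods via the label selector, and the route in turn points only at the service, never at individual pods.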

    How Kubernetes makes microservices easy on OpenShift

    OK, so where does Kubernetes come in?

    The short answer is quietly. By that I mean that Kubernetes does most of its work in the background, without requiring much effort on the part of administrators. That’s the beauty of using Kubernetes to manage your OpenShift microservices.

    More specifically, Kubernetes automatically performs several key functions, which ensure that your microservices run smoothly. Those functions include:

    • Load balancing. Kubernetes automatically decides, based on policies that you define ahead of time, which of your OpenShift nodes need pods placed upon them.
    • High availability. Kubernetes automatically detects when a node fails and reassigns the pods from the failed node. The result is automatic failover, which provides high availability for your apps, even when parts of your infrastructure are hit with problems like a network outage or a loss of power.
    • Scaling. By automating the placement of pods according to policies set by a replication controller, Kubernetes also ensures that your services remain scalable as demand fluctuates.

    On the scaling note, it’s also worth bearing in mind that Kubernetes lets you scale different parts of your services independently. For example, if you have a container-based web server running on OpenShift and you need to increase the instances of the web server itself, but don’t require more of the data containers running alongside it, Kubernetes can scale up just the web server part of the service.

    To put it another way, the scalability of your OpenShift cluster is granular. That reduces complexity and optimizes resource usage.
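    As a rough sketch of what that granularity looks like, independent scaling comes down to the replica count on each replication controller: the web tier and the data tier each get their own controller, so their replica counts can change independently. The names, labels, and counts below are illustrative, not taken from any real deployment:

    ```yaml
    # Illustrative only: scale the web tier without touching the data tier.
    apiVersion: v1
    kind: ReplicationController
    metadata:
      name: hello-web
    spec:
      replicas: 5                  # raised to meet web-tier demand
      selector:
        app: hello-web
      template:
        metadata:
          labels:
            app: hello-web
        spec:
          containers:
            - name: web
              image: registry.example.com/hello-web:latest
    ---
    apiVersion: v1
    kind: ReplicationController
    metadata:
      name: hello-data
    spec:
      replicas: 2                  # left unchanged
      selector:
        app: hello-data
      template:
        metadata:
          labels:
            app: hello-data
        spec:
          containers:
            - name: data
              image: registry.example.com/hello-data:latest
    ```

    In practice you could apply such a change from the command line with `oc scale rc hello-web --replicas=5`, leaving the data-tier controller untouched.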

    Conclusion

    By now, the advantages of pairing microservices with a Kubernetes-powered OpenShift cluster should be clear. You get an infrastructure that is agile from top to bottom. You also maximize resource efficiency, and minimize the amount of administrative effort required to keep things running.

    About Christopher Tozzi

    Chris Tozzi has worked as a journalist and Linux systems administrator. He has particular interests in open source, agile infrastructure and networking. He is Senior Editor of content and a DevOps Analyst at Fixate IO.

    Last updated: March 16, 2018
