Zero trust GitOps: Build a secure, secretless GitOps pipeline

March 13, 2026
Nick Png
Related topics:
GitOps
Related products:
Red Hat OpenShift GitOps, Red Hat OpenShift

    Within the Red Hat OpenShift ecosystem, OpenShift GitOps is one of the most popular Day 2 operators. This powerful tool enables teams to use Git as a single source of truth for managing cluster configurations and application deployments. Built around Argo CD, OpenShift GitOps automates deployments and infrastructure updates using declarative, predictable workflows, ensuring consistency, reducing errors, and maintaining the desired state across multi-cluster environments. In November 2025, we introduced the general availability of the external secrets operator (ESO) for OpenShift, which brings the ability to securely manage credentials sourced from an external secrets management repository (such as HashiCorp Vault) within an OpenShift cluster.

    Now we're introducing a critical integration between OpenShift GitOps and the external secrets operator for OpenShift that enables OpenShift GitOps to authenticate to your repositories with short-lived tokens rather than traditional user access tokens.

    Why do short-lived tokens matter?

    In the world of zero trust, there are two critical principles that must always be observed:

    • All workloads must have their own identity
    • Identity must come with the least possible privileges to complete the task

    Short-lived tokens play a key role in upholding these two principles. A user, or workload, is required to authenticate against an identity provider in order to obtain a token that can be used to access secure data or applications. The token that's returned encodes the user's permissions and an expiration date, meaning the token becomes invalid after a certain period of time.

    Figure 1 illustrates the potential magnitude (or "blast radius") of a breach across two dimensions: time and severity. In the event of a breach, the best-case scenario is that the compromised credential has both a short life and minimal permissions. By comparison, the worst-case scenario is that the compromised credential never expires and gives the attacker complete administrator-level access to the environment.

    Figure 1: Impact quadrant of a potential breach, with a perpetual password and full cluster admin permissions ranking as the greatest impact for the longest time. Short-lived tokens with identity-based permissions ranks as low impact for a short amount of time.

    Short-lived tokens can improve security along both dimensions. First, they minimize the blast radius of a breach. One of the core values of short-lived tokens is, as the name suggests, their lifespan: by default, these credentials live only for a short period of time. This means that even if a bad actor were to obtain a token, they would only be able to use it briefly. As a bonus, short credential lifespans force the actual users to refresh their tokens at periodic intervals, allowing the identity provider to re-validate that the user is indeed who they say they are.

    In terms of breach severity, tokens are direct representations of the user's permissions: no more, and no less. This helps mitigate the impact of a breach because the compromised credential contains only a limited set of permissions scoped to that specific user.

    Finally, one of the most common challenges in adopting short-lived tokens is that support for each specific system must be implemented in the requesting application. This adds to the software maintenance burden by introducing additional dependencies, and it can quickly become complex when multiple systems require authentication with short-lived tokens. This is a problem the external secrets operator solves with its Generator feature, which is flexible enough to connect to almost any provider with little to no additional code.

    Generating credentials on demand

    The primary function of the external secrets operator is to automate the management of credentials on Kubernetes clusters, but one of its less widely known capabilities is generators. The ESO API enables developers and vendors to define custom resources that produce new values based on predefined inputs. In combination with the basic functionality of ESO, this becomes an extremely powerful tool.

    For example, the traditional GitOps workflow for GitHub private repositories uses a credential like a username and password pair or an access token. These credentials are stored as a Kubernetes Secret, labeled properly so that GitOps can find the secret. The issue here is that these credentials have no built-in expiry period and are very difficult to track if leaked. This is where the external secrets operator generators come into play, and the best part is that your standard GitOps workflow remains completely unchanged.

    Available today is the GitHub Access Token Generator, which takes the form of a new CRD (GithubAccessToken). This CRD requires a few pieces of information (the GitHub App ID, installation ID, and signing key), which it uses to create an access token with a maximum lifespan of 60 minutes. Once the token is generated, ESO places it into a secret that GitOps can see. You can even specify exactly which repositories to grant access to, as well as read/write permissions.
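    To make this concrete, here is a minimal sketch of what a GithubAccessToken generator resource might look like. The App ID, installation ID, secret names, and repository name below are placeholders, and the exact schema may vary between ESO versions, so check the ESO documentation for your installed release:

```yaml
# Sketch of a GithubAccessToken generator resource (all values are placeholders).
apiVersion: generators.external-secrets.io/v1alpha1
kind: GithubAccessToken
metadata:
  name: github-token-generator
spec:
  # Identifiers from the GitHub App installed on the repository
  appID: "123456"
  installID: "12345678"
  # The GitHub App's PEM signing key, referenced from a Kubernetes Secret
  auth:
    privateKey:
      secretRef:
        name: github-app-pem
        key: privateKey
  # Scope the token to specific repositories and permissions
  repositories:
    - my-private-repo
  permissions:
    contents: read
```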

    With this generator in place, you can take precise control over the credentials passed to GitOps. It becomes significantly easier to create multiple automatically managed tokens, each authenticating GitOps to access a different repository or even branch within a repository, because you no longer need to manage deletion or rotation of these credentials. New tokens are automatically re-issued periodically and old tokens expire automatically.

    Putting all the pieces together

    Figure 2 is a high-level architecture diagram that highlights the key pieces that must be deployed to OpenShift or externally, and how they interact, in order. Assuming the GitHub PEM signing key is stored in an external secrets repository:

    1. SecretStore instructs ESO on how to connect
    2. ExternalSecret instructs ESO on what to get
    3. The GitHub PEM key is stored in OpenShift to sign token requests
    4. Generator uses PEM key to request a short-lived token
    5. Short-lived token is passed to ExternalSecret for storage
    6. Short-lived token is placed in a Secret that GitOps can see

    The end result is that GitOps is able to use the short-lived token to access the Git repository.

    Figure 2: A diagram illustrating the lifecycle of a short-lived token for GitOps as implemented in ESO.

    Prerequisites

    There are a few prerequisites that are not part of OpenShift. First, you must create a private GitHub repository and install a GitHub App on that repository. Optionally, you can also have a running external secrets repository, such as HashiCorp Vault.

    Once you have a GitHub App, you must know the app ID and installation ID, and have a generated PEM signing key. We recommend storing the PEM key in the external secrets repository and using ESO to pull the secret into OpenShift. This is more secure than placing the PEM key directly in OpenShift as a Kubernetes Secret.
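    As a sketch, the ExternalSecret that pulls the PEM key out of the external secrets repository and into OpenShift might look like the following. The store name, remote key, and property are placeholders for your own Vault layout, and the apiVersion may differ between ESO releases:

```yaml
# Sketch: pull the GitHub App PEM key from the external secrets repository
# into a Kubernetes Secret that the generator can reference.
apiVersion: external-secrets.io/v1beta1
kind: ExternalSecret
metadata:
  name: github-app-pem
spec:
  refreshInterval: 1h
  secretStoreRef:
    name: vault-store          # The SecretStore that points at Vault
    kind: SecretStore
  target:
    name: github-app-pem       # Name of the resulting Kubernetes Secret
  data:
    - secretKey: privateKey
      remoteRef:
        key: secret/github-app # Path in Vault (placeholder)
        property: privateKey
```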

    The external secrets operator

    Fortunately, the OpenShift steps are fairly straightforward. First, make sure that the OpenShift GitOps operator and the external secrets operator are installed.

    Next, for the example described above, deploy these four ESO custom resources:

    • SecretStore: Instructs the operator on how to communicate with your external secrets repository of choice.
    • ExternalSecret: The first ExternalSecret tells the operator how to get the GitHub PEM key stored in the secrets repository, and where to place it in OpenShift.
    • Generator: Communicates with GitHub using the App ID, Installation ID, and the GitHub PEM key to request a new token from GitHub.
    • ExternalSecret: The second ExternalSecret obtains the token from the generator and places it into a Secret that GitOps can recognize and use.
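    For the first of these resources, a SecretStore pointed at a HashiCorp Vault instance might be sketched as follows. The server URL, mount path, and auth method are placeholders; Vault supports several auth methods, and Kubernetes auth is generally a better fit for production than a static token:

```yaml
# Sketch of a SecretStore for HashiCorp Vault (all values are placeholders).
apiVersion: external-secrets.io/v1beta1
kind: SecretStore
metadata:
  name: vault-store
spec:
  provider:
    vault:
      server: "https://vault.example.com:8200"
      path: secret            # KV mount point
      version: "v2"           # KV secrets engine version
      auth:
        # A token reference is the simplest option for a demo;
        # prefer Kubernetes auth in production.
        tokenSecretRef:
          name: vault-token
          key: token
```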

    For the purposes of this blog, we recommend setting the second ExternalSecret's spec.refreshInterval to a very small value (for example, 5 minutes). This helps demonstrate that the GitHub authentication token is renewed.

    With the ESO custom resources installed, you can view a secret containing a GitHub authentication token. If you check back after the refreshInterval has elapsed, you can confirm that the token has been renewed and differs from the previous one.

    The impact on GitOps

    In the example shown above, GitOps requires two things:

    1. A properly formatted secret that the GitOps operator can see
    2. A deployed GitOps Application Custom Resource that points the operator to the private repository created as part of the prerequisites

    In fact, the first part is already done. ESO handles the creation of the secret with advanced templating and can ensure that the correct labels and annotations are assigned to the secret so that GitOps can find the credential automatically.
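    As a sketch of that second ExternalSecret, the generator is referenced through dataFrom, and the target template stamps the Argo CD repository label and fields onto the resulting secret. The names, repository URL, and template keys are placeholders, and the generator's output key and exact schema may vary by ESO version:

```yaml
# Sketch: turn the generator's short-lived token into a repository
# credential secret that OpenShift GitOps (Argo CD) recognizes.
apiVersion: external-secrets.io/v1beta1
kind: ExternalSecret
metadata:
  name: gitops-repo-creds
  namespace: openshift-gitops
spec:
  refreshInterval: 5m          # Small value to demonstrate token renewal
  dataFrom:
    - sourceRef:
        generatorRef:
          apiVersion: generators.external-secrets.io/v1alpha1
          kind: GithubAccessToken
          name: github-token-generator
  target:
    name: gitops-repo-creds
    template:
      metadata:
        labels:
          # This label is how Argo CD discovers repository credentials
          argocd.argoproj.io/secret-type: repository
      data:
        type: git
        url: https://github.com/example-org/my-private-repo
        username: x-access-token     # GitHub App installation tokens
        password: "{{ .token }}"     # Token key emitted by the generator
```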

    To verify that GitOps is correctly configured, log into the GitOps web interface, click Settings, and then select Repositories. You should see a successful connection to your private repository.

    Finally, deploy the GitOps Application. GitOps uses the token generated by ESO to authenticate to the private GitHub repository and deploys the code within it. You will see a successfully synced application in the GitOps web interface, along with the Kubernetes resources deployed from the manifests in the repository.
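    A minimal Application pointing at the private repository might be sketched as follows (the repository URL, path, and namespaces are placeholders):

```yaml
# Sketch of a GitOps Application targeting the private repository.
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: zero-trust-demo
  namespace: openshift-gitops
spec:
  project: default
  source:
    repoURL: https://github.com/example-org/my-private-repo
    targetRevision: main
    path: manifests
  destination:
    server: https://kubernetes.default.svc
    namespace: zero-trust-demo
  syncPolicy:
    automated: {}
```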

    Try out Generators for yourself

    To wrap this up, if you are interested in seeing the whole process in action, take a look at this Arcade demo:

    It walks you through the process of deploying each required OpenShift resource, step by step. You can also find sample .yaml files in this GitHub repository.

    The external secrets operator can be an extremely powerful tool for Kubernetes administrators looking to improve the security posture of their clusters. There are many generators, including ones for short-lived Quay tokens and AWS STS tokens. These features enable administrators not only to automate the management of traditional "long-lived" passwords and credentials on OpenShift clusters with ESO, but also to dynamically create tokens for workloads as needed. If you find this topic interesting and would like to apply it yourself, check out the documentation for ESO and get started with OpenShift today.
