How to connect OpenShift Lightspeed MCP to your IDE

February 4, 2026
Diego Alvarez Ponce
Related topics:
Artificial intelligence, Developer productivity, Developer tools, GitOps, IDEs
Related products:
Red Hat OpenShift GitOps, Red Hat OpenShift, Red Hat OpenShift Container Platform, Red Hat OpenShift Lightspeed

    Modern Integrated Development Environments (IDEs) have evolved beyond simple code editors into intelligent platforms powered by AI assistants. These assistants can answer questions, generate code, and help debug issues. But they often lack deep expertise in specialized platforms like Red Hat OpenShift. This is where Model Context Protocol (MCP) comes into play. MCP is an open standard that allows AI assistants to connect with external tools, extending their capabilities far beyond their base knowledge and providing them access to specialized and real-time information.

    In this article, we'll walk through a proof of concept that connects an MCP server for Red Hat OpenShift Lightspeed to your preferred IDE. OpenShift Lightspeed, Red Hat's AI assistant, is purpose-built for OpenShift, grounded in official documentation, and capable of interacting with your cluster. By exposing OpenShift Lightspeed through an MCP server, we bring that expertise directly into your development workflow, allowing you to ask OpenShift questions, generate configurations, and even query cluster resources without leaving your editor.

    Prepare your environment

    For this article, you must have the following prerequisites:

    • A code editor with an AI assistant that supports MCP (e.g., Claude Code, Cursor, VS Code with GitHub Copilot, etc.).
    • Access to a running OpenShift cluster.
    • The OpenShift Lightspeed operator installed and connected to a supported LLM provider.
    • (Optional) The Cluster Interaction feature enabled in the OLSConfig custom resource, if you want to query live cluster resources.

    For reference, my environment uses Cursor as the IDE and an OpenShift 4.20 cluster with OpenShift Lightspeed 1.0.8, configured to communicate with a GPT-4 model hosted on Azure OpenAI.

    Additionally, you'll need to clone the MCP server repository to your local machine. This repository contains all the logic and dependencies required to run the OpenShift Lightspeed MCP server:

    git clone https://github.com/thoraxe/ols-mcp.git

    Make note of the full path to this directory, because you'll need it later to configure your IDE.
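
    One note: the configuration we'll use later launches the server through uv, Astral's Python package and project manager, so uv needs to be installed on your machine. It's published on PyPI, so one way to get it (Astral's standalone installer works as well) is:

    pip install uv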

    Exposing the OpenShift Lightspeed service

    Next, we need to expose the OpenShift Lightspeed service so we can access it from outside the cluster. Apply the following YAML to create a secure route pointing to the lightspeed-app-server service. We use reencrypt TLS termination so that traffic remains encrypted both outside and inside the cluster.

    apiVersion: route.openshift.io/v1
    kind: Route
    metadata:
      name: ols-route
      namespace: openshift-lightspeed
    spec:
      to:
        name: lightspeed-app-server
        weight: 100
        kind: Service
      port:
        targetPort: https
      tls:
        termination: reencrypt
      wildcardPolicy: None
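
    Assuming you saved the manifest as ols-route.yaml (a filename chosen just for this example), you can apply it and read back the hostname the router assigned:

    # Create the route, then print its generated hostname
    oc apply -f ols-route.yaml
    oc get route ols-route -n openshift-lightspeed -o jsonpath='{.spec.host}'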

    Once applied, you'll get a URL that exposes the OpenShift Lightspeed API, such as https://ols-route-openshift-lightspeed.apps.mycluster.example.com. Keep this handy for the MCP configuration.

    You'll also need an authentication token. Retrieve it by running the following command:

    oc whoami -t

    The output will look something like sha256~NW5MJOynIEABEj_7pSna....
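
    Optionally, you can smoke-test the route and token together before touching the IDE. The request below assumes the /v1/query endpoint used by the upstream lightspeed-service project; adjust the path if your version differs. The -k flag skips certificate verification, so drop it if your router certificate is trusted:

    # Hypothetical smoke test; expects a JSON answer back from OpenShift Lightspeed
    curl -k "https://ols-route-openshift-lightspeed.apps.mycluster.example.com/v1/query" \
      -H "Authorization: Bearer $(oc whoami -t)" \
      -H "Content-Type: application/json" \
      -d '{"query": "What is a route in OpenShift?"}'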

    With both the route URL and the authentication token ready, OpenShift Lightspeed is now accessible from outside the cluster. Let's move on to configuring the IDE.

    Configure your IDE to use the MCP server

    While the process for configuring MCP servers varies slightly between IDEs, the general approach is similar. I'll walk through the steps in Cursor as an example.

    Open Cursor and select any project you want to use. You'll see the AI assistant interface on the right side of your screen.

    To enable the MCP server, go to Cursor Settings > Tools & MCP > New MCP Server. This will take you to the mcp.json file used in Cursor to configure the MCP servers available to the IDE. If you haven't configured any before, you'll see an empty server list.

    Add the following configuration, adjusting the parameters to match your environment:

    {
      "mcpServers": {
        "openshift-lightspeed": {
          "command": "uv",
          "args": ["--directory", "/path/to/your/folder/ols-mcp", "run", "python", "-m", "ols_mcp.server"],
          "env": {
            "OLS_API_URL": "https://ols-route-openshift-lightspeed.apps.mycluster.example.com",
            "OLS_API_TOKEN": "sha256~NW5MJOynIEABEj_7pSna...",
            "OLS_TIMEOUT": "30.0",
            "OLS_VERIFY_SSL": "true"
          }
        }
      }
    }

    Let's break down the key parameters:

    • args: Includes the absolute path to the cloned ols-mcp repository as the --directory value. This tells uv where to find the project before it runs the MCP server.
    • OLS_API_URL: The route URL we created earlier to expose OpenShift Lightspeed. Replace this with your own route.
    • OLS_API_TOKEN: The bearer token obtained from the cluster. This authenticates requests to the OpenShift Lightspeed API.
    • OLS_TIMEOUT: How long (in seconds) to wait for a response before timing out. Increase this if you're on a slow network or expect complex queries.
    • OLS_VERIFY_SSL: Set to true to verify SSL certificates.
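
    You can sanity-check the configuration by running the same command Cursor will launch, with the same environment variables (substitute your own path, URL, and token). The server should start and sit waiting for MCP messages on stdin without printing errors:

    OLS_API_URL="https://ols-route-openshift-lightspeed.apps.mycluster.example.com" \
    OLS_API_TOKEN="$(oc whoami -t)" \
    OLS_TIMEOUT="30.0" \
    OLS_VERIFY_SSL="true" \
    uv --directory /path/to/your/folder/ols-mcp run python -m ols_mcp.server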

    Save the file and verify the MCP server is running correctly. In Cursor, open the terminal panel at the bottom of the window. Switch to the Output tab and select MCP: user-openshift-lightspeed from the dropdown menu. Figure 1 displays the server's initialization logs.

    Figure 1: The MCP logs displayed in the Cursor console.

    If the connection is successful, you should see a message indicating that one tool has been discovered: openshift-lightspeed. This confirms the MCP server is running and ready to receive queries from the AI assistant.

    Interacting with OpenShift Lightspeed via MCP

    Now we can start using the AI chat interface to interact with our OpenShift Lightspeed instance. Be sure to enable Agent mode so the model is allowed to call tools. The default model in Cursor is Claude.

    We can start by asking a simple OpenShift question:

    How do I create a route for the lightspeed-app-server service?

    The AI assistant will likely ask for permission to use the openshift-lightspeed tool. Once you approve it, you will see a response. Here's what OpenShift Lightspeed returned (Figure 2):

    Figure 2: The response to the question: How do I create a route for the lightspeed-app-server service?

    In the meantime, the following has happened behind the scenes. It's worth noting that this workflow involves two AI models working in sequence.

    1. The IDE's model (Claude, in Cursor's case) receives your question and decides whether to use the openshift-lightspeed tool. It may even rephrase or refine your question before forwarding it to the MCP server.
    2. Your question is sent to the OpenShift Lightspeed service via the MCP server (a JSON-RPC tool call like the sketch after this list).
    3. OpenShift Lightspeed retrieves relevant sections from the official OpenShift documentation using Retrieval-Augmented Generation (RAG).
    4. The retrieved documentation, along with your question, is sent to the LLM configured in OpenShift Lightspeed (in our case, GPT-4).
    5. The model generates a response grounded in official OpenShift documentation.
    6. The response travels back through the MCP server to your IDE's AI assistant, where Claude may reformat or summarize it before presenting it to you.
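
    To make step 2 concrete: MCP messages are JSON-RPC 2.0, so the tool call the IDE sends to the ols-mcp server looks roughly like the sketch below. The tools/call method and params structure come from the MCP specification; the exact argument name is an assumption, since it's defined by the tool schema the server advertises:

    {
      "jsonrpc": "2.0",
      "id": 1,
      "method": "tools/call",
      "params": {
        "name": "openshift-lightspeed",
        "arguments": {
          "query": "How do I create a route for the lightspeed-app-server service?"
        }
      }
    }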

    This two-model architecture combines the strengths of both: your IDE's assistant handles the conversation flow and tool orchestration, while OpenShift Lightspeed provides OpenShift-specific expertise backed by official documentation, reducing hallucinations and providing accurate, up-to-date guidance.

    Now let's take it a step further and query live cluster resources. This requires the Cluster Interaction feature to be enabled in your OpenShift Lightspeed instance.
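
    If you haven't enabled it yet, Cluster Interaction is toggled in the OLSConfig custom resource. In recent operator releases this is the introspectionEnabled field; verify the field name against your version's documentation:

    # Enable Cluster Interaction on the default OLSConfig instance
    oc patch olsconfig cluster --type merge -p '{"spec": {"ols": {"introspectionEnabled": true}}}'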

    Let's ask about the pods running in the cluster:

    Show me the pods running in the openshift-lightspeed namespace

    Figure 3 displays the response:

    Figure 3: The response to the query: Show me the pods running in the openshift-lightspeed namespace.

    As before, the IDE's model decides to call the openshift-lightspeed tool. But this time, something extra happens inside OpenShift Lightspeed. When Cluster Interaction is enabled, OpenShift Lightspeed has access to a Kubernetes MCP server running within the cluster. The model configured in our OpenShift Lightspeed instance calls the necessary tools from that MCP server to retrieve information about the requested cluster resources. The collected data is then sent to GPT-4 to compose a response, which travels back to our IDE.

    Combining MCP servers

    One of the most powerful aspects of MCP is the ability to combine multiple servers in a single workflow. For example, you could add the GitHub MCP server alongside OpenShift Lightspeed to enable a complete GitOps pipeline directly from your IDE.

    Imagine this scenario: As an application developer, you ask the AI assistant to generate a deployment manifest for your application. OpenShift Lightspeed provides the YAML based on best practices and official documentation. Then, in the same conversation, you ask the assistant to push that manifest to your GitOps repository. The GitHub MCP server handles the commit and push. Finally, if Red Hat OpenShift GitOps is configured in the cluster, it automatically detects and synchronizes the change, deploying the application without ever leaving your IDE.
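
    As a sketch of what that could look like in Cursor, the configuration below registers GitHub's MCP server alongside ours. It assumes you run GitHub's server from its published container image and authenticate with a personal access token; check the github-mcp-server project's README for the current invocation:

    {
      "mcpServers": {
        "openshift-lightspeed": {
          "command": "uv",
          "args": ["--directory", "/path/to/your/folder/ols-mcp", "run", "python", "-m", "ols_mcp.server"],
          "env": {
            "OLS_API_URL": "https://ols-route-openshift-lightspeed.apps.mycluster.example.com",
            "OLS_API_TOKEN": "sha256~NW5MJOynIEABEj_7pSna...",
            "OLS_TIMEOUT": "30.0",
            "OLS_VERIFY_SSL": "true"
          }
        },
        "github": {
          "command": "docker",
          "args": ["run", "-i", "--rm", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN", "ghcr.io/github/github-mcp-server"],
          "env": {
            "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-github-pat>"
          }
        }
      }
    }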

    But that was just one example. The MCP ecosystem is growing rapidly, with new servers being developed every day. By combining them with OpenShift Lightspeed, you can build powerful workflows and find new ways to optimize your OpenShift operations.

    Final thoughts

    This article explored a proof of concept. Please be aware that the described feature and configuration are not currently supported. The OpenShift Lightspeed team is actively seeking your feedback. Is this something you would like to see officially supported? Contact us at openshift-lightspeed-contact-requests@redhat.com. We’d love to hear your thoughts.
