Transform Kiali with OpenShift Lightspeed and Kubernetes MCP

January 29, 2026
Alberto Jesus Gutiérrez Juanes
Related topics:
Artificial intelligence, Kubernetes, Observability, Service mesh
Related products:
Red Hat OpenShift Container Platform, Red Hat OpenShift Lightspeed, Red Hat OpenShift Service Mesh

    In our previous article about the Kiali toolset, we explored how Kiali acts as the "eyes" of your service mesh, providing unparalleled visibility into traffic topology, health, and metrics. We showed you how to manually inspect the graph, validate Istio configurations, and troubleshoot mTLS issues. But what if you didn't have to manually hunt for errors? What if you could just ask your cluster what’s wrong?

    In this article, we take a leap forward by deploying the Kubernetes Model Context Protocol (MCP) server with its Kiali toolset and connecting it to Red Hat OpenShift Lightspeed. This integration allows the OpenShift Lightspeed AI assistant to interface directly with Kiali, giving the AI visibility into your service mesh to assist with troubleshooting and configuration.

    Set up OpenShift Lightspeed

    Before we dive into the installation, ensure you have the following prerequisites:

    1. An OpenShift cluster (version 4.15+ recommended).
    2. Red Hat OpenShift Service Mesh installed and running.
    3. Kiali (or the OpenShift Service Mesh console) deployed and accessible.
    4. OpenShift Lightspeed operator installed on your cluster.

    For an enterprise-grade experience, you can integrate this toolset directly into OpenShift Lightspeed. This allows any user on the cluster to utilize Kiali's capabilities through the OpenShift Lightspeed chat interface.

    Since OpenShift Lightspeed runs inside the cluster, we need to deploy the MCP server as a service rather than running it locally on your laptop.

    Step 1: Create a ConfigMap for Kiali configuration

    Create a file named mcp-osl-config.toml in a known location (e.g., ~/mcp-osl-config.toml):

    toolsets = ["core","kiali"]
    read_only = true
    [toolset_configs.kiali]
    url = "https://kiali-istio-system.apps-crc.testing/"
    insecure = true

    Then, upload your TOML configuration to the cluster.

    oc create configmap kubernetes-mcp-config \
      --from-file=mcp-osl-config.toml=~/mcp-osl-config.toml \
      -n istio-system
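
    Equivalently, if you manage manifests declaratively, the same ConfigMap can be written as YAML. This is a sketch of the imperative command above; note that the data key (mcp-osl-config.toml) must match the filename the Deployment's --config flag points to:

    apiVersion: v1
    kind: ConfigMap
    metadata:
      name: kubernetes-mcp-config
      namespace: istio-system
    data:
      mcp-osl-config.toml: |
        toolsets = ["core","kiali"]
        read_only = true
        [toolset_configs.kiali]
        url = "https://kiali-istio-system.apps-crc.testing/"
        insecure = true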

    Step 2: Deploy the MCP server

    Create a deployment that runs the server. This exposes the MCP over HTTP so OpenShift Lightspeed can connect to it.

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: kubernetes-mcp-server
      namespace: istio-system
    spec:
      replicas: 1
      selector:
        matchLabels:
          app: kubernetes-mcp-server
      template:
        metadata:
          labels:
            app: kubernetes-mcp-server
        spec:
          serviceAccountName: kubernetes-mcp-server
          automountServiceAccountToken: true
          containers:
            - name: mcp-server
              image: quay.io/containers/kubernetes_mcp_server:latest # Check for the latest version
              args:
                - "--port=8080"
                - "--config=/etc/mcp/mcp-osl-config.toml"
              ports:
                - containerPort: 8080
              volumeMounts:
                - name: config-vol
                  mountPath: /etc/mcp
          volumes:
            - name: config-vol
              configMap:
                name: kubernetes-mcp-config
    ---
    apiVersion: v1
    kind: Service
    metadata:
      name: kubernetes-mcp-server
      namespace: istio-system
    spec:
      selector:
        app: kubernetes-mcp-server
      ports:
        - port: 8080
          targetPort: 8080
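
    The Deployment above references a kubernetes-mcp-server service account, which must exist before the pod can start. A minimal sketch follows; the RBAC scope shown (the built-in read-only view ClusterRole) is an assumption, so tighten or broaden it to match what you want the MCP server to see:

    apiVersion: v1
    kind: ServiceAccount
    metadata:
      name: kubernetes-mcp-server
      namespace: istio-system
    ---
    apiVersion: rbac.authorization.k8s.io/v1
    kind: ClusterRoleBinding
    metadata:
      name: kubernetes-mcp-server-view
    roleRef:
      apiGroup: rbac.authorization.k8s.io
      kind: ClusterRole
      name: view   # built-in read-only role; adjust as needed
    subjects:
      - kind: ServiceAccount
        name: kubernetes-mcp-server
        namespace: istio-system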

    Step 3: Register the tool with OpenShift Lightspeed

    Once the MCP server is running, OpenShift Lightspeed needs to know it exists. Depending on your version, you may need to configure the OLSConfig (OpenShift Lightspeed configuration) resource to allowlist the new tool endpoint or enable the service mesh plug-in as follows:

    1. Navigate to the Lightspeed Operator settings in the OpenShift console.
    2. Look for Tool Definitions or MCP Servers.
    3. Add the service URL of your new deployment (e.g., http://kubernetes-mcp-server-istio-system.apps-crc.testing/mcp).

    You can also modify the YAML directly (Figure 1).

    Figure 1: This shows the OpenShift Lightspeed configuration settings for the Kiali toolset.
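
    If you edit the YAML directly, the registration lives in the OLSConfig custom resource. The exact field names vary by OpenShift Lightspeed version, so the sketch below is illustrative only (mcpServers and its children are assumptions; check the CRD schema shipped with your operator). Using the in-cluster service DNS name avoids depending on an external route:

    apiVersion: ols.openshift.io/v1alpha1
    kind: OLSConfig
    metadata:
      name: cluster
    spec:
      # Illustrative only -- verify these field names against your
      # Lightspeed operator's OLSConfig CRD before applying.
      mcpServers:
        - name: kubernetes-mcp-server
          url: http://kubernetes-mcp-server.istio-system.svc.cluster.local:8080/mcp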

    Step 4: Chat with your mesh

    Now for the magic! Open the OpenShift Lightspeed chat window in the console and try this prompt (Figure 2):

    User: "Analyze the traffic flow for the 'payments' service."

    OpenShift Lightspeed: "I checked Kiali, and the 'payments' service is receiving traffic from 'checkout' but is experiencing a 15% error rate on the v2 workload..."

    Figure 2: OpenShift Lightspeed talks to the MCP server in the chat window.

    OpenShift Lightspeed returns an explanation in Figure 3.

    Figure 3: OpenShift Lightspeed returns an explanation.

    Final thoughts

    By installing the Kiali MCP integration, we've transformed Kiali from a passive dashboard into an active data source for AI-driven operations. This setup reduces the "time-to-insight" for SREs and developers by combining the comprehensive Kiali toolset with the conversational power of OpenShift Lightspeed. You can watch the demo on YouTube.
