How to set up an MCP server for Red Hat Lightspeed

Move from dashboards to dialogue

December 1, 2025
Samiksha Saxena
Related topics:
Artificial intelligence, Data integration
Related products:
Red Hat Lightspeed

    Red Hat Lightspeed (formerly known as Red Hat Insights) delivers proactive analytics and helps teams improve reliability, streamline operations, and reduce manual analysis. The Model Context Protocol (MCP) is the new foundational technology that introduces a conversational layer, allowing an LLM (AI) to interact with your services using natural language. This means less time spent digging through dashboards and more time getting direct, actionable answers.

    Imagine you received a critical alert for a CVE. Instead of frantically searching through multiple dashboards and documentation, you can simply ask insights-mcp, "What systems are affected by this CVE?" Red Hat Lightspeed MCP allows the LLM to access your Red Hat Lightspeed data to instantly provide the diagnostic information and a specific remediation playbook, all through a simple natural language prompt. This immediate, actionable intelligence is why Lightspeed MCP is a game-changer for enhancing incident response and keeping your operations running smoothly.

    This guide outlines how you can quickly get started with the insights-mcp service by setting up credentials, deploying the server, and connecting your client tools.

    Set up credentials

    The insights-mcp server acts as a standardized bridge (MCP) between an AI client (like VS Code, Cursor, or Claude Desktop) and the Red Hat Lightspeed APIs. 

    Installation requires setting up a service account and running the containerized server.

    Service account setup:

Go to the Red Hat Hybrid Cloud Console, click Settings (⚙️ gear icon), and select Service Accounts. Create a service account and retain the Client ID and Client secret for later. The integration instructions below refer to them as INSIGHTS_CLIENT_ID and INSIGHTS_CLIENT_SECRET, respectively.

    Next, you need to assign roles. Service accounts allow you to scope the privileges granted to your LLM, ensuring the AI can only access the Red Hat Lightspeed APIs necessary for its tasks. By carefully selecting roles, you maintain control over data access and overall security. 

    Different toolsets require specific roles for your service account:

    • Advisor tools: RHEL Advisor viewer
    • Inventory tools: Inventory Hosts viewer
    • Vulnerability tools: Vulnerability viewer and Inventory Hosts viewer
    • Remediation tools: Remediations user

By default, service accounts have no access. A user with the User Access administrator role must assign permissions to the service account. For detailed step-by-step instructions, watch this video tutorial: Service Account Permissions Setup.

    Here are the steps to follow:

1. Log in as an organization administrator with the User Access administrator role.
    2. Navigate to User Access Settings: Click Settings (⚙️ Gear Icon) → User Access → Groups.
    3. Assign permissions (choose one option):
      • Option A - Create new group:
        • Create a new group (e.g., mcp-service-accounts).
        • Add required roles (e.g., RHEL Advisor viewer, Inventory Hosts viewer, etc.).
        • Add your service account to this group.
      • Option B - Use existing group:
        • Open the existing group with the necessary role.
        • Now go to the Service accounts tab and add your service account to the group. 

    Your service account will inherit all roles from the assigned group.
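Once the group assignment is in place, you can keep the credentials from the service account step in environment variables so a container runtime can pass them through later. The variable names come from the article; the values below are placeholders you must replace with your own:

```shell
# Store the service account credentials for later use by the MCP server
# container. Replace the placeholder values with the Client ID and
# Client secret retained from the Hybrid Cloud Console.
export INSIGHTS_CLIENT_ID="<client-id-from-console>"
export INSIGHTS_CLIENT_SECRET="<client-secret-from-console>"
```

Exporting them once in your shell session avoids pasting secrets directly into commands or config files.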

    Integrations

    The easiest way to deploy the insights-mcp server is using a container runtime like Podman or Docker. 

    Make sure you have Podman installed. You can install Podman using this command:

    On Fedora/RHEL/CentOS:

    sudo dnf install podman

    On macOS, use either Podman Desktop or:

    brew install podman

Note that if you use Podman on macOS, you sometimes need to set the path to podman explicitly, or replace podman with the full path, as follows:

    • /usr/local/bin/podman
    • /opt/homebrew/bin/podman

    You can find the path by running which podman in your terminal.
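With Podman installed and your credentials exported, a typical invocation of the containerized server looks like the following minimal sketch. The image path below is illustrative, not the real one; substitute the image published in the insights-mcp GitHub README.

```shell
# Hypothetical sketch: run the insights-mcp server over stdio with Podman.
# The image path is a placeholder -- use the one from the project README.
IMAGE="quay.io/example/insights-mcp:latest"

# -i keeps stdin open for the MCP stdio transport; --rm cleans up on exit.
# The -e flags pass the exported credentials through to the container.
podman run -i --rm \
  -e INSIGHTS_CLIENT_ID \
  -e INSIGHTS_CLIENT_SECRET \
  "$IMAGE"
```

On macOS, replace `podman` with the full path (e.g., `/opt/homebrew/bin/podman`) if your client cannot find it.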

    Connect your client

Red Hat Lightspeed MCP can work with a client of your choice, such as VS Code, Claude Desktop, or Gemini CLI. You can find the instructions for connecting Red Hat Lightspeed MCP with your client of choice in the project's README.md on GitHub.
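As a sketch of what a client connection involves, many MCP clients accept a JSON configuration describing how to launch the server over stdio. The file name, location, and exact schema vary by client, and the image path here is again a placeholder; check the project README for the entry your client expects.

```shell
# Hypothetical sketch: write an MCP client configuration that launches the
# containerized server over stdio. The file name, location, and schema
# depend on your client; the image path is a placeholder.
cat > mcp.json <<'EOF'
{
  "mcpServers": {
    "insights-mcp": {
      "command": "podman",
      "args": [
        "run", "-i", "--rm",
        "-e", "INSIGHTS_CLIENT_ID",
        "-e", "INSIGHTS_CLIENT_SECRET",
        "quay.io/example/insights-mcp:latest"
      ]
    }
  }
}
EOF
```

With this approach, the client starts and stops the container itself, and your credentials reach it only through the environment variables you exported earlier.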

    Security remarks

If you start this MCP server locally (with podman or docker), make sure the container is not exposed to the internet. In this scenario, it is generally fine to use INSIGHTS_CLIENT_ID and INSIGHTS_CLIENT_SECRET, but keep in mind that your MCP client (e.g., VS Code, Cursor) will have access to those credentials.

    For a deployment where you connect to this MCP server from a different machine, you should consider that INSIGHTS_CLIENT_ID and INSIGHTS_CLIENT_SECRET are transferred to the MCP server, and you are trusting the remote MCP server not to leak them.

In either case, if you are in doubt, disable or remove the INSIGHTS_CLIENT_ID and INSIGHTS_CLIENT_SECRET from your account once you are done using the MCP server.

    Join us and share your feedback

Now is a great time to test, experiment, and provide feedback as you connect Red Hat Lightspeed MCP with your LLMs. Whether you're exploring automation, enhancing incident processes, or building intelligent dashboards, this preview places powerful Red Hat Lightspeed capabilities at your LLM-driven fingertips.

    This release offers early access to powerful MCP-driven workflows with Red Hat Lightspeed. We strongly encourage your feedback—including bug reports, requests for additional toolsets, and enhancement ideas—through the Red Hat Issue Router and by contributing to our GitHub repository. Your input will directly refine and shape the future of Red Hat Lightspeed MCP.

    Last updated: February 26, 2026
