How to enable Ansible Lightspeed intelligent assistant

Part 1: Installing OpenShift AI as a self-managed software product on a self-managed OpenShift cluster

September 16, 2025
Riya Sharma, Elijah DeLee
Related topics:
Artificial intelligence, Automation and management, Serverless
Related products:
Red Hat AI, Red Hat OpenShift AI, Red Hat Ansible Automation Platform, Red Hat OpenShift Container Platform, Red Hat OpenShift Serverless, Red Hat OpenShift Service Mesh


    In Red Hat Ansible Automation Platform 2.6, you can deploy an integrated chatbot with embedded knowledge of the Ansible Automation Platform documentation: the Red Hat Ansible Lightspeed intelligent assistant. The chatbot can use any of several inference back ends. In this article series, we will explore deploying the inference service with Red Hat OpenShift AI on the same Red Hat OpenShift Container Platform cluster as Ansible Automation Platform.

    OpenShift AI is an end-to-end platform built on Red Hat OpenShift, designed to streamline the entire lifecycle of AI/ML models, including large language models (LLMs).

    In this first article of the series, we'll begin with a step-by-step guide to installing OpenShift AI as a self-managed software product on a self-managed OpenShift cluster. In subsequent articles, we will create an inference service on OpenShift AI to provide inference to the Ansible Automation Platform Lightspeed chatbot in the same OpenShift cluster.

    This 3-part series will walk you through the following topics:

    • Installing OpenShift AI.
    • Deploying an inference service with OpenShift AI.
    • Configuring Ansible Lightspeed to use your inference service.

    OpenShift AI platform for LLMs

    OpenShift AI offers a suite of tools and capabilities that address the deployment challenges of AI/ML models, including LLMs, head-on:

    • Kubernetes-native scalability: Leveraging the power of Kubernetes, OpenShift AI provides inherent scalability, allowing you to easily provision and de-provision resources based on demand. This is particularly beneficial for managing the fluctuating computational needs of LLMs.
    • Integrated MLOps toolchain: OpenShift AI integrates various MLOps tools, offering capabilities for data preparation, model training, version control, model serving, and monitoring. This cohesive environment simplifies the operational aspects of LLM deployment.
    • GPU acceleration: With strong support for GPU-enabled infrastructure, OpenShift AI ensures that LLMs can run efficiently, leveraging the specialized hardware they require.

    Prerequisites

    Ensure that you have the following prerequisites:

    • An OpenShift cluster with a valid OpenShift license or trial. We don't recommend minimum-size OpenShift Container Platform master nodes: we observed high CPU utilization on 4-CPU master nodes, which led to Kubernetes API availability problems. We recommend 16-CPU master nodes. (A quick way to check your node sizing from the CLI is sketched after this list.)
    • Cluster administrator privileges.
    • A valid OpenShift AI license or trial.
    • A valid Ansible Automation Platform license or subscription.
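
    If you want to confirm your control plane sizing before installing, you can query node capacity from the CLI. This is a minimal sketch using standard oc commands; note that some clusters label control plane nodes with node-role.kubernetes.io/control-plane rather than node-role.kubernetes.io/master:

      # List master nodes with their CPU capacity.
      oc get nodes -l node-role.kubernetes.io/master \
        -o custom-columns=NAME:.metadata.name,CPU:.status.capacity.cpu

      # Check current CPU usage on those nodes (requires cluster monitoring).
      oc adm top nodes -l node-role.kubernetes.io/master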

    Notes and recommendations

    Before we move ahead with the installations, it's important to keep the following in mind:

    • Follow the installation instructions in the given order; installing out of order can cause issues. The OpenShift AI operator looks for the custom resources defined by the operators it depends on and will never reach a ready state without them. Constantly polling for these missing custom resources can also degrade the health of the master nodes.
    • It is very important to install the versions given in the installation instructions; other versions are not compatible. In particular, install Red Hat OpenShift Service Mesh 2, not Red Hat OpenShift Service Mesh 3. (A quick CLI check of installed operator versions is sketched after this list.)
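
    To confirm which operator versions are actually installed on the cluster, you can list the ClusterServiceVersions. This is a minimal sketch; the grep pattern is only a convenience and assumes the default operator package names:

      # ClusterServiceVersions record installed operator names and versions.
      # The Service Mesh operator should report a 2.x version, not 3.x.
      oc get csv -A | grep -Ei 'servicemesh|serverless|rhods'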

    Install OpenShift AI

    Follow these simplified steps to install OpenShift AI:

    1. Install OpenShift Service Mesh by going to Operators → OperatorHub. Search for OpenShift Service Mesh 2 (Figure 1).

      Figure 1: Installing this operator provisions the necessary custom resources (CRs).
    2. Install Red Hat OpenShift Serverless by going to Operators → OperatorHub. Search for OpenShift Serverless (Figure 2).

      Figure 2: Installing the OpenShift Serverless operator provisions the necessary custom resources.
    3. Install OpenShift AI by going to Operators → OperatorHub. Search for OpenShift AI (Figure 3).

      Figure 3: Installing OpenShift AI operators.
    4. Validate the default-dsci object YAML file in the DSC Initialization tab of the OpenShift AI operator.
      1. From the OpenShift homepage, go to Operators → Installed Operators. Click the OpenShift AI operator.
      2. In OpenShift AI, click the DSC Initialization tab.
      3. Click default-dsci → YAML tab.
      4. In the spec section, validate that the value of the managementState field for the serviceMesh component is set to Managed, as follows:

        spec:
          applicationsNamespace: redhat-ods-applications
          monitoring:
            managementState: Managed
            namespace: redhat-ods-monitoring
          serviceMesh:
            controlPlane:
              metricsCollection: Istio
              name: data-science-smcp
              namespace: istio-system
            managementState: Managed
    5. Configure the default-dsc object YAML file in the Data Science Cluster OpenShift AI operator tab.
      1. From the OpenShift homepage, go to Operators → Installed Operators and select the OpenShift AI Operator.
      2. In OpenShift AI, select the Data Science Cluster tab.
      3. Select the default-dsc → YAML tab.
      4. In the spec.components section, configure the kserve component as follows:

        spec:
          components:
            kserve:
              managementState: Managed
              defaultDeploymentMode: Serverless
              rawDeploymentServiceConfig: Headed
              serving:
                ingressGateway:
                  certificate:
                    secretName: knative-serving-cert
                    type: OpenshiftDefaultIngress
                managementState: Managed
    6. Verify the KServe service (CLI equivalents for verification steps 6 through 9 are sketched after this procedure):

      1. Go to Workloads → Pods.
      2. From the project list, select redhat-ods-applications. This is the project in which OpenShift AI components are installed, including KServe.
      3. Confirm that you see a running pod for the KServe controller manager, similar to the following:
      kserve-controller-manager-7fbb7bccd4-t4c5g    1/1       Running  
      odh-model-controller-6c4759cc9b-cftmk         1/1       Running    
      odh-model-controller-6c4759cc9b-ngj8b         1/1       Running   
      odh-model-controller-6c4759cc9b-vnhq5         1/1       Running 
    7. Verify OpenShift Service Mesh:

      1. Go to Workloads → Pods.
      2. From the project list, select istio-system. This is the project in which OpenShift Service Mesh is installed.

      Confirm that there are running pods for the service mesh control plane, ingress gateway, and egress gateway, similar to the following:

      istio-egressgateway-7c46668687-fzsqj      	 	  1/1       Running       
      istio-ingressgateway-77f94d8f85-fhsp9      		  1/1       Running         
      istiod-data-science-smcp-cc8cfd9b8-2rkg4  		  1/1       Running
    8. Verify OpenShift Serverless:
      1. Go to Workloads → Pods.
      2. From the project list, select knative-serving. This is the project in which OpenShift Serverless is installed.
      3. Confirm that the knative-serving project contains numerous running pods. These include activator, autoscaler, controller, and domain mapping pods, as well as pods for the Knative Istio controller. The Knative Istio controller manages the integration of OpenShift Serverless and OpenShift Service Mesh. You should see pods similar to the following:

        activator-7586f6f744-nvdlb               	2/2       Running
        activator-7586f6f744-sd77w               	2/2       Running
        autoscaler-764fdf5d45-p2v98             	2/2       Running
        autoscaler-764fdf5d45-x7dc6              	2/2       Running
        autoscaler-hpa-7c7c4cd96d-2lkzg          	1/1       Running
        autoscaler-hpa-7c7c4cd96d-gks9j         	1/1       Running
        controller-5fdfc9567c-6cj9d              	1/1       Running
        controller-5fdfc9567c-bf5x7              	1/1       Running
        domain-mapping-56ccd85968-2hjvp          	1/1       Running
        domain-mapping-56ccd85968-lg6mw          	1/1       Running
        domainmapping-webhook-769b88695c-gp2hk   	1/1       Running
        domainmapping-webhook-769b88695c-npn8g   	1/1       Running
        net-istio-controller-7dfc6f668c-jb4xk    	1/1       Running
        net-istio-controller-7dfc6f668c-jxs5p    	1/1       Running
        net-istio-webhook-66d8f75d6f-bgd5r       	1/1       Running
        net-istio-webhook-66d8f75d6f-hld75      	1/1       Running
        webhook-7d49878bc4-8xjbr                 	1/1       Running
        webhook-7d49878bc4-s4xx4                 	1/1       Running
    9. Confirm the Ready state of the clusters:
      1. From the OpenShift homepage, go to Operators → Installed Operators. Select the OpenShift AI Operator.
      2. Check that the default-dsci object under DSCInitializations has a Ready state.
      3. Check that the default-dsc object under DataScienceClusters has a Ready state.
    10. Access the OpenShift AI web console. On the right side of the OpenShift Container Platform console's top navigation bar, locate the square icon formed by nine smaller squares. Click it and select Red Hat OpenShift AI from the drop-down menu, as shown in Figure 4.

      Figure 4: Red Hat OpenShift AI drop-down menu.
    11. Log in to Red Hat OpenShift AI in the new tab using your OpenShift credentials. Welcome to OpenShift AI (Figure 5).

      Figure 5: Log into the OpenShift AI dashboard.
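
    If you prefer to run the verification steps from the command line, the checks in steps 6 through 9 map to a few oc commands. This is a minimal sketch, assuming the default namespaces used in this procedure; the dsci and dsc short names are assumptions based on the short names registered by the OpenShift AI operator's CRDs:

      # Step 6: KServe controller and model controller pods.
      oc get pods -n redhat-ods-applications | grep -E 'kserve|odh-model'

      # Step 7: Service mesh control plane, ingress, and egress gateway pods.
      oc get pods -n istio-system

      # Step 8: Knative Serving pods (activator, autoscaler, controller, ...).
      oc get pods -n knative-serving

      # Step 9: Both objects should report a Ready state.
      oc get dsci,dsc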

    Next steps

    In this article, we demonstrated the steps to install OpenShift Service Mesh, OpenShift Serverless, and OpenShift AI, and verified their successful deployment. In the next article, we will cover how to deploy our own inference service with OpenShift AI to power the inference behind the Ansible Lightspeed intelligent assistant.

    Explore additional resources:

    • How to run vLLM on CPUs with OpenShift for GPU-free inference
    • Red Hat courses
    • Introduction to Red Hat OpenShift AI
    Last updated: September 17, 2025

