
Integrate a custom AI service with Red Hat Ansible Lightspeed

December 10, 2025
Riya Sharma and Elijah DeLee
Related topics:
Artificial intelligence, Automation and management, Operators
Related products:
Red Hat Ansible Automation Platform, Red Hat Ansible Lightspeed with IBM watsonx Code Assistant

    Red Hat Ansible Lightspeed is the generative AI service for Red Hat Ansible Automation Platform that helps your automation team build content more efficiently.

    The Ansible Lightspeed intelligent assistant lets you bring your own AI service to power the inference engine that helps generate answers. These answers use enhanced context from retrieval-augmented generation (RAG) requests. This blog post shows you how to integrate a custom AI service to drive the inference process and get the most out of Ansible Lightspeed.

    Prerequisites

    Before you begin, you must install Red Hat OpenShift AI and deploy the inference service.

    Also ensure you have a valid Ansible Automation Platform license or subscription.

    Create a secret chatbot configuration

    1. From the OpenShift homepage, go to Workloads → Secrets.
    2. Select Create → Key/value secret and add three key/value pairs:
      1. chatbot_model: The name of the LLM model configured on your inference server.
      2. chatbot_token: The API token or key (see Figure 1). This value is sent in the Authorization header whenever the inference API is called.
      3. chatbot_url: The base URL of the inference API on your inference server (for example, https://your_inference_api:8080/v1).

    Note

    Be sure to include the correct port number and /v1 at the end of your URL. To find the port number, go to Networking → Services and find the service with the same name as your chatbot_model. Look for the Service port mapping tab and add the TCP port to the end of your URL.
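    The same secret can be created from the command line. This is a sketch with placeholder values (substitute your own secret name, namespace, model name, token, and URL); the optional curl call assumes an OpenAI-compatible inference server that lists its models at the /v1/models path:

    ```shell
    # Create the chatbot configuration secret (placeholder values -- replace with your own).
    oc create secret generic chatbot-config \
      --namespace <your-namespace> \
      --from-literal=chatbot_model=<your-model-name> \
      --from-literal=chatbot_token=<your-api-token> \
      --from-literal=chatbot_url=https://your_inference_api:8080/v1

    # Optional sanity check: an OpenAI-compatible server should answer at
    # <chatbot_url>/models when given the token in the Authorization header.
    curl -H "Authorization: Bearer <your-api-token>" \
      https://your_inference_api:8080/v1/models
    ```

    If the curl call fails, recheck the port number and the /v1 suffix as described in the note above.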

    Install and set up the Ansible Automation Platform Operator

    Follow these steps to install and configure the Ansible Automation Platform Operator:

    1. From the OpenShift homepage, go to Operators → OperatorHub and search for Ansible Automation Platform (Figure 2).

      OperatorHub tile for Ansible Automation Platform provided by Red Hat.
      Figure 2: The Ansible Automation Platform Operator.
    2. Click Install (Figure 3).

      Ansible Automation Platform operator details page showing the Install button.
      Figure 3: Install the Ansible Automation Platform Operator.
    3. Confirm that the Operator is installed. The status should be Succeeded (Figure 4).

      Ansible Automation Platform listed in the Installed Operators table with a Status of Succeeded.
      Figure 4: Installed Operator.
    4. Go to the installed Ansible Automation Platform Operator.
    5. Select the Ansible Automation Platform tab.
    6. Click Create AnsibleAutomationPlatform.
    7. Select the YAML view option. Add the following YAML content, replacing the placeholders with your application name, namespace, and the name of the chatbot configuration secret you created earlier, then click Create.

      apiVersion: aap.ansible.com/v1alpha1
      kind: AnsibleAutomationPlatform
      metadata:
        name: <Insert your name here>
        namespace: <Insert your namespace here>
      spec:
        controller:
          disabled: false
        eda:
          disabled: true
        hub:
          disabled: true
        lightspeed:
          chatbot_config_secret_name: <Insert your chatbot configuration secret name here>
          disabled: false
        no_log: false
        redis_mode: standalone
        route_tls_termination_mechanism: Edge
    8. Scroll to the Ansible Automation Platform tab and then select the AnsibleAutomationPlatform instance you just created. Wait for the status to show Running, Successful (Figure 5).
    AnsibleAutomationPlatforms list showing the myaap instance with status Conditions: Running, Successful.
    Figure 5: The Ansible Automation Platform Operator set up successfully.
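    You can also watch the rollout from the command line. This is a sketch assuming the instance is named myaap in your namespace (adjust both to match your deployment):

    ```shell
    # Watch the custom resource until its conditions report Running/Successful.
    oc get ansibleautomationplatform myaap -n <your-namespace> -w

    # Inspect the pods the Operator creates for the deployment.
    oc get pods -n <your-namespace>
    ```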

    Access the Ansible chatbot

    Now that you have configured the chatbot for the Ansible Automation Platform instance, you can access the Ansible Automation Platform dashboard and start using it.

    1. From the OpenShift homepage, go to Workloads → Secrets and click myaap-admin-password under your namespace.
    2. In the Data section, select Reveal values to get the admin password. Save this password.
    3. From the OpenShift homepage, go to Networking → Routes.
    4. Click the location for myaap (https://myaap-chatbot-test.apps-crc.testing/), as shown in Figure 6.

      Routes table showing the myaap route with status Accepted and the Location URL.
      Figure 6: Getting the chatbot URL.
    5. Log in using admin as the username and the password you saved earlier (Figure 7).

      Ansible Automation Platform login screen with admin entered in the Username field.
      Figure 7: Ansible Automation Platform login page.
    6. The Ansible Automation Platform dashboard opens. Click the chat icon in the top-right corner; the chatbot appears on the right side (Figure 8).

      Ansible Lightspeed Intelligent Assistant panel displaying a conversation history and a text input field.
      Figure 8: Initiating the Ansible Lightspeed chatbot.
    7. Enter queries about Ansible to resolve your issues.
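    Steps 1 through 4 above can also be done from the command line. This is a sketch assuming the instance is named myaap and that the admin secret stores its value under the password key (both are assumptions; adjust to your deployment):

    ```shell
    # Retrieve and decode the auto-generated admin password.
    oc get secret myaap-admin-password -n <your-namespace> \
      -o jsonpath='{.data.password}' | base64 -d; echo

    # Print the hostname of the Ansible Automation Platform dashboard route.
    oc get route myaap -n <your-namespace> -o jsonpath='{.spec.host}'; echo
    ```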

    Summary

    In this final article of the series, we deployed the Ansible Lightspeed chatbot with a custom inference service.

    Review the previous blogs in this series:

    • How to enable Ansible Lightspeed intelligent assistant
    • Deploy an LLM inference service on OpenShift AI

    Further resources:

    • Blog: How to run vLLM on CPUs with OpenShift for GPU-free inference
    • Course: Developing and Deploying AI/ML Applications on Red Hat OpenShift AI
    • Arcade demo: Configuring Ansible Lightspeed intelligent assistant with Red Hat AI Inference Server on RHEL
    • Product page: Ansible Lightspeed
