Automate AI workflows with Red Hat Ansible Certified Content Collection amazon.ai for generative AI

Automation unleashed

December 10, 2025
Alina Buzachis
Related topics:
Artificial intelligence, Automation and management, Developer productivity, DevOps
Related products:
Red Hat Ansible Automation Platform

    In part 1 of this series, we introduced the Red Hat Ansible Certified Content Collection amazon.ai for generative AI and how it brings declarative automation to Amazon Bedrock and DevOps Guru.

    Now it's time to move from theory to practice. In this post, we'll explore hands-on use cases that demonstrate how to automate AI workflows, from deploying Bedrock Agents to orchestrating DevOps Guru monitoring.

    If you've ever felt the pain of manually managing AI agents, configuring multiple endpoints, or pulling operational insights for audits, this post is for you. By the end, you’ll see how you can treat AI infrastructure as code, which provides repeatable, auditable, and reliable automation.

    Why automation matters in practice

    Manual AI management isn't just slow; it's error-prone:

    • Inconsistent deployments: Recreating an agent or model in a new environment might produce subtle differences, which leads to unexpected failures.
    • Error-prone configuration: Complex action groups, API schemas, and IAM roles are easy to misconfigure manually.
    • Operational blind spots: Without automated monitoring, anomalies can go undetected, and audits become difficult or incomplete.
    • Limited scalability: Repeating manual tasks across multiple agents or services quickly becomes unsustainable.

    The Red Hat Ansible Certified Content Collection amazon.ai for generative AI helps solve these challenges by providing declarative modules for Bedrock and DevOps Guru. These modules allow teams to:

    • Deploy and validate AI agents automatically.
    • Invoke foundation models programmatically.
    • Configure and audit operational monitoring at scale.
    • Generate compliance-ready reports.

    In other words, you can now treat AI and its operational ecosystem as first-class code artifacts.

    Use cases: amazon.ai in action

    The playbooks below provide robust, comprehensive automation examples built around the new Red Hat Ansible Certified Content Collection amazon.ai.
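    To run these playbooks, the collections they use must be installed locally. A minimal requirements.yml sketch (assuming the collection names shown in this post) might look like:

    ```yaml
    # requirements.yml -- collections used by the playbooks in this post.
    # amazon.aws is needed for the optional S3 upload tasks.
    collections:
      - name: amazon.ai
      - name: amazon.aws
    ```

    Install them with ansible-galaxy collection install -r requirements.yml.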

    Use case 1: End-to-end agent deployment, validation, and auditing

    Suppose you want to deploy an AI-powered IT support assistant built on Amazon Bedrock. This agent helps employees resolve help desk issues, such as password resets and service status checks, by calling backend AWS Lambda functions.

    Using the Red Hat Ansible Certified Content Collection amazon.ai, you can automate the full agent lifecycle. The playbook below performs the following steps:

    • Deploy or update the Bedrock agent with the proper foundation model and IAM role.
    • Configure the action group linked to operational Lambda functions.
    • Create an alias for integration with a chat interface.
    • Validate the agent's functionality through a test query.
    • Collect and log all configuration details for auditing and compliance.

    Pro tip: Optionally upload a JSON audit report to Amazon S3 for long-term traceability and governance.

    Outcome: After execution, you will have a fully deployed Bedrock IT Support agent with:

    • An active alias endpoint for user queries.
    • A Lambda-powered action group performing automated tasks.
    • Validation logs showing the agent's responses.
    • A structured audit report, optionally uploaded to S3 for compliance.
    ---
    - name: Full Agent Lifecycle - Deploy, Validate, and Audit
      hosts: localhost
      gather_facts: false
      vars:
        agent_name: "ITSupportAssistant"
        alias_name: "support-alias"
        action_group_name: "SupportTasks"
        foundation_model: "anthropic.claude-v2"
        iam_role_arn: "arn:aws:iam::123456789012:role/BedrockAgentRole"
        lambda_arn: "arn:aws:lambda:us-east-1:123456789012:function:ITSupportLambda"
        upload_audit: true  # Set to true to enable S3 upload
        audit_bucket: "it-support-audit-logs"
        current_date: "{{ lookup('pipe', 'date +%Y-%m-%d') }}"
      tasks:
        - name: Create a Bedrock Agent
          amazon.ai.bedrock_agent:
            state: present
            agent_name: "{{ agent_name }}"
            foundation_model: "{{ foundation_model }}"
            instruction: "You are an internal IT Support Assistant that helps employees with technical issues like password resets or server status checks."
            agent_resource_role_arn: "{{ iam_role_arn }}"
          register: agent_deploy
        
        - name: Configure an Action Group
          amazon.ai.bedrock_agent_action_group:
            state: present
            agent_name: "{{ agent_name }}"
            action_group_name: "{{ action_group_name }}"
            description: "Handles IT support automation tasks (password reset, system checks)."
            lambda_arn: "{{ lambda_arn }}"
            api_schema: "{{ lookup('file', 'files/api_schema.yml') }}"
          register: action_group_deploy
        - name: Create an Alias
          amazon.ai.bedrock_agent_alias:
            state: present
            agent_name: "{{ agent_name }}"
            alias_name: "{{ alias_name }}"
            description: "Endpoint for the IT Support Assistant."
          register: alias_deploy
        
        - name: Validate agent with a test query
          amazon.ai.bedrock_invoke_agent:
            agent_id: "{{ agent_deploy.agent.agent_id }}"
            agent_alias_id: "{{ alias_deploy.agent_alias.agent_alias_id }}"
            input_text: "Can you reset my password for the dev portal?"
            enable_trace: true
          register: validation_test
        
        - name: Retrieve agent configuration details
          amazon.ai.bedrock_agent_info:
            agent_name: "{{ agent_name }}"
          register: agent_info
        - name: Retrieve Action Group configuration
          amazon.ai.bedrock_agent_action_group_info:
            agent_name: "{{ agent_name }}"
            action_group_name: "{{ action_group_name }}"
          register: action_group_info
        - name: List All Aliases for the Agent
          amazon.ai.bedrock_agent_alias_info:
            agent_name: "{{ agent_name }}"
          register: aliases_list
        - name: Render audit report from template
          ansible.builtin.template:
            src: "templates/audit_report.json.j2"
            dest: "/tmp/audit_report.json"
        - name: Optionally upload report to S3 for audit trail
          amazon.aws.s3_object:
            bucket: "{{ audit_bucket }}"
            object: "reports/audit_{{ current_date }}.json"
            mode: put
            src: "/tmp/audit_report.json"
          when: upload_audit | default(false)
       
        # Final console summary
        - name: Final audit and validation summary
          ansible.builtin.debug:
            msg: |
              === IT Support Assistant Deployment Summary ===
              Agent: {{ agent_name }} ({{ agent_deploy.agent.agent_id | default('N/A') }})
              Alias: {{ alias_name }} ({{ alias_deploy.agent_alias.agent_alias_id | default('N/A') }})
              Model: {{ agent_info.agents.0.foundation_model | default('unknown') }}
              Validation: {{ validation_test.response_text | truncate(100) }}
              Total Aliases: {{ aliases_list.agent_aliases | length }}
              Audit Uploaded: {{ upload_audit }}
              =================================================
    

    This workflow ensures the agent is ready for production, fully validated, and auditable. Operations teams can confidently deploy agents at scale with consistent results.
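    The playbook above renders its report from templates/audit_report.json.j2, which is not shown. A minimal, hypothetical sketch of such a template, built from the variables the playbook registers (the field names are illustrative), could be:

    ```jinja
    {# templates/audit_report.json.j2 -- hypothetical audit report template. #}
    {
      "report_date": "{{ current_date }}",
      "agent": "{{ agent_name }} ({{ agent_deploy.agent.agent_id | default('N/A') }})",
      "alias": "{{ alias_name }} ({{ alias_deploy.agent_alias.agent_alias_id | default('N/A') }})",
      "foundation_model": "{{ foundation_model }}",
      "action_group": "{{ action_group_name }}",
      "validation_response": {{ validation_test.response_text | default('') | to_json }}
    }
    ```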

    Use case 2: Personalized content generation

    Suppose you want to dynamically generate personalized email content for customers based on recent behavior, preferences, or purchase history.

    Pro tip: The generated content can optionally be stored in Amazon S3 for auditing or reuse, or sent to a ServiceNow ticket for review.

    Outcome: After execution:

    • A Bedrock model is automatically selected and invoked.
    • A fully generated personalized message is returned and stored in the generated_message variable.
    • This message can be logged, reviewed, sent, or processed downstream for marketing or operational workflows.
    ---
    - name: Personalized Content Generation
      hosts: localhost
      connection: local
      gather_facts: false
      vars:
        prompt_text: "Generate a personalized marketing email for a customer who purchased a smartwatch last week."
      tasks:
        - name: List available text generation models
          amazon.ai.bedrock_foundation_models_info:
            by_output_modality: 'TEXT'
          register: text_models
        - name: Select an on-demand compatible text model
          ansible.builtin.set_fact:
            chosen_text_model: >-
              {{ (text_models.foundation_models
                | selectattr('inference_types_supported', 'defined')
                | selectattr('inference_types_supported', 'contains', 'ON_DEMAND')
                | map(attribute='model_id')
                | first) }}
        - name: Inspect the selected model
          amazon.ai.bedrock_foundation_models_info:
            model_id: "{{ chosen_text_model }}"
          register: model_details
        
        - name: Build payload for content generation
          ansible.builtin.set_fact:
            text_payload:
              messages:
                - role: "user"
                  content: "{{ prompt_text }}"
              max_tokens: 500
              temperature: 0.7
              top_p: 0.9
        
        - name: Generate personalized content
          amazon.ai.bedrock_invoke_model:
            model_id: "{{ chosen_text_model }}"
            body: "{{ text_payload }}"
            content_type: "application/json"
            accept: "application/json"
          register: model_response
        
        - name: Extract generated message
          ansible.builtin.set_fact:
            generated_message: >-
              {{ model_response.response.body.output_text
                 | default(model_response.response.body.completion)
                 | default(model_response.response.body.message)
                 | default('No output returned') }}

    This workflow combines AWS Bedrock AI capabilities with Ansible automation, ensuring agility, compliance, and operational visibility in a single process.
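    As the pro tip above suggests, the generated message can be archived for auditing or reuse. A hedged sketch of follow-up tasks (the bucket name and object path are illustrative) might look like:

    ```yaml
    - name: Write the generated message to a local file
      ansible.builtin.copy:
        content: "{{ generated_message }}"
        dest: "/tmp/generated_email.txt"

    - name: Archive the message to S3 for audit or reuse
      amazon.aws.s3_object:
        bucket: "marketing-content-archive"  # illustrative bucket name
        object: "emails/email_{{ lookup('pipe', 'date +%Y-%m-%d') }}.txt"
        mode: put
        src: "/tmp/generated_email.txt"
    ```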

    Use case 3: Comprehensive DevOps Guru monitoring, diagnostics, and audit reporting

    For example, suppose a compliance mandate requires that all resources associated with the core WebBackend tag be monitored by DevOps Guru. The operations team automates the following steps:

    1. Configure a resource collection for the WebBackend service to ensure all relevant resources are monitored.
    2. Notify a Simple Notification Service (SNS) topic of high-severity alerts for operational visibility.
    3. Retrieve a full diagnostic package (anomalies and recommendations) for recently closed insights for post-mortem reporting.

    Pro tip: Optionally generate a structured audit report that can be:

    1. Uploaded to an S3 bucket for compliance and traceability.
    2. Attached to a ServiceNow or Jira ticket for operational follow-up or review.

    Outcome: After execution:

    • All WebBackend resources are actively monitored.
    • Alerts are routed automatically to the designated SNS topic.
    • A detailed diagnostic and audit report is generated for compliance and post-mortem analysis.
    ---
    - name: DevOps Guru Monitoring, Configuration, and Diagnostics
      hosts: localhost
      connection: local
      gather_facts: false
      vars:
        ops_sns_arn: "arn:aws:sns:us-east-1:123456789012:OpsAlertsTopic"
      tasks:
        - name: Configure Resource Collection to Monitor WebBackend Service
          amazon.ai.devopsguru_resource_collection:
            state: present
            tags:
              - app_boundary_key: "Devops-guru-Service"
                tag_values: ["WebBackend"]
            notification_channel_config:
              sns:
                topic_arn: "{{ ops_sns_arn }}"
              filters:
                severities: ["HIGH"]
                message_types: ["NEW_INSIGHT", "SEVERITY_UPGRADED"]
          register: config_result
        - name: Audit - Check the Configured Resource Collection Details          
          amazon.ai.devopsguru_resource_collection_info:
            resource_collection_type: "AWS_TAGS"
          register: collection_audit
        - name: Diagnostics - List Detailed Info for Insights
          amazon.ai.devopsguru_insight_info:
            status_filter:
              closed:
                type: 'REACTIVE'
                end_time_range:
                  from_time: "2025-10-20"
                  to_time: "2025-10-22"
            include_recommendations:
              locale: EN_US
            include_anomalies:
              filters:
                service_collection:
                  service_names:
                    - EC2
          register: insight_details
        
        - name: Build Audit Report
          ansible.builtin.set_fact:
            audit_report:
              timestamp: "{{ lookup('pipe', 'date +%Y-%m-%dT%H:%M:%S') }}"
              resource_collection_status: "{{ config_result.msg }}"
              monitored_tags: "{{ collection_audit.resource_collection.tags | default('None') }}"
              insight_count: "{{ insight_details.reactive_insights | length }}"
              insights: "{{ insight_details.reactive_insights | default([]) }}"
        - name: Render DevOps Guru audit report from template
          ansible.builtin.template:
            src: "templates/devopsguru_audit_report.json.j2"
            dest: "/tmp/devopsguru_report.json"

    This workflow demonstrates a full AI-to-ops compliance and monitoring lifecycle, combining AWS DevOps Guru, Ansible automation, and optional audit/reporting integration.
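    The playbook references templates/devopsguru_audit_report.json.j2, which is not shown. Because the playbook already assembles everything into the audit_report fact, a minimal hypothetical template could simply render that fact as JSON:

    ```jinja
    {# templates/devopsguru_audit_report.json.j2 -- hypothetical template;
       renders the audit_report fact assembled by the playbook. #}
    {{ audit_report | to_nice_json }}
    ```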

    Final thoughts

    The launch of the Red Hat Ansible Certified Content Collection amazon.ai for generative AI is more than just the addition of new modules; it's an important step in bridging the gap between AI innovation and enterprise operations. Whether you're scaling foundation models, orchestrating intelligent agents, or monitoring complex systems with DevOps Guru, this collection lets you treat AI as code. That means deployments are repeatable, configuration drift is minimized, and auditability is built in from day one.

    Explore the full use case playbooks in the GitHub repository. Migrating your configurations to these automated workflows is the first step toward building a fully automated AI ecosystem.

    Looking to get started with Ansible for Amazon Web Services?

    • Check out the Amazon Web Services Guide
    • Try out the hands-on interactive labs
    • Read the e-book: Using automation to get the most from your public cloud

    Where to go next

    • Visit us at the Red Hat booth at AWS re:Invent 2025
    • Check out Red Hat Summit 2025!
    • For further reading and information, visit other blogs related to Ansible Automation Platform.
    • Check out the YouTube playlist for everything about Ansible Collections.
    • Are you new to Ansible automation and want to learn? Check out our getting started guide on developers.redhat.com.
