Synthetic data for RAG evaluation: Why your RAG system needs better testing
Build better RAG systems with SDG Hub. Generate high-quality question-answer-context triplets to benchmark retrievers and track LLM performance over time.
Explore big versus small prompting in AI agents. Learn how Red Hat's AI quickstart balances model capability, token costs, and task focus using LangGraph.
Learn how ATen serves as PyTorch's C++ engine, handling tensor operations across CPU, GPU, and accelerators via a high-performance dispatch system and kernels.
Learn how integrating the Red Hat Lightspeed Model Context Protocol (MCP) server with the Red Hat Lightspeed advisor improves infrastructure health management.
Learn how vibe coding and spec-driven development are shaping the future of software development. Discover the benefits and challenges of each approach, and how to combine them for sustainable software development.
Learn how to design agentic workflows, and how the Red Hat AI portfolio supports production-ready agentic systems across the hybrid cloud.
Automate Ansible error resolution with AI. Learn how to ingest logs, group templates, and generate step-by-step solutions using RAG and agentic workflows.
Learn how to integrate Model Context Protocol (MCP) servers for Red Hat Enterprise Linux and Red Hat Lightspeed into your IDE for data-driven troubleshooting and proactive analytics. Improve your development workflow with actionable intelligence from natural language queries.
One conversation in Slack and email, real tickets in ServiceNow. Learn how the multichannel IT self-service agent ties them together with CloudEvents + Knative.
Learn how to deploy Voxtral Mini 4B Realtime, a streaming automatic speech recognition model for low-latency voice workloads, using Red Hat AI Inference Server.
Headed to DevNexus? Visit the Red Hat Developer booth on-site to speak to our expert technologists.
Learn how to integrate OpenShift Lightspeed into an IDE using the MCP server to generate configurations and query cluster resources without leaving your IDE.
See how to use Apache Camel to turn LLMs into reliable text-processing engines for generative parsing, semantic routing, and "air-gapped" database querying.
Learn about NVFP4, a 4-bit floating-point format for high-performance inference on modern GPUs that can deliver near-baseline accuracy at large scale.
Red Hat OpenShift 4.21 introduces AI-driven insights, automated security signing, and local development tools to help you build and deploy faster.
Explore how Red Hat OpenShift AI uses LLM-generated summaries to distill product reviews into a form users can quickly process.
Deploy an enterprise-ready RAG chatbot using OpenShift AI. This quickstart automates provisioning of components like vector databases and ingestion pipelines.
Install the Kiali Model Context Protocol server into OpenShift Lightspeed to transform Kiali into an active data source and streamline AI-driven operations.
Automate Red Hat OpenShift bare metal installs with a conversational AI agent. This MCP-driven solution simplifies manual configuration and validation steps.
Explore the pros and cons of on-premises and cloud-based large language models (LLMs) for code assistance. Learn about specific models available with Red Hat OpenShift AI, supported IDEs, and more.
Explore the architecture and training behind the two-tower model of a product recommender built using Red Hat OpenShift AI.
Discover the self-service agent AI quickstart for automating IT processes on Red Hat OpenShift AI. Deploy, integrate with Slack and ServiceNow, and more.
Learn how to build AI-enabled applications for product recommendations, semantic product search, and automated product review summarization with OpenShift AI.
Deploy an Oracle SQLcl MCP server on an OpenShift cluster and use it with the OpenShift AI platform in this AI quickstart.
Discover the AI Observability Metric Summarizer, an intelligent, conversational tool built for Red Hat OpenShift AI environments.