AI-generated product review summaries with OpenShift AI
Explore how Red Hat OpenShift AI uses LLM-generated summaries to distill product reviews into a form users can quickly process.
Deploy an enterprise-ready RAG chatbot using OpenShift AI. This quickstart automates provisioning of components like vector databases and ingestion pipelines.
Install the Kiali Model Context Protocol server into OpenShift Lightspeed to transform Kiali into an active data source and streamline AI-driven operations.
Automate Red Hat OpenShift bare metal installs with a conversational AI agent. This MCP-driven solution simplifies manual configuration and validation steps.
Explore the pros and cons of on-premises and cloud-based large language models (LLMs) for code assistance. Learn about specific models available with Red Hat OpenShift AI, supported IDEs, and more.
Explore the architecture and training behind the two-tower model of a product recommender built using Red Hat OpenShift AI.
Discover the self-service agent AI quickstart for automating IT processes on Red Hat OpenShift AI. Deploy, integrate with Slack and ServiceNow, and more.
Learn how to build AI-enabled applications for product recommendations, semantic product search, and automated product review summarization with OpenShift AI.
Deploy an Oracle SQLcl MCP server on an OpenShift cluster and use it with the OpenShift AI platform in this AI quickstart.
Discover the AI Observability Metric Summarizer, an intelligent, conversational tool built for Red Hat OpenShift AI environments.
Explore the latest release of LLM Compressor, featuring attention quantization, MXFP4 support, AutoRound quantization modifier, and more.
Headed to DeveloperWeek? Visit the Red Hat Developer booth on-site to speak to our expert technologists.
Discover how Red Hat Lightspeed MCP integrates AI to help simplify vulnerability management, prioritize security risks, and automate remediation planning.
This article compares the performance of llm-d, Red Hat's distributed LLM inference solution, with a traditional deployment of vLLM using naive load balancing.
Discover the advantages of using Java for AI development in regulated industries. Learn about architectural stability, performance, runtime guarantees, and more.
Whether you're just getting started with artificial intelligence or looking to deepen your knowledge, our hands-on tutorials will help you unlock the potential of AI with Red Hat's enterprise-grade solutions.
Learn how Model Context Protocol (MCP) enhances agentic AI in OpenShift AI, enabling models to call tools, services, and more from an AI application.
Take a look back at Red Hat Developer's most popular articles of 2025, covering AI coding practices, agentic systems, advanced Linux networking, and more.
Get started today by downloading Red Hat® build of Podman Desktop for Linux®, macOS, or Windows.
Use Red Hat Lightspeed to simplify inventory management and convert natural language into inventory API queries for auditing and multi-agent automation.
Discover 2025's leading open models, including Kimi K2 and DeepSeek. Learn how these models are transforming AI applications and how you can start using them.
Learn how to deploy and test the inference capabilities of vLLM on OpenShift using GuideLLM, a specialized performance benchmarking tool.
Learn how to fine-tune a RAG model using Feast and Kubeflow Trainer. This guide covers preprocessing and scaling training on Red Hat OpenShift AI.
This guide covers the essentials of Kubernetes monitoring, including key metrics, top tools, and the role of AI in managing complex systems.
Learn how to implement retrieval-augmented generation (RAG) with Feast on Red Hat OpenShift AI to create highly efficient and intelligent retrieval systems.