APIs

Article

JBoss EAP XP 6 is here

Mike Ward

Explore new features in Red Hat JBoss EAP XP 6, including upgrades to MicroProfile 7, MicroProfile LRA and multi-app support, and observability tools.

Red Hat Advanced Developer Suite
Page

Red Hat Developers

Your Red Hat Developer membership unlocks access to product trials, learning resources, events, tools, and a community you can trust to help you stay ahead in AI and emerging tech.

Video

The Llama Stack Tutorial: Episode Four - Agentic AI with Llama Stack

Cedric Clyburn

AI agents are where things get exciting! In this episode of The Llama Stack Tutorial, we'll dive into agentic AI with Llama Stack, showing you how to give your LLM real-world capabilities like searching the web, pulling in data, and connecting to external APIs. You'll learn how agents are built with models, instructions, tools, and safety shields, and see live demos of using the Agentic API, running local models, and extending functionality with Model Context Protocol (MCP) servers. Join Senior Developer Advocate Cedric Clyburn as we learn all things Llama Stack! Next episode? Guardrails, evals, and more!

Video

The Llama Stack Tutorial: Episode Two - Getting Started with Llama Stack

Cedric Clyburn

Building AI applications is more than just running a model: you need a consistent way to connect inference, agents, storage, and safety features across different environments. That's where Llama Stack comes in. In this second episode of The Llama Stack Tutorial Series, Cedric (Developer Advocate @ Red Hat) walks through how to:

- Run Llama 3.2 (3B) locally and connect it to Llama Stack
- Use the Llama Stack server as the backbone for your AI applications
- Call REST APIs for inference, agents, vector databases, guardrails, and telemetry
- Test out a Python app that talks to Llama Stack for inference

By the end of the series, you'll see how Llama Stack gives developers a modular API layer that makes it easy to start building enterprise-ready generative AI applications, from local testing all the way to production. In the next episode, we'll use Llama Stack to chat with your own data (PDFs, websites, and images) with local models.

🔗 Explore More
- Llama Stack GitHub: https://github.com/meta-llama/llama-stack
- Docs: https://llama-stack.readthedocs.io

Page

Red Hat Integration

Runtimes, frameworks, and services to build applications natively on Red Hat OpenShift.

Video

Cryostat 4.0 Quarkus-native image applications

Andrew Azores

Learn how to use Cryostat 4.0’s Kubernetes API discovery configurations with Quarkus-native image applications, leveraging GraalVM's JFR and JMX observability feature support.

Red Hat Connectivity Link
Product Sub Page

Getting started

Connect, secure, and protect your distributed Kubernetes services with lightweight policy attachments.