Run privileged commands more securely in OpenShift Dev Spaces
Learn how to run privileged commands in OpenShift Dev Spaces cloud development environments more securely, using OpenShift sandboxed containers (Kata containers).
Most log lines are noise. Learn how semantic anomaly detection filters out repetitive patterns—even repetitive errors—to surface the genuinely unusual events.
What RHEL 8 and 9 users need to know about Python 3.9 reaching the end-of-life phase upstream.
Run the latest Mistral Large 3 and Ministral 3 models on vLLM with Red Hat AI, providing day 0 access for immediate experimentation and deployment.
Learn how to optimize AI inference costs with AWS Inferentia and Trainium chips on Red Hat OpenShift using the AWS Neuron Operator.
Simplify LLM post-training with the Training Hub library, which provides a common, pythonic interface for running language model post-training algorithms.
Learn why prompt engineering is the most critical and accessible method for customizing large language models.
Discover how LF Energy SEAPATH, an open source reference design for real-time platforms, is being used on RHEL to modernize electrical substation automation.
.NET 10 is now available for RHEL and OpenShift, bringing new features like C# 14 and F# 10 support, performance improvements, and updated APIs.
Learn how to use the new RHEL 10 soft reboot feature in image mode (bootc) to significantly reduce downtime for OS updates.
Explore the benefits of using Kubernetes, Context7, and GitHub MCP servers to diagnose issues, access up-to-date documentation, and interact with repositories.
Your Red Hat Developer membership unlocks access to product trials, learning resources, events, tools, and a community you can trust to help you stay ahead in AI and emerging tech.
Celebrate our mascot Repo's first birthday with us as we look back on the events that shaped Red Hat Developer and the open source community from the past year.
This learning path explores running AI models, specifically large language models.
Simplify container image management with Skopeo. Get practical examples for faster image inspection, single-command pushing, and mirroring multiple registries.
Walk through how to set up KServe autoscaling by leveraging the power of vLLM, KEDA, and the custom metrics autoscaler operator in Open Data Hub.
Enhance your Python AI applications with distributed tracing. Discover how to use Jaeger and OpenTelemetry for insights into Llama Stack interactions.
Discover the vLLM Semantic Router, an open source system for intelligent, cost-aware request routing that ensures every token generated truly adds value.
Red Hat's Developer Subscription for Teams gives organizations easy access to Red Hat Enterprise Linux for their development activities.
This guide demonstrates how to implement and test container signature verification in a disconnected OpenShift 4.19 cluster.
As GPU demand grows, idle time gets expensive. Learn how to efficiently manage AI workloads on OpenShift AI with Kueue and the custom metrics autoscaler.
Llama Stack offers an alternative to the OpenAI Responses API, enabling multi-step agents, RAG, and tool use on your own infrastructure with any model.
See how a custom MCP client for Docling transformed unstructured data into usable content, reducing document prep time by over 80%.
Learn how vLLM outperforms Ollama in high-performance production deployments, delivering significantly higher throughput and lower latency.
Enterprise-grade artificial intelligence and machine learning (AI/ML) for