As we kick off the new year, we're taking a moment to look back at the content that resonated most with our community of developers, architects, and IT practitioners. In 2024, the rapid rise of generative AI dominated the conversation. In 2025, we saw that momentum shift toward practical, high-performance implementation.
The year’s top articles reflect a community focused on moving beyond the basics. We saw a surge of interest in building agentic systems, benchmarking LLM performance with tools like vLLM and Ollama, and optimizing development environments through the Windows Subsystem for Linux (WSL). Beyond AI, foundational technologies remained a priority, with deep dives into the latest GCC 15 features and advanced Linux networking.
Whether you are here to sharpen your AI coding practices or orchestrate complex multicluster environments, these are the ten stories that shaped the Red Hat Developer experience in 2025.
#10: An overview of virtual routing and forwarding (VRF) in Linux
Antoine Tenart, a specialist in Linux kernel networking, provides a comprehensive look at VRF, a lightweight solution for isolating Layer 3 traffic. This guide explains how to create independent routing and forwarding domains to support multi-tenancy and overlapping networks without the overhead of full network namespaces.
Read it: An overview of virtual routing and forwarding (VRF) in Linux
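To make the setup concrete, here is a minimal sketch (not taken from the article) that drives the standard ip(8) commands from Python to create a VRF and attach an interface to it. The VRF name, routing table number, and interface are placeholder assumptions; adjust them for your environment and run with root privileges.

```python
import subprocess

def run(cmd: str) -> None:
    """Run an ip(8) command and raise if it fails."""
    subprocess.run(cmd.split(), check=True)

# Create a VRF device bound to its own routing table (table 10 is arbitrary).
run("ip link add vrf-blue type vrf table 10")
run("ip link set dev vrf-blue up")

# Enslave an interface to the VRF; eth1 is a placeholder for a real NIC.
run("ip link set dev eth1 master vrf-blue")

# Routes for enslaved interfaces now live in table 10, isolated from the
# default routing table.
run("ip route show vrf vrf-blue")
```

Because lookups for traffic on the enslaved interface stay in the VRF's table, overlapping address ranges in other domains never collide with it.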
#9: How we optimized vLLM for DeepSeek-R1
By Michael Goin, Robert Shaw, Nick Hill, Tyler Smith, and Lucas Wilkinson
The performance engineering team from Neural Magic (now part of Red Hat) details their work scaling the massive DeepSeek-R1 model. This technical deep dive explores kernel-level optimizations like Multi-Head Latent Attention (MLA) and Multi-Token Prediction (MTP) that allow this 671B-parameter model to run efficiently in production environments.
Read it: How we optimized vLLM for DeepSeek-R1
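The MLA and MTP improvements described in the article live inside vLLM's kernels, so application code does not change; the hedged sketch below simply shows the standard vLLM offline-inference pattern through which they are exercised. The model ID, tensor-parallel degree, and sampling settings are illustrative assumptions, and a model of this size needs a multi-GPU server.

```python
# Minimal vLLM offline-inference sketch; the engine applies model-specific
# optimizations (such as MLA for DeepSeek-style attention) internally.
from vllm import LLM, SamplingParams

# Illustrative settings: the model ID and tensor_parallel_size are assumptions
# and must match your hardware (a 671B model requires many GPUs).
llm = LLM(
    model="deepseek-ai/DeepSeek-R1",
    tensor_parallel_size=8,
    trust_remote_code=True,
)

params = SamplingParams(temperature=0.6, max_tokens=256)

outputs = llm.generate(
    ["Explain multi-token prediction in one paragraph."], params
)
for out in outputs:
    print(out.outputs[0].text)
```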
#8: How to build a simple agentic AI server with MCP
By Saroj Paudel
AI engineer Saroj Paudel delivers a hands-on tutorial for connecting AI agents to real-world data using the Model Context Protocol (MCP). Using a weather-fetching tool as a practical example, this article demonstrates how developers can build secure, standardized adapters that allow LLMs to interact with external APIs and databases.
Read it: How to build a simple agentic AI server with MCP
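For a flavor of what the tutorial builds, here is a minimal sketch of an MCP server exposing a single weather tool. It assumes the official MCP Python SDK (the mcp package) and stubs out the weather lookup instead of calling a real API; the tool name and data are placeholders.

```python
# Minimal MCP server sketch using the Python SDK's FastMCP helper.
# The weather lookup is stubbed; a real server would call an external API here.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

@mcp.tool()
def get_forecast(city: str) -> str:
    """Return a short forecast for the given city (stubbed data)."""
    canned = {"Raleigh": "Sunny, 24°C", "Boston": "Rain, 12°C"}
    return canned.get(city, f"No forecast available for {city}")

if __name__ == "__main__":
    # The stdio transport lets an MCP-aware client (an agent or IDE)
    # launch this server as a subprocess and call the tool.
    mcp.run(transport="stdio")
```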
#7: A quick look at MCP with large language models and Node.js
Michael Dawson, Red Hat’s Node.js lead and IBM’s community lead for the project, explores the interoperability of MCP across different frameworks. He shows how tools written once in TypeScript can be seamlessly consumed by both the Bee agent framework and Ollama, demonstrating how MCP can eliminate the need for custom wrappers.
Read it: A quick look at MCP with large language models and Node.js
#6: How to automate multi-cluster deployments using Argo CD
By Radu Domnu and Ilias Raftoulis
GitOps architects Radu Domnu and Ilias Raftoulis present a strategic approach to managing application life cycles across multiple OpenShift clusters. They break down the pros and cons of standalone versus hub-and-spoke architectures, providing a roadmap for platform teams to automate complex multi-tenant environments using Argo CD.
Read it: How to automate multi-cluster deployments using Argo CD
#5: How spec-driven development improves AI coding quality
Rich Naszcyniec introduces spec coding, a structured alternative to vibe coding. By defining functional and language-specific specifications first, engineers can guide AI coding assistants to produce code with over 95% accuracy, ensuring that AI-generated output is maintainable and adheres to corporate standards.
Read it: How spec-driven development improves AI coding quality
#4: New C++ features in GCC 15
GCC C++ front-end maintainer Marek Polacek previews the release of GCC 15.1. This post details new C++26 features, including pack indexing, variadic friends, and the ability to provide specific reasons for deleted functions, helping developers prepare for the next generation of the C++ language.
Read it: New C++ features in GCC 15
#3: Ollama vs. vLLM: A deep dive into performance benchmarking
AI performance engineer Harshith Umesh brings raw data to the debate between two popular inference engines. By benchmarking throughput and latency on NVIDIA A100 hardware, he demonstrates why Ollama is ideal for local prototyping while vLLM remains the clear choice for high-concurrency enterprise production.
Read it: Ollama vs. vLLM: A deep dive into performance benchmarking
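If you want a rough comparison of your own, the sketch below times concurrent requests against an OpenAI-compatible chat endpoint, which both vLLM and Ollama can expose. The base URL, model name, and request counts are assumptions to adjust for your setup, and this is far cruder than the article's benchmarking methodology.

```python
# Crude latency/throughput probe for an OpenAI-compatible chat endpoint.
# Both vLLM and Ollama can serve this API; adjust BASE_URL and MODEL to match.
import time
from concurrent.futures import ThreadPoolExecutor

import requests

BASE_URL = "http://localhost:8000/v1"  # vLLM default; Ollama is typically :11434/v1
MODEL = "llama3.1:8b"                  # placeholder model name
CONCURRENCY = 8
PROMPT = "Summarize what an inference engine does in two sentences."

def one_request(_: int) -> float:
    """Send one chat completion and return its wall-clock latency in seconds."""
    start = time.perf_counter()
    resp = requests.post(
        f"{BASE_URL}/chat/completions",
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": PROMPT}],
            "max_tokens": 128,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return time.perf_counter() - start

if __name__ == "__main__":
    wall_start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        latencies = list(pool.map(one_request, range(CONCURRENCY * 4)))
    wall = time.perf_counter() - wall_start

    print(f"requests: {len(latencies)}, concurrency: {CONCURRENCY}")
    print(f"mean latency: {sum(latencies) / len(latencies):.2f}s")
    print(f"throughput:   {len(latencies) / wall:.2f} req/s")
```

Pointing the same script at each engine in turn gives a first-order feel for the single-user versus high-concurrency trade-off the article measures in depth.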
#2: 6 usability improvements in GCC 15
David Malcolm, a primary contributor to GCC’s diagnostic systems, explains his work making compiler errors easier to read. Highlights include ASCII-art execution paths for static analysis, machine-readable SARIF output, and a cleaner presentation of notoriously complex C++ template errors.
Read it: 6 usability improvements in GCC 15
#1: Getting started with RHEL on WSL
By Eliane Pereira, Sanne Raymaekers, and Terry Bowling
Red Hat Enterprise Linux experts Eliane Pereira, Sanne Raymaekers, and Terry Bowling explain how to bring the world’s leading enterprise Linux platform directly to the Windows desktop. This top-ranked guide covers creating custom RHEL images via the Lightspeed image builder and setting up a seamless workflow between Windows and Linux environments.
Read it: Getting started with RHEL on WSL
Expand your technical toolkit
We also released new long-form e-book and cheat sheet downloads in 2025. While our articles provide timely insights, these resources offer the in-depth guidance and tactical references you need to master a new stack or navigate complex migrations.
Read on for the top additions from last year.
New e-books
Our e-books, written by Red Hat subject matter experts, can help you bridge the gap between getting started and being production ready:
- Applied AI for Enterprise Java Development: This practical new guide helps Java developers bring generative AI into familiar enterprise frameworks, focusing on production-ready patterns, vector databases, and the LangChain4j ecosystem.
- Open source AI for developers: This book provides a blueprint for building AI-infused applications using open source models and transparent workflows.
- Red Hat Certified Engineer (RHCE) Ansible Automation Study Guide: Whether you're preparing for certification or just hardening your automation skills, this guide is the definitive resource for modern Ansible practices.
- OpenShift for .NET Developers: Written for developers bringing C# and .NET workloads to Linux containers, this book covers everything from architectural shifts to CI/CD integration.
New cheat sheets
When you're in the middle of a deployment, you need the right command at your fingertips. Our cheat sheets are designed to be your quick-reference companion for the most critical tasks:
- Get up to speed on the latest flagship release with the Red Hat Enterprise Linux 10 cheat sheet.
- Learn how to pair AI with Node.js and how to run your development environment efficiently with RHEL in WSL.
- For those navigating the shifting virtualization landscape, we’ve released OpenShift Virtualization for VMware administrators.
- Enable air-gapped deployments with the OpenShift disconnected installation cheat sheet and automate your system health checks with the Red Hat Lightspeed API cheat sheet.
Looking ahead to 2026
From the low-level compiler optimizations in GCC 15 to the high-level orchestration of agentic AI, the common thread across all these articles is bridging the gap between community innovation and enterprise stability.
At Red Hat Developer, our goal remains the same: to provide you with the deep, engineering-led insights you need to create better software. We are grateful to the engineers and maintainers who took the time to share their expertise this past year, and to you, the builders and makers who continue to push these technologies to their limits. We can't wait to show you what we're building in 2026.