Artificial intelligence

Product Page

Red Hat OpenShift AI

A cloud service that gives data scientists and developers a powerful AI/ML platform for building intelligent applications.

Article

Your first GPU algorithm: Scan/prefix sum

Kenny Ge

An in-depth look at a foundational GPU programming algorithm: the prefix sum. The goal is to expose the reader to the tools and language of GPU programming, rather than to present it only as a way to optimize certain existing subroutines.
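
As a point of reference, a minimal sequential sketch of what an inclusive prefix sum (scan) computes; the input values are illustrative only, and the article's focus is how to parallelize this operation on a GPU.

    # Inclusive prefix sum, sequential reference version:
    # out[i] = values[0] + values[1] + ... + values[i]
    import numpy as np

    def inclusive_scan(values):
        out, total = [], 0
        for v in values:
            total += v
            out.append(total)
        return out

    data = [3, 1, 7, 0, 4, 1, 6, 3]
    print(inclusive_scan(data))        # [3, 4, 11, 11, 15, 16, 22, 25]
    print(np.cumsum(data).tolist())    # same result via NumPy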

Article

What is GPU programming?

Kenny Ge

The first of a four-part series on introductory GPU programming, this article provides a basic overview of the GPU programming model.
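
As a rough illustration of the model the article introduces (a kernel launched over a grid of threads), a short Numba CUDA sketch not taken from the article itself; it assumes the numba package and a CUDA-capable GPU.

    # Illustrative only: one thread per array element, launched over a grid of blocks.
    import numpy as np
    from numba import cuda

    @cuda.jit
    def add_one(arr):
        i = cuda.grid(1)       # global thread index
        if i < arr.size:       # guard against extra threads in the last block
            arr[i] += 1.0

    data = np.zeros(1024, dtype=np.float32)
    threads_per_block = 256
    blocks = (data.size + threads_per_block - 1) // threads_per_block
    add_one[blocks, threads_per_block](data)   # Numba copies the host array to and from the device
    print(data[:4])                            # [1. 1. 1. 1.]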

Product Sub Page

Download Red Hat Enterprise Linux AI

Develop, deploy, and run large language models (LLMs) in individual server environments. The solution includes Red Hat AI Inference Server, delivering fast, cost-effective hybrid cloud inference by maximizing throughput, minimizing latency, and reducing compute costs.
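
For orientation only, a hypothetical client call against a model served this way, assuming the server exposes an OpenAI-compatible completions endpoint (as vLLM-based servers typically do); the URL, port, and model name are placeholders.

    # Hypothetical query to a locally running, OpenAI-compatible inference endpoint.
    import requests

    response = requests.post(
        "http://localhost:8000/v1/completions",
        json={
            "model": "example-llm",   # placeholder model name
            "prompt": "Summarize what an inference server does.",
            "max_tokens": 64,
        },
        timeout=30,
    )
    print(response.json()["choices"][0]["text"])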

Article

Protecting your models made easy with Authorino

JooHo Lee

This article demonstrates how to register the scikit-learn runtime as a custom ServingRuntime, deploy the iris model on KServe with Open Data Hub, and apply authentication with Authorino to protect the model endpoints.
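
To show the shape of the result, a hypothetical request against the protected endpoint using the KServe v1 predict protocol; the host, model name, and token are placeholders and depend on how Authorino is configured.

    # Hypothetical call to an Authorino-protected KServe iris endpoint.
    import requests

    TOKEN = "<token-accepted-by-authorino>"    # placeholder credential
    URL = "https://sklearn-iris.apps.example.com/v1/models/sklearn-iris:predict"

    payload = {"instances": [[5.1, 3.5, 1.4, 0.2]]}   # one iris sample
    resp = requests.post(
        URL,
        json=payload,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    print(resp.status_code)    # 401/403 without a valid token, 200 with one
    if resp.ok:
        print(resp.json())     # the model's prediction for the sample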

Article

How to install KServe using Open Data Hub

JooHo Lee

Learn a simplified method for installing KServe, a highly scalable, standards-based model inference platform for serving AI models on Kubernetes.