Data Science

Article

How to install KServe using Open Data Hub

JooHo Lee

Learn a simplified method for installing KServe, a highly scalable, standards-based model inference platform on Kubernetes.
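
The article enables KServe through the Open Data Hub operator. As a rough companion, the sketch below uses the Kubernetes Python client to create a DataScienceCluster custom resource with the KServe component turned on; the group, version, and field names are assumptions drawn from recent Open Data Hub releases, so verify them against the CRDs installed on your cluster.

# Minimal sketch: enable the KServe component via an Open Data Hub
# DataScienceCluster custom resource, using the Kubernetes Python client.
# The apiVersion, kind, and spec field names below are assumptions; check
# the CRDs shipped by your Open Data Hub operator version before applying.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod
api = client.CustomObjectsApi()

data_science_cluster = {
    "apiVersion": "datasciencecluster.opendatahub.io/v1",  # assumed group/version
    "kind": "DataScienceCluster",
    "metadata": {"name": "default-dsc"},
    "spec": {
        "components": {
            "kserve": {"managementState": "Managed"},  # assumed field layout
        }
    },
}

api.create_cluster_custom_object(
    group="datasciencecluster.opendatahub.io",
    version="v1",
    plural="datascienceclusters",
    body=data_science_cluster,
)

Once the operator reconciles the resource, KServe's own InferenceService resources become available for deploying models; the article covers the details.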

Article

Implement AI-driven edge to core data pipelines

Bruno Meseguer

The Edge to Core Pipeline Pattern automates a continuous cycle for releasing and deploying new AI/ML models using the Red Hat build of Apache Camel and other tools.

Article

AI Lab Recipes

Sally O'Malley +1

The AI Lab Recipes repository offers recipes for building and running containerized AI and LLM applications to help developers move quickly from prototype to production.
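
Recipes in that repository typically package a model behind a containerized inference server. As a hypothetical illustration only (the endpoint, port, and model name below are assumptions, not taken from the repository), a client might call such a locally running server like this:

# Minimal sketch: query a locally running containerized model server, such as
# one built from an AI Lab recipe. The URL, port, and model name are
# assumptions; adjust them to match the recipe you actually run.
import requests

response = requests.post(
    "http://localhost:8001/v1/chat/completions",  # assumed OpenAI-compatible endpoint
    json={
        "model": "granite-7b-lab",  # hypothetical model name
        "messages": [{"role": "user", "content": "Summarize what KServe does."}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])

Because the server runs in a container, the same request works unchanged whether the recipe is running on a laptop during prototyping or on a cluster in production.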

Page

AI/ML Workloads

Applications based on machine learning and deep learning, using structured and unstructured data as the fuel that drives them.

Page

Red Hat's AI/ML Platforms & Developer Tools

Red Hat provides AI/ML capabilities across its products and platforms, giving developers a portfolio of enterprise-class AI/ML solutions to deploy AI-enabled applications in any environment, increase efficiency, and accelerate time-to-value.

Page

Red Hat Developers

Join Red Hat Developer for the software and tutorials to develop cloud applications using Kubernetes, microservices, serverless and Linux.

Article

Perform inference using Intel OpenVINO Model Server on OpenShift

Audrey Reznik +2

In this article, you will learn how to perform inference on JPEG images using the gRPC API of OpenVINO Model Server on OpenShift. Model servers play an important role in smoothly bringing models from development to production: models are served via network endpoints that expose APIs for running predictions.
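
For a sense of what such a call can look like, here is a minimal sketch of a gRPC prediction request using the ovmsclient helper package; the article itself may use the lower-level TensorFlow Serving stubs instead, and the service address, model name, and input tensor name below are assumptions you would replace with values from your own deployment's model metadata.

# Minimal sketch: gRPC inference against OpenVINO Model Server.
# The service address, model name, and input tensor name are assumptions;
# query the server's model metadata for the real values.
import numpy as np
from PIL import Image
from ovmsclient import make_grpc_client

client = make_grpc_client("ovms-service.demo.svc:9000")  # assumed service address

# Load a JPEG and preprocess it into the NCHW float32 layout many vision models expect.
image = Image.open("sample.jpeg").convert("RGB").resize((224, 224))
batch = np.expand_dims(
    np.asarray(image, dtype=np.float32).transpose(2, 0, 1), axis=0
)

# Single-output models return a NumPy array of scores.
outputs = client.predict(inputs={"0": batch}, model_name="resnet")  # assumed names
print("Predicted class index:", int(np.argmax(outputs)))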