This article outlines new plug-ins and sidecar containers in Red Hat Developer Hub that provide integration between Developer Hub and Red Hat OpenShift AI. This integration lets you automatically transfer AI model metadata managed by OpenShift AI into Developer Hub’s Software Catalog.
Automating AI model metadata in Developer Hub
Red Hat Developer Hub is an internal developer platform (IDP) based on Backstage. It helps development teams and organizations increase productivity by combining parts of the development process into a single portal. Red Hat Developer Hub provides a software catalog, based on the Backstage catalog, that acts as a central library of applications, APIs, and software resources used by development teams within an organization.
With the Red Hat Developer Hub software catalog as a starting point, in November 2024 we shared our approach to mapping AI model metadata to the software catalog, including:
- Connection information for running model instances
- Version, lifecycle, or other descriptive indicators
- Model cards
- The APIs (FastAPI, Swagger docs) for interacting with the model servers
If you revisit that post, you’ll see that:
- Model servers are stored as Components with a type of `model-server`.
- Models are stored as Resources with a type of `ai-model`.
- Details of the API that a model server provides are stored as API entities.
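To make the mapping concrete, here is a minimal, hand-written sketch of how such entities could be expressed in catalog-info.yaml form. The entity names, owner, and API reference are hypothetical; the authoritative format is the one defined in the November post and its example repositories.

```yaml
# Illustrative sketch only; names, owner, and relationships are made up.
apiVersion: backstage.io/v1alpha1
kind: Component
metadata:
  name: example-model-server   # hypothetical
spec:
  type: model-server
  lifecycle: production
  owner: group:ai-platform-team
  dependsOn:
    - resource:example-ai-model    # the model this server serves
  providesApis:
    - example-model-server-api     # the API entity for the server's REST API
---
apiVersion: backstage.io/v1alpha1
kind: Resource
metadata:
  name: example-ai-model   # hypothetical
spec:
  type: ai-model
  owner: group:ai-platform-team
```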
With this article, we add the model cards often provided on AI model hosting sites; these are stored as TechDocs and associated with the Components and Resources just mentioned.
In the November article, we constructed two GitHub repositories to illustrate the proposed mapping. We then described using the familiar Developer Hub dashboard method to import Components, Resources, and API entities defined in GitHub into the catalog.
But is a source code repository system like Git the source of truth for your AI model metadata? If you are a Red Hat OpenShift AI user, the answer is most likely no. Instead, this information is captured in OpenShift AI components like the Model Catalog, Model Registry, and even the InferenceServices that manage the Pods where your AI model servers are running. Ideally, administrators want to avoid duplicating AI model metadata already maintained in OpenShift AI into a source code repository with catalog-info.yaml files.
To solve this, Red Hat Developer Hub 1.8 introduces a new developer preview offering that automatically imports information directly from Red Hat OpenShift AI’s various sources and normalizes the data into Component, Resource, TechDoc, and API entities in the Developer Hub software catalog. With this addition, Developer Hub expands its role as your organization’s centralized developer portal, offering unified views of the AI infrastructure, tooling, services, and documentation that OpenShift AI provides.
Figures 1, 2, and 3 show AI-related Components, API entities with Swagger Docs, and TechDocs from model cards.



Getting started with the OpenShift AI connector for Red Hat Developer Hub
First, let's look at the system overview for the OpenShift AI connector for Red Hat Developer Hub.
System overview
The OpenShift AI connector for Red Hat Developer Hub consists of two dynamic plug-ins and three sidecar containers that run in the Developer Hub pod (illustrated in Figure 4).

The dynamic plug-ins include:
- @red-hat-developer-hub/backstage-plugin-catalog-backend-module-model-catalog package: A Backstage Entity Provider extension that fetches the AI model metadata retrieved by the sidecar containers and creates/updates the various Component, Resource, and API entities.
- @red-hat-developer-hub/backstage-plugin-catalog-techdoc-url-reader-backend package: A Backstage URLReader extension that fetches the AI model cards retrieved by the sidecar containers and creates or updates the TechDocs for AI-related Components and Resources.
The sidecar containers include:
- A `rhoai-normalizer` container that serves both as a KServe InferenceService Kubernetes controller (for monitoring the state of running model servers on the cluster) and as a client of the KubeFlow ModelRegistry and ModelCatalog that OpenShift AI provides. These two components are where the AI platform engineer maintains the AI model metadata. Data from those components is normalized into a format understood by the two dynamic plug-ins.
- A `storage` container that provides a REST endpoint and maintains a ConfigMap called `bac-import-model` in the Developer Hub namespace. This ConfigMap serves as a cache of the AI model metadata retrieved from OpenShift AI. The `rhoai-normalizer` container interacts with the `storage` REST API to create, update, or delete AI model metadata as changes occur in OpenShift AI.
- A `location` container that provides a REST endpoint the entity provider plug-in uses to fetch AI model metadata and model cards. The `storage` container calls the `location` REST endpoint to update the available model metadata and model cards.
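One convenient way to see what the connector has gathered is to inspect the `bac-import-model` ConfigMap in the Developer Hub namespace. The sketch below is purely illustrative; the data keys and the payload format inside them are assumptions, not the documented schema.

```yaml
# Hypothetical shape of the bac-import-model cache; key names and the
# JSON payload structure are assumptions for illustration only.
apiVersion: v1
kind: ConfigMap
metadata:
  name: bac-import-model
  namespace: my-rhdh   # your Developer Hub namespace
data:
  # one entry per model or model server discovered in OpenShift AI
  example-ai-model: |
    {"name": "example-ai-model", "uri": "oci://registry.example.com/example:1.0"}
```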
As the system evolves and grows, we see:
- Creating more “normalizer” types that handle other sources of AI model metadata.
- Creating more back-end storage types that are managed by a `storage` container implementing the same REST API as our initial ConfigMap-based offering.
Install the plug-ins and sidecar containers
Here's how to install the plug-ins and sidecars, including prerequisites and an installation overview.
Prerequisites
- Red Hat Developer Hub 1.8 or later
- Red Hat OpenShift AI:
- To have model cards from the Model Catalog imported as TechDocs, you need version 2.25 or later. You must also enable the Model Catalog dashboard and a Model Registry (both are disabled by default).
- For the rest of the features, version 2.20 or later suffices. Enabling the Model Registry and its associated dashboard lets users customize AI model metadata more directly.
Installation overview
- There are two dynamic plug-ins in the Red Hat Developer Hub Marketplace, @red-hat-developer-hub/backstage-plugin-catalog-backend-module-model-catalog and @red-hat-developer-hub/backstage-plugin-catalog-techdoc-url-reader-backend, which you add to your dynamic plug-in ConfigMap.
- There is a set of Kubernetes RBAC manifests you’ll apply in the namespace where your Developer Hub deployment resides, and a set of Kubernetes RBAC manifests you’ll apply where the Red Hat OpenShift AI Model Registry/Catalog deployment resides.
- The specification of the three sidecar containers will need to be added to either your Backstage CustomResource instance (if you installed using the Operator) or Deployment instance (if you installed using the Helm chart).
- Finally, you’ll update the Backstage `app-config` ConfigMap and enable the Catalog Entity Provider that ships with the @red-hat-developer-hub/backstage-plugin-catalog-backend-module-model-catalog dynamic plug-in.
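To make these steps more concrete, the hedged sketches below illustrate each of them in order. First, enabling the two packages in the dynamic plug-in ConfigMap could look like the following; the ConfigMap name is a placeholder, and your environment may reference the packages differently (for example, as bundled paths or OCI artifacts).

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: dynamic-plugins-rhdh   # placeholder; use the name your Backstage instance references
data:
  dynamic-plugins.yaml: |
    includes:
      - dynamic-plugins.default.yaml
    plugins:
      - package: "@red-hat-developer-hub/backstage-plugin-catalog-backend-module-model-catalog"
        disabled: false
      - package: "@red-hat-developer-hub/backstage-plugin-catalog-techdoc-url-reader-backend"
        disabled: false
```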
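Second, the RBAC manifests shipped with the offering are authoritative, but as an indication of the kind of access involved, the `storage` sidecar needs to manage its ConfigMap cache in the Developer Hub namespace. A minimal sketch, with an assumed role name:

```yaml
# Illustrative only; apply the manifests shipped with the offering instead.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: rhoai-connector-storage   # hypothetical name
  namespace: my-rhdh              # your Developer Hub namespace
rules:
  - apiGroups: [""]
    resources: ["configmaps"]
    verbs: ["get", "list", "watch", "create", "update", "patch"]
```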
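Third, for Operator-based installs, the Backstage custom resource supports patching the generated Deployment, so adding the three sidecars could look roughly like this; the image references are placeholders the installation guide will supply, and the container specs are trimmed to their essentials.

```yaml
apiVersion: rhdh.redhat.com/v1alpha3   # use the apiVersion served by your installed operator
kind: Backstage
metadata:
  name: developer-hub
spec:
  deployment:
    patch:
      spec:
        template:
          spec:
            containers:
              - name: rhoai-normalizer
                image: <rhoai-normalizer-image>   # placeholder; see the installation guide
              - name: storage
                image: <storage-image>            # placeholder
              - name: location
                image: <location-image>           # placeholder
```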
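Finally, the exact app-config keys for the entity provider are defined by the plug-in and will be spelled out in the installation guide. Purely for orientation, Backstage entity providers are typically enabled under catalog.providers, along the lines of this hypothetical sketch:

```yaml
# Hypothetical; the real provider key and options come from the plug-in's
# documentation in the Developer Hub 1.8 installation guide.
catalog:
  providers:
    modelCatalog:   # assumed key name, for illustration only
      development:
        baseUrl: http://localhost:9090   # assumed: the location sidecar's REST endpoint
```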
The complete installation guide for the Developer Hub OpenShift AI connector is part of the Developer Hub 1.8 documentation. We will update this blog post with a link to the installation guide when it is available.
Updating what the OpenShift AI connector imports into Developer Hub
The default metadata imported from OpenShift AI is as follows:
- InferenceServices (Component type `model-server`):
  - URL of the OpenShift Route (if exposed).
  - URL of the Kubernetes Service.
  - Authentication requirement status.
- Model Registry (Resource type `ai-model`):
  - Model description, artifact URIs, and author/owner information.
- Model Catalog:
  - Links to the model card (as Developer Hub TechDocs).
  - Model license URL.
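To visualize where these fields surface, here is a hedged sketch of the kind of Component entity the connector could produce for an InferenceService. The entity name, link, and annotation values are illustrative, and the exact keys the connector writes are defined by the plug-in.

```yaml
# Illustrative sketch of a generated entity; names and values are made up.
apiVersion: backstage.io/v1alpha1
kind: Component
metadata:
  name: example-model-server
  annotations:
    backstage.io/techdocs-ref: url:https://example.com/model-card   # model card rendered as TechDocs
  links:
    - url: https://example-model.apps.cluster.example.com   # the OpenShift Route URL
      title: Model server endpoint
spec:
  type: model-server
  lifecycle: production
  owner: group:ai-platform-team
```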
Depending on your organization and how your OpenShift clusters are administered, the engineers maintaining OpenShift AI might not be the same engineers maintaining Developer Hub. If that is the case, the Developer Hub and OpenShift AI maintainers will need to coordinate on setting metadata in OpenShift AI so that it propagates to Developer Hub as intended.
Using the AI Model Software Catalog updates from a Developer Hub template
Now that the AI models and model servers are imported into the Developer Hub Software Catalog, what can you do with them?
Accessing REST API documentation or the model cards within Developer Hub adheres to the core Backstage tenet: providing a central location for information for development teams.
Additionally, in conjunction with the Developer Hub version 1.8 developer preview offering of the AI Model Catalog, we have updated the example AI application templates first announced in November 2024 to provide an alternative to typing in the AI model endpoint information. This alternative queries the Developer Hub Software Catalog for the Resources and Components imported using our prescribed format:
- The common `EntityPicker` used by each of the sample applications can be seen here.
- Associated steps that obtain the `model-server` Component associated with an `ai-model` Resource can be seen here.
- The final use of these building blocks in each specific application will look something like this; a sketch of the `EntityPicker` parameter follows below.
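For orientation, a scaffolder template parameter that uses `EntityPicker` to select one of the imported `ai-model` Resources could look like the sketch below. The property name and titles are placeholders; the `ui:field` and `catalogFilter` syntax is standard Backstage scaffolder configuration.

```yaml
# Sketch of a template parameter; property names and titles are illustrative.
parameters:
  - title: Choose an AI model
    required:
      - aiModel
    properties:
      aiModel:
        title: AI model
        type: string
        ui:field: EntityPicker
        ui:options:
          catalogFilter:
            kind: Resource
            spec.type: ai-model   # matches the Resources imported by the connector
```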
This change simplifies the task of looking up AI model connection information for AI application developers who use Developer Hub and these templates. Figure 5 shows the simple yet powerful drop-down.

Conclusion
Building on the prior blog post’s definition of a model catalog structure for AI models and model servers, we believe this developer preview shows the value of the OpenShift AI connector’s integration between Red Hat Developer Hub and Red Hat OpenShift AI. Metadata is pulled from its source of truth within the OpenShift AI platform, then imported in a way that is familiar to existing Red Hat Developer Hub or Backstage users and integrates easily with other Developer Hub features.