Red Hat Integration

Runtimes, frameworks, and services to build applications natively on OpenShift

Cloud-native development that connects systems

Red Hat Integration overview


API management

Full lifecycle API management is the basis of an effective API strategy: it covers every step in the life of an API, from creation to retirement.


Serverless integration

Scale your workloads to zero with small code snippets to optimize resource use. Orchestrate cloud events in real time. Boost developer productivity with automatic detection of dependencies and lifecycles.


Event-driven architecture

Event-Driven Architecture (EDA) is a way of designing applications and services to respond to real-time information based on the sending and receiving of event notifications.
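As a minimal, language-neutral illustration of the pattern (not tied to any particular Red Hat product), the sketch below wires producers and consumers through an in-process event bus; neither side knows about the other:

```python
from collections import defaultdict

class EventBus:
    """A minimal in-process event bus: producers publish named events,
    consumers subscribe and react when a matching event arrives."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        # Deliver the event notification to every interested consumer.
        for handler in self._subscribers[event_type]:
            handler(payload)

bus = EventBus()
received = []
# A consumer reacts to "order.created" events without knowing who sends them.
bus.subscribe("order.created", lambda order: received.append(order["id"]))
# A producer emits an event without knowing who, if anyone, is listening.
bus.publish("order.created", {"id": 42, "total": 99.90})
```

In a production system the bus would be an external broker such as Apache Kafka or Red Hat AMQ, but the decoupling between sender and receiver is the same.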


Data integration

Prepare your data set for microservices or AI/ML consumption. Establish secure data gateways for authorized access. Federate current data stores or add change data capture capabilities to generate cloud events.

High-speed, secure messaging with Red Hat AMQ

Red Hat AMQ is a multi-protocol messaging platform with clients for a variety of programming languages, allowing developers to exchange data with high throughput and low latency.

Setting up integration environments


Learn the basics of Camel K

15 minutes | Beginner

Understand how to use this lightweight framework for writing integrations.
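For a taste of what such an integration looks like, here is a minimal Camel K route written in the YAML DSL (the endpoint URIs and file name are illustrative); the Camel K operator resolves the required dependencies automatically when the file is run:

```yaml
# hello.yaml -- a minimal Camel K integration sketch.
# On a cluster with the Camel K operator installed, run: kamel run hello.yaml
- from:
    uri: "timer:tick?period=3000"   # fire an event every 3 seconds
    steps:
      - setBody:
          constant: "Hello from Camel K"
      - to: "log:info"              # write the message body to the log
```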


Exposing Apache Kafka through the HTTP Bridge

15 minutes | Intermediate

Communicate between applications and clusters with the Red Hat AMQ Streams Kafka Bridge, an API for integrating HTTP-based clients with a Kafka cluster. Then use the AMQ Streams Operator to deploy the Kafka Bridge into an OpenShift cluster.
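As a sketch of what a bridge interaction looks like (the host, topic name, and payload are illustrative), an HTTP client produces records to a topic with a plain REST request, using the bridge's JSON embedded-data content type:

```
POST /topics/my-topic HTTP/1.1
Host: my-bridge-route.example.com
Content-Type: application/vnd.kafka.json.v2+json

{
  "records": [
    { "key": "order-1", "value": { "status": "created" } }
  ]
}
```

This is what makes the bridge useful: any client that can speak HTTP can produce to and consume from Kafka without a native Kafka client library.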


Change data capture with Debezium

20 minutes | Intermediate

Monitor your change data capture (CDC) events with Debezium, a set of distributed services that identifies row-level changes in your databases so your applications can respond to them.
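Registering a Debezium connector amounts to POSTing a configuration to Kafka Connect. Below is a sketch of a MySQL connector configuration; the hostname, credentials, and table names are placeholders, the property names follow recent Debezium releases, and required schema-history settings are omitted for brevity:

```json
{
  "name": "inventory-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "mysql",
    "database.port": "3306",
    "database.user": "debezium",
    "database.password": "dbz",
    "database.server.id": "184054",
    "topic.prefix": "inventory",
    "table.include.list": "inventory.orders"
  }
}
```

Each row-level change in the included tables then becomes an event on a Kafka topic named after the topic prefix and table.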


Send events to Apache Kafka with Reactive Messaging

25 minutes | Beginner

Create a Quarkus application that uses the MicroProfile Reactive Messaging extension to send events to Apache Kafka. Build real-time streaming data pipelines and streaming applications that transform or react to the streams of data.
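The tutorial itself uses Java with Quarkus and MicroProfile Reactive Messaging; purely to illustrate the shape of a streaming pipeline, where a source emits events and a processor transforms each one as it flows past, here is a language-neutral sketch in Python (names and values are illustrative):

```python
def source():
    """Stand-in for events arriving on an outgoing channel (e.g. a Kafka topic)."""
    for n in range(5):
        yield {"price": n * 10}

def transform(events):
    """Stand-in for a processor that reacts to each event in the stream."""
    for event in events:
        yield {"price_with_tax": round(event["price"] * 1.2, 2)}

# Events are pulled through the pipeline one at a time, as in a reactive stream.
processed = list(transform(source()))
```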

Integration in your browser


Externalize Configurations in Spring Boot microservices

15 minutes | Beginner

Learn how externalized configuration lets you change specific values without taking down the entire application.
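As a sketch, assuming a hypothetical greeting.message property, a Spring Boot application can ship a default in application.properties:

```properties
# application.properties -- default value baked into the image
greeting.message=Hello from the default configuration
```

At deploy time the value can be overridden without rebuilding, for example by setting the environment variable GREETING_MESSAGE (via Spring Boot's relaxed binding) in the container's environment or a ConfigMap.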


Develop with Thorntail on OpenShift

25 minutes | Intermediate

Use a sample Thorntail application and modify it to address microservice concerns, understand its structure, deploy it to OpenShift, and more.


Develop with Node.js on OpenShift

25 minutes | Intermediate

Take an existing sample Node.js application and modify it to address microservice concerns, deploy it to OpenShift, and exercise interfaces between Node.js, microservices, and Kubernetes.


Automate app deployment using OpenShift pipelines

30 minutes | Beginner

Learn how to automate the deployment of your applications with OpenShift Pipelines, starting by installing the OpenShift Pipelines Operator.
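OpenShift Pipelines is based on Tekton, where a pipeline is declared as a Kubernetes resource. A minimal sketch (pipeline and parameter names are illustrative; git-clone is a task shipped with OpenShift Pipelines):

```yaml
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: build-and-deploy
spec:
  params:
    - name: git-url
      type: string
  tasks:
    - name: fetch-source
      taskRef:
        name: git-clone        # clones the repository given by the parameter
      params:
        - name: url
          value: $(params.git-url)
```

A full pipeline would chain further tasks, such as building an image and rolling out the deployment, after fetch-source.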

Start building your Red Hat Integration toolbox

Access and run the software components you need in your own environment.

Integration opinions


Distributed transaction patterns for microservices compared

Compare distributed transaction patterns for coordinating dual writes in a microservices architecture, then get tips for choosing the right pattern.


Improve cross-team collaboration with Camel K

Use Camel K with KameletBindings to integrate Kafka streams messaging and Kubernetes for automated cross-team interactions via Google Docs and email.


Create event-based serverless functions with Python

Develop a Python-based serverless function that sends an email in response to a CloudEvent, and learn how to run your serverless function in a container.


Building reactive systems with Node.js

Find out why reactive systems are especially easy to implement with Node.js, then walk through a reactive system built with Node.js and Apache Kafka.