Debezium

Capture IBM Db2 data changes with Debezium Db2 connector

This article introduces the new Debezium Db2 connector for change data capture, now available as a technical preview from Red Hat Integration. Get a quick overview of using Debezium in a Red Hat AMQ Streams Kafka cluster, then find out how to use the new Db2 connector to capture row-level changes in your Db2 database tables.

Note: Change data capture, or CDC, is a well-established software design pattern for monitoring and capturing data changes in a database. CDC captures row-level changes to database tables and passes corresponding change events to a data streaming bus. Applications can read the change-event streams and access change events in the order that they happened.
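
For a concrete picture of what reading a change-event stream looks like, here is a minimal sketch of a plain Kafka consumer subscribed to one Debezium table topic. It is not code from the article; the bootstrap address, group ID, and topic name follow common AMQ Streams and Debezium naming conventions but are assumptions.

```java
// Minimal sketch (assumed names): read Debezium Db2 change events from Kafka.
// Each captured table gets its own topic; events arrive in commit order.
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class Db2ChangeEventReader {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "my-cluster-kafka-bootstrap:9092"); // assumed AMQ Streams bootstrap service
        props.put("group.id", "db2-change-reader");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Topic name follows Debezium's <server>.<schema>.<table> convention (assumed here).
            consumer.subscribe(List.of("db2server.MYSCHEMA.CUSTOMERS"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("key=%s value=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```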

Continue reading “Capture IBM Db2 data changes with Debezium Db2 connector”

Kubernetes-native Apache Kafka with Strimzi, Debezium, and Apache Camel (Kafka Summit 2020)

Apache Kafka has become the leading platform for building real-time data pipelines. Today, Kafka is heavily used for developing event-driven applications, where it lets services communicate with each other through events. Using Kubernetes for this type of workload requires adding specialized components such as Kubernetes Operators and connectors to bridge the rest of your systems and applications to the Kafka ecosystem.

In this article, we’ll look at how the open source projects Strimzi, Debezium, and Apache Camel integrate with Kafka to speed up critical areas of Kubernetes-native development.

Note: Red Hat is sponsoring the Kafka Summit 2020 virtual conference from August 24-25, 2020. See the end of this article for details.

Continue reading “Kubernetes-native Apache Kafka with Strimzi, Debezium, and Apache Camel (Kafka Summit 2020)”

Build a simple cloud-native change data capture pipeline

Change data capture (CDC) is a well-established software design pattern for a system that monitors and captures data changes so that other software can respond to those events. Using Kafka Connect along with Debezium connectors and the Apache Camel Kafka Connector, we can build a configuration-driven data pipeline that bridges traditional data stores and new event-driven architectures.

This article walks through a simple example.
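
To give a rough sense of what configuration-driven means in practice, the sketch below registers a Debezium source connector with the Kafka Connect REST API. It is an illustration, not the article’s example: the Connect URL, connector name, and database settings are assumed placeholder values, and a real deployment would add the options your database requires.

```java
// Minimal sketch (assumed values): register a Debezium PostgreSQL source connector
// by POSTing its JSON configuration to the Kafka Connect REST API.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterConnector {
    public static void main(String[] args) throws Exception {
        String config = """
            {
              "name": "inventory-source",
              "config": {
                "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
                "database.hostname": "postgres",
                "database.port": "5432",
                "database.user": "postgres",
                "database.password": "postgres",
                "database.dbname": "inventory",
                "database.server.name": "inventory"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://my-connect-cluster-connect-api:8083/connectors")) // assumed Connect service
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(config))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```

A Camel Kafka Connector sink could then be registered the same way, pointed at the topics the Debezium connector produces.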

Continue reading “Build a simple cloud-native change data capture pipeline”

Change data capture for microservices without writing any code

Want to smoothly modernize your legacy, monolithic applications to microservices or a cloud-native architecture without writing any code? In this demonstration, we show you how to implement the following change data capture scenario between two microservices on Red Hat OpenShift using a combination of Syndesis, Strimzi, and Debezium.

[Architecture diagram: change data capture between two microservices using Syndesis, Strimzi, and Debezium]

Continue reading “Change data capture for microservices without writing any code”

Change data capture with Debezium: A simple how-to, Part 1

One question always comes up as organizations move toward being cloud-native, twelve-factor, and stateless: How do you get an organization’s data to these new applications? There are many patterns out there, but the one we will look at today is change data capture. This post is a simple how-to for building a change data capture solution with Debezium in an OpenShift environment. Future posts will build on this foundation and add further capabilities.

Continue reading “Change data capture with Debezium: A simple how-to, Part 1”

Capture database changes with Debezium Apache Kafka connectors

Change data capture, or CDC, is a well-established software design pattern for a system that monitors and captures the changes in data so that other software can respond to those changes. CDC captures row-level changes to database tables and passes corresponding change events to a data streaming bus. Applications can read these change event streams and access these change events in the order in which they occurred.

Change data capture thus helps to bridge traditional data stores and new cloud-native, event-driven architectures. Debezium is a set of distributed services that captures row-level changes in databases so that applications can see and respond to those changes. This general availability (GA) release from Red Hat Integration includes the following Debezium connectors for Apache Kafka: MySQL, PostgreSQL, MongoDB, and SQL Server.
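
To make the shape of those change events concrete, here is a small sketch, not taken from the release, of inspecting a Debezium change-event envelope with Jackson. It assumes the default JSON converter with schemas disabled; the envelope carries the operation type plus the row state before and after the change.

```java
// Minimal sketch: inspect a Debezium change-event envelope (assumes the JSON converter
// with schemas disabled, so the record value is the envelope itself).
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ChangeEventInspector {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static void inspect(String eventJson) throws Exception {
        JsonNode event = MAPPER.readTree(eventJson);
        String op = event.path("op").asText();  // "c" = create, "u" = update, "d" = delete, "r" = snapshot read
        JsonNode before = event.path("before"); // row state before the change (null for inserts)
        JsonNode after = event.path("after");   // row state after the change (null for deletes)
        System.out.printf("op=%s before=%s after=%s%n", op, before, after);
    }
}
```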

Continue reading “Capture database changes with Debezium Apache Kafka connectors”

Low-code microservices orchestration with Syndesis

Recently I wrote about decoupling infrastructure code from microservices. I found that Apache Camel and Debezium provided the middleware I needed for that project, with minimal coding on my end. After that successful experiment, I wondered whether it would be possible to orchestrate two or more similarly decoupled microservices into a new service, and whether I could do it without writing any code at all. I decided to find out.

This article is a quick dive into orchestrating microservices without writing any code. We will use Syndesis (an open source integration platform) as our orchestration platform. Note that the examples assume that you are familiar with Debezium and Kafka.

Continue reading “Low-code microservices orchestration with Syndesis”

Red Hat advances Debezium CDC connectors for Apache Kafka support to Technical Preview

After a couple of months in Developer Preview, the Debezium Apache Kafka connectors for change data capture (CDC) are now available as a Technical Preview as part of the Q4 release of Red Hat Integration. Technology Preview features provide early access to upcoming product innovations, enabling you to test functionality and provide feedback during the development process.

Continue reading “Red Hat advances Debezium CDC connectors for Apache Kafka support to Technical Preview”

Decoupling microservices with Apache Camel and Debezium

The rise of microservices-oriented architecture brought us new development paradigms and mantras about independent development and decoupling. In such an architecture, services aim for independence, yet they still need to react to state changes in other enterprise domains.

I’ll use a simple, typical example to show what we’re talking about. Imagine developing two independent microservices: Order and User. Each exposes a REST interface and uses its own database, as shown in Figure 1:

Figure 1: Order and User microservices.
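
As a sketch of the kind of glue code Camel and Debezium can replace or minimize, the route below (not the article’s own code) reacts to User change events published to a Kafka topic and forwards them to the Order service over REST. The topic, broker, and endpoint names are assumptions for illustration.

```java
// Minimal sketch (assumed names): react to User change events and notify the Order service.
// Requires the camel-core, camel-kafka, and camel-http dependencies.
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

public class UserChangeRoute extends RouteBuilder {

    @Override
    public void configure() {
        from("kafka:dbserver1.public.users?brokers=my-cluster-kafka-bootstrap:9092&groupId=order-service")
            .log("User change event received: ${body}")
            // Forward the raw change event so the Order service can refresh
            // its local copy of the user data it depends on.
            .to("http://order-service:8080/api/users/sync?httpMethod=POST");
    }

    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        context.addRoutes(new UserChangeRoute());
        context.start();
        Thread.sleep(Long.MAX_VALUE); // keep the route running
    }
}
```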

Continue reading “Decoupling microservices with Apache Camel and Debezium”

Developer preview of Debezium Apache Kafka connectors for Change Data Capture (CDC)

With the release of Red Hat AMQ Streams 1.2, Red Hat Integration now includes a developer preview of Change Data Capture (CDC) capabilities to enable data integration for modern cloud-native microservices-based applications. CDC features are based on the upstream project Debezium and are natively integrated with Apache Kafka and Strimzi to run on top of Red Hat OpenShift Container Platform, the enterprise Kubernetes, as part of the AMQ Streams release.

Continue reading “Developer preview of Debezium Apache Kafka connectors for Change Data Capture (CDC)”
