Db2 and Oracle connectors coming to Debezium 1.4 GA
This article gives an overview of the new Red Hat Integration Debezium connectors and features included in Debezium 1.4’s general availability (GA) release. Developers now have two more options for streaming data from their datastores to Apache Kafka, plus a supported integration for handling data schemas.
The GA of the Db2 connector lets developers stream changes from Db2. The Oracle connector, now in developer preview, provides an easy way to capture changes from one of the most popular databases. Finally, developers can offload Debezium event schema management to the fully supported integration with Red Hat Integration’s service registry.
What is Debezium?
Debezium is a set of distributed services that captures row-level database changes so that applications can view and respond to them. Debezium connectors record all events to Kafka clusters managed by Red Hat AMQ Streams. Applications use AMQ Streams to consume change events.
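For example, an insert into a customers table produces a change event whose value looks roughly like this (a simplified sketch of Debezium’s standard event envelope; the table and field values are hypothetical, and the schema portion of the message is omitted):

```json
{
  "payload": {
    "before": null,
    "after": {
      "ID": 1001,
      "FIRST_NAME": "Anne",
      "EMAIL": "anne@example.com"
    },
    "source": {
      "connector": "db2",
      "db": "TESTDB",
      "schema": "MYSCHEMA",
      "table": "CUSTOMERS"
    },
    "op": "c",
    "ts_ms": 1609459200000
  }
}
```

The `op` field identifies the operation (`c` for create, `u` for update, `d` for delete), while `before` and `after` hold the row state on each side of the change.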
Debezium is built on the Apache Kafka Connect framework, so each Debezium connector is a Kafka Connect source connector. Connectors can be deployed and managed using the Kafka Connect Kubernetes custom resources provided by AMQ Streams.
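As a sketch of what that looks like, a Debezium connector can be declared through AMQ Streams’ KafkaConnector custom resource (the cluster name, host, and credentials below are hypothetical, and the apiVersion may differ between AMQ Streams releases):

```yaml
apiVersion: kafka.strimzi.io/v1alpha1
kind: KafkaConnector
metadata:
  name: inventory-connector
  labels:
    # Must match the name of the KafkaConnect cluster resource
    strimzi.io/cluster: my-connect-cluster
spec:
  class: io.debezium.connector.db2.Db2Connector
  tasksMax: 1
  config:
    database.hostname: db2.example.com
    database.port: 50000
    database.user: db2inst1
    database.password: secret
    database.dbname: TESTDB
    database.server.name: fulfillment
    table.include.list: MYSCHEMA.CUSTOMERS
```

Applying this resource with `oc apply` (or `kubectl apply`) hands the connector’s lifecycle over to the AMQ Streams operator.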
The latest 1.4 release introduces new connectors, advances others, and adds features to handle schemas efficiently.
Debezium connector for Db2 goes GA
Debezium’s implementation of its SQL Server connector inspired the Db2 connector. The Db2 connector is based on the Abstract Syntax Notation (ASN) Capture and Apply agents used for SQL Replication in Db2. These agents generate change data for tables that are in capture mode: they monitor the tables and store change events for table updates in change-data tables. The Debezium connector then uses a SQL interface to query the change-data tables for change events.
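Putting a table into capture mode is a Db2 administration step rather than a connector setting. The Debezium Db2 documentation ships user-defined functions that simplify it; a sketch of the sequence, assuming those UDFs are installed under the ASNCDC schema and using a hypothetical table name:

```sql
-- Start the ASN capture agent (management UDF supplied with the Debezium Db2 connector)
VALUES ASNCDC.ASNCDCSERVICES('start', 'asncdc');

-- Put the table into capture mode so its changes are recorded
CALL ASNCDC.ADDTABLE('MYSCHEMA', 'CUSTOMERS');

-- Reinitialize the ASN service so the agent picks up the newly added table
VALUES ASNCDC.ASNCDCSERVICES('reinit', 'asncdc');
```

Consult the connector documentation for the exact procedure names and privileges required in your Db2 version.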
The Db2 connector is now GA after a technical preview period in which developers could try it and provide feedback, and after extensive testing by the Red Hat team. The Db2 connector gives Db2 for Linux users a supported mechanism to stream changes from their databases. You can read more about the connector and its configuration in the documentation.
Debezium connector for Oracle Database in developer preview
One of the most requested connector plug-ins is coming to Red Hat Integration. You can now stream your data from Oracle databases with the Debezium connector for Oracle, now in developer preview.
Oracle Database is a vital part of many organizations’ architectures; entire systems have been built with the database at their applications’ core. The data stored in Oracle databases remains critical to those organizations, and keeping it accessible while new applications are developed is essential for a successful migration to modern system architectures.
The Debezium connector for Oracle Database can monitor and record row-level changes in databases on Oracle server version 12R2 and later. Debezium uses Oracle’s native LogMiner database package, which ships as part of the Oracle Database utilities and provides a way to query online and archived redo log files.
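A connector configuration for the developer preview looks roughly like the following (shown here as the spec of a KafkaConnector resource; the hostnames, credentials, and database names are hypothetical, and property names should be checked against the documentation for your Debezium version):

```yaml
spec:
  class: io.debezium.connector.oracle.OracleConnector
  tasksMax: 1
  config:
    database.hostname: oracle.example.com
    database.port: 1521
    database.user: c##dbzuser
    database.password: secret
    # Container database and pluggable database names (multitenant setup)
    database.dbname: ORCLCDB
    database.pdb.name: ORCLPDB1
    database.server.name: server1
    # Use the LogMiner-based implementation rather than XStream
    database.connection.adapter: logminer
    database.history.kafka.bootstrap.servers: my-cluster-kafka-bootstrap:9092
    database.history.kafka.topic: schema-changes.inventory
```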
With the Debezium connector for Oracle, developers can stream data changes from their databases directly into AMQ Streams Apache Kafka clusters. Streaming change events lets teams get the most from their data using modern, hybrid cloud technology, achieving “liberation for your data.”
Integration with Service Registry
The Red Hat Integration service registry is a datastore for standard event schemas and API designs. As a developer, you can use it to decouple the structure of your data from your applications. You can also use it to share and manage your data structure using a REST interface. Red Hat’s service registry is built on the Apicurio Registry, an open source community project.
Debezium uses a JSON converter to serialize record keys and values into JSON documents. By default, the JSON converter includes the record’s message schema, which makes each record quite lengthy. Alternatively, you can serialize and deserialize record keys and values with a more compact format such as Apache Avro or Google Protocol Buffers. If you use one of these formats, Service Registry manages the format’s message schemas and their versions. You specify the desired converter in your Debezium connector configuration, and the converter maps Kafka Connect schemas to that format’s schemas.
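For example, to switch a connector to Avro with schemas managed by Service Registry, the converter settings look roughly like this (a sketch based on the Apicurio Registry Avro converter; the registry URL is hypothetical):

```yaml
config:
  key.converter: io.apicurio.registry.utils.converter.AvroConverter
  key.converter.apicurio.registry.url: http://service-registry:8080/api
  value.converter: io.apicurio.registry.utils.converter.AvroConverter
  value.converter.apicurio.registry.url: http://service-registry:8080/api
  # Register new schema versions automatically if they do not yet exist
  value.converter.apicurio.registry.global-id: io.apicurio.registry.utils.serde.strategy.GetOrCreateIdStrategy
```

With this in place, Kafka messages carry only a small schema identifier instead of the full schema, and consumers resolve the schema from the registry.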
Previously offered as a technical preview, the integration between Debezium and Service Registry is now fully supported. You can see how it works in my example video.
Get started with Debezium and Kafka
You can download the Red Hat Integration Debezium connectors from the Red Hat Developer portal. You can also check out Gunnar Morling’s webinar on Debezium and Kafka (February 2019) from the DevNation Tech Talks series, or his Kafka and Debezium presentation at QCon (January 2020). Finally, you can learn more about the various connectivity options available for your events on Red Hat Developer.