How to use Debezium SMT with Groovy to filter routing events

July 6, 2023
Diego Neves
Related topics: Programming languages & frameworks, GitOps, Kafka
Related products: Red Hat Enterprise Linux


    After configuring my Kafka Connect image with Debezium, as demonstrated in Hugo Guerrero's article Improve your Kafka Connect builds of Debezium, I needed to configure a filter so that only certain events from a database table would reach my topics. I was able to do this using Debezium SMT with Groovy.

    What is Debezium SMT?

    Debezium SMT (single message transform) is a filter feature provided by Debezium that lets you process only the records you find relevant. To use it, you need to include an implementation of the JSR 223 API (Scripting for the Java Platform) inside your Kafka Connect image.

    Note that Debezium does not ship with a JSR 223 implementation, so you will need to provide the libraries yourself to use this feature. We will use the Groovy implementation of JSR 223, so you can download all the relevant JARs from the Groovy website.

    There are other JSR 223 implementations that you can use; however, we will not cover them here. For more information, refer to the Debezium documentation.
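
    Conceptually, the filter is just a Groovy boolean expression that Debezium evaluates against each change record; the record's key, value, topic, and headers are exposed as variables to the script. As a rough sketch, conditions like the following could be set on a connector (one condition per connector; the value.before.id field is only illustrative, not taken from this article's schema):

    # Keep only delete events:
    transforms.filter.condition: value.op == 'd'
    # Keep only updates where the previous row had id 2 (illustrative field):
    transforms.filter.condition: value.op == 'u' && value.before.id == 2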

    Download the files

    First of all, you will need your database plugin (e.g., SQL Server or MySQL) from the download page. Figure 1 illustrates the Red Hat software downloads page.

    Figure 1: The Red Hat software download page.

    That is the connector you will need to put into your Kafka Connect image to work with SQL Server CDC. You will also need to download the scripting transformation package.

    With this in place, go to the Groovy website and download the zip that contains all the JAR files, as shown in Figure 2.

    Figure 2: The Groovy download page.

    Figure 3 shows the three zip files that we will unzip in the next steps.

    Figure 3: The zip files downloaded for Debezium and Groovy.

    Creating the image

    Unzip the files downloaded in the last step. Use the SQL Server plugin, as shown in Figure 4.

    Figure 4: The unzipped Debezium and Groovy folders.

    Go to the debezium-scripting folder, copy the debezium-scripting-1.9.7.Final...jar file, and place it inside the debezium-connector-sqlserver folder.

    Then go to the groovy-4.0.11/lib folder and copy the groovy-4.0.11.jar and groovy-jsr223-4.0.11.jar JARs. Place them in the debezium-connector-sqlserver folder. At this point, your folder should look like Figure 5. Keep in mind that your versions may differ; these are the versions available at the time of writing.

    Figure 5: The plugin folder with all the necessary JARs.

    Now, zip the debezium-connector-sqlserver folder and place the zip file in your Nexus repository or Git repository. Then use it as your artifact, as shown in the previously mentioned Hugo Guerrero article.
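
    If you build your Kafka Connect image with Strimzi's build mechanism, the zip you just published becomes a plugin artifact in the KafkaConnect resource. The following is a minimal sketch of that resource, assuming the archive is reachable from the cluster; the output image, URL, and plugin name are placeholders, not values from this article:

    kind: KafkaConnect
    apiVersion: kafka.strimzi.io/v1beta2
    metadata:
      name: my-connect-cluster
      namespace: kafka
      annotations:
        strimzi.io/use-connector-resources: "true" # Manage connectors through KafkaConnector resources
    spec:
      replicas: 1
      bootstrapServers: my-cluster-kafka-bootstrap:9092
      build:
        output:
          type: docker
          image: image-registry.openshift-image-registry.svc:5000/kafka/my-connect:latest # Placeholder registry/image
        plugins:
          - name: debezium-sqlserver-with-groovy
            artifacts:
              - type: zip
                url: https://nexus.example.com/repository/artifacts/debezium-connector-sqlserver.zip # Placeholder URL

    The cluster name (my-connect-cluster) and namespace match the strimzi.io/cluster label and namespace used by the connector in the next section.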

    How to use transformations

    To use this feature, create your Kafka connectors and configure them to use transformations, as in the following example:

    kind: KafkaConnector
    apiVersion: kafka.strimzi.io/v1beta2
    metadata:
      name: sql-connector-for-inserts
      labels:
        strimzi.io/cluster: my-connect-cluster
      namespace: kafka
    spec:
      class: io.debezium.connector.sqlserver.SqlServerConnector
      tasksMax: 1
      config:
        database.hostname: "server.earth.svc"
        database.port: "1433"
        database.user: "sa"
        database.password: "Password!"
        database.dbname: "InternationalDB"
        table.whitelist: "dbo.Orders"
        database.history.kafka.bootstrap.servers: "my-cluster-kafka-bootstrap:9092"
        database.server.name: "internation-db-insert-topic" # This property needs a unique value
        database.history.kafka.topic: "dbhistory.internation-db-insert-topic" # This property needs a unique value
        #### The transforms feature starts here. With the condition that the operation equals 'c',
        #### only events of that type will be routed to the topic created by this connector.
        transforms: filter
        transforms.filter.language: jsr223.groovy
        transforms.filter.type: io.debezium.transforms.Filter
        transforms.filter.condition: value.op == 'c'
        transforms.filter.topic.regex: internation-db-insert-topic.dbo.Orders
        #### end of transforms filter
        tombstones.on.delete: 'false'
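
    In the value.op field, Debezium marks inserts with 'c' (create), updates with 'u', deletes with 'd', and snapshot reads with 'r'. To route another event type to its own topic, you can create an additional KafkaConnector with a different condition and its own unique names; as a sketch, the config properties that would change for a delete-events connector are roughly these (illustrative values):

        database.server.name: "internation-db-delete-topic"
        database.history.kafka.topic: "dbhistory.internation-db-delete-topic"
        transforms.filter.condition: value.op == 'd'
        transforms.filter.topic.regex: internation-db-delete-topic.dbo.Orders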

    Summary

    This article demonstrated how to configure a Kafka Connect image to use Debezium SMT with Groovy and showed you how to use transformations and filters to route events between topics. For more information, refer to the Debezium documentation.
