Apache Kafka is one of the most widely used pieces of software in modern application development thanks to its distributed nature, high throughput, and horizontal scalability. Every day, more organizations adopt Kafka as the central event bus for their event-driven architectures. As a result, more and more data flows through their clusters, and connectivity requirements climb the priority list of any backlog. For this reason, the Apache Camel community released the first iteration of its Kafka Connect connectors to ease the burden on development teams.
What is Apache Camel?
The Camel community has built one of the most active open source integration frameworks in the Apache Software Foundation. The Camel framework lets you quickly and easily integrate data consumers and producers. It implements the most widely used Enterprise Integration Patterns (EIPs), along with the interfaces and protocols found everywhere, adding new ones as they emerge. Having everything share the same component configuration model lets you create the building blocks needed to solve almost any integration requirement.
What is the Camel Kafka Connector project?
After maturing for almost a decade, the Camel community spun off several subprojects to foster innovation in areas such as runtime support and container readiness. In particular, the Camel Kafka Connector subproject focuses on using Camel components as Kafka Connect connectors. To that end, the community built a thin layer between the Camel and Kafka frameworks that exposes each Camel component as a connector usable anywhere in the Kafka ecosystem.
This project's first release lets the community try the autogenerated connectors and share feedback. Although this is the subproject's first release, most of the underlying Camel components are battle-tested and used in production scenarios all around the world.
Getting started with Camel Kafka connectors
There are more than 340 Camel Kafka connectors to get started with, ranging from AWS S3 to Telegram and Slack, so it should be easy to find a use case for a small project.
To simplify Kafka Connect cluster deployment, you can use the Kubernetes Operator from the Strimzi project, running on Minikube or Kind on your laptop. Then, search for and download the connector package's zip distribution from the Maven Central repository. The zip file contains all of the JARs required to run the connector from your plugin path. Next, create a container image with your plugins on top of the Strimzi base image so you can use it as the Kafka Connect base image. Finally, configure the connector using the Kafka Connect REST API or the new Strimzi KafkaConnector custom resource definition (CRD).
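The final configuration step above can be sketched with a Strimzi KafkaConnector custom resource. This is a minimal, illustrative example: the cluster name, bucket, topic, and converter settings are assumptions you would replace with your own values.

```yaml
apiVersion: kafka.strimzi.io/v1alpha1
kind: KafkaConnector
metadata:
  name: s3-source-connector
  labels:
    # Must match the name of your KafkaConnect cluster resource (assumed here)
    strimzi.io/cluster: my-connect-cluster
spec:
  # Generic source connector class shipped in the Camel Kafka connector packages
  class: org.apache.camel.kafkaconnector.CamelSourceConnector
  tasksMax: 1
  config:
    # Kafka topic to write records to (placeholder)
    topics: s3-topic
    # Camel endpoint URI for the AWS S3 component; bucket name and options are placeholders
    camel.source.url: aws-s3://my-bucket?autocloseBody=false
    key.converter: org.apache.kafka.connect.storage.StringConverter
    value.converter: org.apache.kafka.connect.storage.StringConverter
```

Applying a resource like this lets the Strimzi operator create and manage the connector for you, instead of posting the same configuration to the Kafka Connect REST API by hand.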
You can also try the connectors locally on your laptop without Kubernetes. You just need a locally running Kafka instance; then follow these instructions.
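As a sketch of that local setup, you could run a connector with Kafka's standalone worker. The file paths, topic name, and choice of the Camel timer component below are assumptions for illustration only:

```properties
# worker.properties addition: point the worker at the extracted connector zip
plugin.path=/opt/connectors

# timer-source.properties: a minimal source connector definition
name=CamelTimerSourceConnector
connector.class=org.apache.camel.kafkaconnector.CamelSourceConnector
tasks.max=1
topics=timer-topic
# Camel timer component URI: emit one exchange per second
camel.source.url=timer:tick?period=1000
```

With a local broker running, something like bin/connect-standalone.sh worker.properties timer-source.properties would start the connector and publish its records to the configured topic.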
Want to see how it looks when running? Watch my video on integrating Slack with Kafka:
Apache Kafka is adopted as an event backbone by new organizations every day. Communities like Apache Camel are working to speed up development in key areas of modern applications, such as integration. The Camel Kafka Connector project from the Apache Software Foundation exposes Camel's vast set of connectors natively to Kafka Connect, so developers can start sending data to and receiving data from Kafka with their preferred systems.