Camel Clustered File Ingestion with JDBC and Spring

 

September 8, 2017
Mary Cochran
Related products:
Red Hat build of Apache Camel


    Reading a file is a common use case for Apache Camel. Whether you use a file to kick off a larger route or simply need to store the file's contents, the ability to read a file only once is important. This is easy when you have a single server with your route deployed, but what about when you deploy your route to multiple servers? Thankfully, Camel has the concept of an Idempotent Consumer.

    A useful way to implement this is with an idempotent repository. The repository keeps track of each file being read and does not allow another server to read it; in effect, it acts as a lock on the file. After the file has been read, you have two options: remove the row from the database, which allows a file with the same name to be ingested later, or keep the row, which prevents files of the same name from ever being ingested again.

    So how do you set up an idempotent repository using JDBC and Spring?

    First, you will need to create the table and the necessary Spring beans. Here I am using the JdbcMessageIdRepository. At the time of development, a bug in the JPA version was still open, so I could not use it; that bug has since been fixed, and you can easily swap one implementation for the other. This example assumes you already have a bean for your data source set up. You can reuse the same fileConsumerRepo bean for multiple routes in your deployment.

     CREATE TABLE CAMEL_MESSAGEPROCESSED (
         processorName VARCHAR(255),
         messageId VARCHAR(100),
         createdAt TIMESTAMP
     );

     @Bean
     public JdbcMessageIdRepository fileConsumerRepo() {
         JdbcMessageIdRepository fileConsumerRepo = null;
         try {
             // "fileConsumerRepo" becomes the processorName stored in CAMEL_MESSAGEPROCESSED
             fileConsumerRepo = new JdbcMessageIdRepository(dataSource(), "fileConsumerRepo");
         } catch (Exception ex) {
             LOGGER.error("Caught exception while creating fileConsumerRepo", ex);
         }
         if (fileConsumerRepo == null) {
             LOGGER.warn("fileConsumerRepo is null after initialization");
         }
         return fileConsumerRepo;
     }
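    The bean above assumes a dataSource() method is already defined elsewhere in your configuration. As a rough sketch only (the driver class, URL, and credentials below are placeholders for your environment, and Spring's unpooled DriverManagerDataSource is used purely for simplicity; a pooled implementation is better for production):

     // Hypothetical data source bean; swap in your own driver, URL, and credentials.
     // DriverManagerDataSource is unpooled and only suitable for demos.
     @Bean
     public DataSource dataSource() {
         DriverManagerDataSource ds = new DriverManagerDataSource();
         ds.setDriverClassName("oracle.jdbc.OracleDriver");
         ds.setUrl("jdbc:oracle:thin:@//dbhost:1521/mydb");
         ds.setUsername("camel");
         ds.setPassword("secret");
         return ds;
     }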

    In addition, make sure to include org.apache.camel.processor.idempotent.jpa in your packages to scan, or in your persistence unit, depending on your exact setup. In my case, I was using a LocalContainerEntityManagerFactoryBean for JPA.

     @Bean
     public LocalContainerEntityManagerFactoryBean entityManagerFactory() throws NamingException {
         final LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
         em.setPersistenceUnitName("camel");
         final HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
         vendorAdapter.setDatabase(Database.ORACLE);
         vendorAdapter.setShowSql(false);
         vendorAdapter.setGenerateDdl(false);
         em.setJpaVendorAdapter(vendorAdapter);
         em.setPackagesToScan(new String[] { "com.my.package.model","org.apache.camel.processor.idempotent.jpa" });
         em.setJpaProperties(additionalProperties());
         em.setDataSource(this.dataSource());
         return em;
     }
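    The entityManagerFactory() bean above calls an additionalProperties() helper that isn't shown; its contents depend on your environment. A minimal sketch (the Hibernate property values here are assumptions, not the article's actual settings) just collects the extra settings into a java.util.Properties:

```java
import java.util.Properties;

public class JpaConfigSketch {

    // Hypothetical helper matching the additionalProperties() call above;
    // the dialect and ddl-auto values are placeholders for your environment.
    public static Properties additionalProperties() {
        Properties props = new Properties();
        props.setProperty("hibernate.dialect", "org.hibernate.dialect.Oracle12cDialect");
        props.setProperty("hibernate.hbm2ddl.auto", "none");
        props.setProperty("hibernate.format_sql", "false");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(additionalProperties().getProperty("hibernate.dialect"));
    }
}
```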

    Now your idempotent repository is ready for a Camel route to use. Below is an example:

    from("file:/my/test/dir?moveFailed=.error&autoCreate=true&readLockLoggingLevel=WARN&shuffle=true&readLock=idempotent&idempotentRepository=#fileConsumerRepo&readLockRemoveOnCommit=true")
        .routeId("ingestionFile")
        .convertBodyTo(String.class)
        .log(LoggingLevel.INFO, "File received");

    Let's break down the options:

    • moveFailed — where the file is moved if processing fails
    • autoCreate — whether the directory should be created automatically if it doesn't exist
    • readLockLoggingLevel — the level at which to log when a route cannot obtain a read lock on a file
    • readLock — the type of read locking to use (here, idempotent)
    • idempotentRepository — the name of the Spring bean backing the idempotent repository
    • readLockRemoveOnCommit — whether the row should be removed from the database once the file has been read successfully
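    If you instead want to keep rows permanently, so that a file name can never be re-ingested, the endpoint options change slightly. A sketch of that variant (the directory and repository name match the example above; readLockRemoveOnCommit and readLockRemoveOnRollback are standard Camel file component options, but verify the defaults against your Camel version):

     // Variant: keep the idempotent row after a successful read so the same
     // file name is never ingested again; remove it only on rollback so a
     // failed file can be retried.
     from("file:/my/test/dir"
             + "?readLock=idempotent"
             + "&idempotentRepository=#fileConsumerRepo"
             + "&readLockRemoveOnCommit=false"
             + "&readLockRemoveOnRollback=true")
         .routeId("ingestionFileKeepRows")
         .convertBodyTo(String.class)
         .log(LoggingLevel.INFO, "File received");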

    Now deploy your route to multiple servers and watch the magic! You can do this easily on EAP or Karaf by using a port offset. If you are looking for more examples, check out the sample project put together by fellow Red Hatter Josh Reagan.



    Last updated: May 31, 2024
