Using Clustered Camel Quartz Jobs on JBoss EAP

August 10, 2017
Mary Cochran
Related topics:
Java
Related products:
Red Hat JBoss Enterprise Application Platform


    Camel Quartz can be a useful component for jobs that need to run at a particular time every day. Recently, on a client site, we needed about 15 different jobs that each created a differently formatted file and sent it to a particular destination. While this was straightforward to set up on a single machine, once we started deploying our Camel routes to multiple servers the jobs started to kick off on both machines. To resolve this issue, we needed to create a job store.

    Step 1

    First, create your Camel routes and add the camel-quartz2 dependency to your project. Make sure you use the job.name option on your endpoints; this makes it clearer where things are stored in the job store. Since we were using an Oracle database, we needed the following dependencies:

    <dependency>
        <groupId>com.oracle</groupId>
        <artifactId>ojdbc6</artifactId>
    </dependency>
    <dependency>
        <groupId>org.apache.camel</groupId>
        <artifactId>camel-quartz2</artifactId>
    </dependency>

    A sample route looks like the following:

    from("quartz2://mySampleCronJob?cron=0+0+16+*+*+?&trigger.timeZone=America/Chicago&job.name=myCronJob")
        .log(LoggingLevel.INFO, "Cron job kicked off");
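
    For reference, here is a minimal sketch of how such a route might sit inside a RouteBuilder class. The class and route names are hypothetical; the endpoint itself mirrors the one above, firing at 4:00 p.m. America/Chicago time.

    import org.apache.camel.LoggingLevel;
    import org.apache.camel.builder.RouteBuilder;

    public class SampleCronRoute extends RouteBuilder {

        @Override
        public void configure() throws Exception {
            // quartz2 consumer endpoint: the job.name option controls how this job
            // is identified in the Quartz job store tables.
            from("quartz2://mySampleCronJob"
                    + "?cron=0+0+16+*+*+?"
                    + "&trigger.timeZone=America/Chicago"
                    + "&job.name=myCronJob")
                .routeId("myCronJobRoute")
                .log(LoggingLevel.INFO, "Cron job kicked off");
        }
    }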

    Step 2

    The next step in creating the job store is creating the tables. You will need to create qrtz_job_details, qrtz_triggers, qrtz_simple_triggers, qrtz_cron_triggers, qrtz_simprop_triggers, qrtz_blob_triggers, qrtz_calendars, qrtz_paused_trigger_grps, qrtz_fired_triggers, qrtz_scheduler_state, and qrtz_locks, in addition to some indexes. The script for your particular database can be found at http://www.quartz-scheduler.org/downloads/ inside the quartz-2.2.3/docs/dbTables folder.
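
    As a quick sanity check (not part of the original setup), a small JDBC snippet along these lines can confirm that the script actually created the Quartz tables in your Oracle schema; the connection details are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class QuartzTableCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details -- substitute your environment's values.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/MYDB", "quartz_user", "secret");
                 PreparedStatement ps = conn.prepareStatement(
                     "SELECT table_name FROM user_tables WHERE table_name LIKE 'QRTZ_%' ORDER BY table_name");
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // Expect to see all eleven QRTZ_ tables listed above.
                    System.out.println("Found Quartz table: " + rs.getString("table_name"));
                }
            }
        }
    }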

    Step 3

    Finally, you need to configure your Quartz properties. If you plan to run in only one environment, feel free to use a properties file named quartz.properties. In our case, however, we had multiple environments to account for, and in turn a different database in each. With Spring, you can configure a bean for your Quartz Camel component with all the properties needed, similar to the example below. Note that SPRING_ACTIVE_PROFILE is a system property that can be set on your EAP instance. From there, the code will look at the corresponding property file, such as uat.properties. This file should then contain the database configuration values db.url, db.user, and db.password.

    import java.util.Properties;

    import org.apache.camel.component.quartz2.QuartzComponent;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.context.annotation.PropertySource;
    import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;

    @Configuration
    @PropertySource(value={"classpath:${SPRING_ACTIVE_PROFILE}.properties"})
    public class SampleConfig{
    
        @Value("${db.url}")
        private String dbUrl;
        
        @Value("${db.user}")
        private String dbUser;
    
        @Value("${db.password}")
        private String dbPassword;
    
        public SampleConfig(){}
    
        @Bean
        public static PropertySourcesPlaceholderConfigurer propertyPlaceholderConfigurer() {
             return new PropertySourcesPlaceholderConfigurer();
        }
      
        @Bean
        public QuartzComponent quartz2(){
            QuartzComponent qc = new QuartzComponent();
            Properties p = new Properties();
            p.put("org.quartz.scheduler.instanceName", "myScheduler");
            p.put("org.quartz.scheduler.instanceId", "AUTO");
            p.put("org.quartz.threadPool.class", "org.quartz.simpl.SimpleThreadPool");
            p.put("org.quartz.threadPool.threadCount", "25");
            p.put("org.quartz.threadPool.threadPriority", "5");
            p.put("org.quartz.jobStore.misfireThreshold", "60000");
            p.put("org.quartz.jobStore.class", "org.quartz.impl.jdbcjobstore.JobStoreTX");
            p.put("org.quartz.jobStore.driverDelegateClass", "org.quartz.impl.jdbcjobstore.oracle.OracleDelegate");
            p.put("org.quartz.jobStore.useProperties", "false");
            p.put("org.quartz.jobStore.dataSource", "myDS");
            p.put("org.quartz.jobStore.tablePrefix", "QRTZ_");
            p.put("org.quartz.jobStore.isClustered", "true");
            p.put("org.quartz.jobStore.clusterCheckinInterval", "20000");
            p.put("org.quartz.dataSource.myDS.driver", "oracle.jdbc.OracleDriver");
            p.put("org.quartz.dataSource.myDS.maxConnections", "5");
            p.put("org.quartz.dataSource.myDS.validationQuery", "select 0 from dual");
            //ENV specific
            p.put("org.quartz.dataSource.myDS.URL", dbUrl);
            p.put("org.quartz.dataSource.myDS.user", dbUser);
            p.put("org.quartz.dataSource.myDS.password", dbPassword);
            qc.setProperties(p);
            return qc;
        }
    }
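
    The environment-specific file referenced through SPRING_ACTIVE_PROFILE only needs the three database values. As a sketch, a hypothetical uat.properties might look like the following (all values are placeholders); on EAP, the profile itself can be selected by passing -DSPRING_ACTIVE_PROFILE=uat to the JVM as a system property.

    # uat.properties -- placeholder values, replace with your environment's settings
    db.url=jdbc:oracle:thin:@//uat-dbhost:1521/UATDB
    db.user=quartz_user
    db.password=secret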

    Tips and Tricks

    • The first time you deploy to two servers in the same environment, you will see that tables such as qrtz_job_details and qrtz_cron_triggers are populated with one row for each job/route, regardless of having deployed twice. Upon redeployment of the same version, a row may be replaced if you changed something. Upon deployment of a new version, however, a new row will be added for each job/route. Do not worry: Quartz keeps track of which row is the most recent version and will still kick off each job only once.
    • Start by ensuring a single local environment works and populates your job store before moving on to clustered environments.
    • Configure your Camel Quartz component directly to ensure it is getting the correct properties, whether through a quartz.properties file or through java.util.Properties.
    • Clear out the following in EAP before deploying this configuration for the first time: jboss-eap/standalone/tmp, jboss-eap/standalone/data, and jboss-eap/standalone/logs, as well as the deployments section of standalone.xml (or the corresponding folders and configuration files for whatever configuration you are using).

    Click here to quickly get started with the JBoss EAP download.

    Last updated: May 31, 2024
