
How to configure a JDBC Appender for Red Hat Fuse 7 with Karaf

January 23, 2019
Chandra Shekhar Pandey
Related products: Streams for Apache Kafka, Red Hat Fuse

    For some integration projects, it is helpful to persist application logs in a relational database instead of a plain log file. In this article, I show how to configure a JDBC appender for Red Hat Fuse 7 running in an Apache Karaf environment. Existing instructions describe how to persist messages with PostgreSQL; instead, I will show how to set up a JDBC appender for Oracle Database 11g.

    I have tested this procedure with Oracle Database 11g Express Edition. The major differences I found were the table-creation syntax and the fact that a sequence and a trigger were required to auto-generate the primary key. Hence, users of Oracle Database 11g should find this article useful.

    1. Create an org.ops4j.datasource-oracleds.cfg file in the FUSE_HOME/etc folder with the following content:

    databaseName=xe
    user=SYSTEM
    password=administrator
    dataSourceName=oracleds
    url=jdbc:oracle:thin:@192.168.1.10:1521/xe
    osgi.jdbc.driver.class=oracle.jdbc.OracleDriver
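
    The url property follows Oracle's thin-driver format, jdbc:oracle:thin:@host:port/service. As a quick sanity check before restarting Fuse, here is a short sketch (Python; the host, port, service name, and credentials are the placeholder values from the file above, not real ones) that parses this properties-style file and validates the URL shape:

```python
import re

# Properties-style content of org.ops4j.datasource-oracleds.cfg
# (values are the placeholders used in this article, not real credentials).
cfg_text = """\
databaseName=xe
user=SYSTEM
password=administrator
dataSourceName=oracleds
url=jdbc:oracle:thin:@192.168.1.10:1521/xe
osgi.jdbc.driver.class=oracle.jdbc.OracleDriver
"""

def parse_properties(text):
    """Parse simple key=value lines into a dict, skipping blanks and comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

# Oracle thin-driver URL: jdbc:oracle:thin:@<host>:<port>/<service>
THIN_URL = re.compile(r"^jdbc:oracle:thin:@([\w.\-]+):(\d+)/(\w+)$")

props = parse_properties(cfg_text)
match = THIN_URL.match(props["url"])
assert match, "url does not look like a thin-driver URL"
host, port, service = match.groups()
```

    A malformed url here is a common reason the DataSource never appears in step 5, so checking the shape up front can save a restart cycle.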

    2. Install the Oracle driver from the Karaf terminal using the following command:

     osgi:install -s wrap:mvn:com.oracle/ojdbc6/11.2.0.3
    

    3. Install these features:

    feature:install jdbc jndi pax-jdbc-oracle
    

    4. Check service:list for the DataSourceFactory object:

    karaf@root()> service:list DataSourceFactory
    [org.osgi.service.jdbc.DataSourceFactory]
    -----------------------------------------
     osgi.jdbc.driver.class = oracle.jdbc.OracleDriver
     osgi.jdbc.driver.name = wrap_mvn_com.oracle_ojdbc6_11.2.0.3
     osgi.jdbc.driver.version = 0.0.0
     service.bundleid = 223
     service.id = 259
     service.scope = singleton
    Provided by : 
     wrap_mvn_com.oracle_ojdbc6_11.2.0.3 (223)
    Used by: 
     OPS4J Pax JDBC Config (239)
    
    [org.osgi.service.jdbc.DataSourceFactory]
    -----------------------------------------
     osgi.jdbc.driver.class = oracle.jdbc.OracleDriver
     osgi.jdbc.driver.name = oracle
     service.bundleid = 237
     service.id = 246
     service.scope = singleton
    Provided by : 
     OPS4J Pax JDBC Oracle Driver Adapter (237)
    Used by: 
     OPS4J Pax JDBC Config (239)
    

    5. Check service:list for the DataSource object:

     karaf@root()> service:list DataSource
    [javax.sql.DataSource]
    ----------------------
     databaseName = xe
     dataSourceName = oracleds
     felix.fileinstall.filename = file:/home/cpandey/NotBackedUp/Development/RedHat_Fuse_Folder/fuse7_2/fuse-karaf-7.2.0.fuse-720035-redhat-00001/etc/org.ops4j.datasource-oracleds.cfg
     osgi.jdbc.driver.class = oracle.jdbc.OracleDriver
     osgi.jndi.service.name = oracleds
     password = administrator
     pax.jdbc.managed = true
     service.bundleid = 239
     service.factoryPid = org.ops4j.datasource
     service.id = 261
     service.pid = org.ops4j.datasource.a6fadb91-16e5-43bb-8990-bbbaa553be8d
     service.scope = singleton
     url = jdbc:oracle:thin:@192.168.1.10:1521/xe
     user = SYSTEM
    Provided by : 
     OPS4J Pax JDBC Config (239)
    Used by: 
     OPS4J Pax Logging - Log4j v2 (8)
    

    6. Create the event table:

    CREATE TABLE "SYSTEM"."EVENTS" (
        "EVENT_ID" NUMBER(10,0) NOT NULL ENABLE,
        "EVENT_DATE" TIMESTAMP (0),
        "EVENT_LEVEL" VARCHAR2(5),
        "EVENT_SOURCE" VARCHAR2(128),
        "EVENT_THREAD_ID" VARCHAR2(128),
        "EVENT_MESSAGE" VARCHAR2(1024),
        CONSTRAINT "EVENT_ID_PK" PRIMARY KEY ("EVENT_ID"));
    

    The CREATE TABLE statement can also be executed from the Karaf terminal:

    karaf@root()> jdbc:execute oracleds 'CREATE TABLE "EVENTS" ("EVENT_ID" NUMBER(10,0) NOT NULL ENABLE, "EVENT_DATE" TIMESTAMP (0), "EVENT_LEVEL" VARCHAR2(5), "EVENT_SOURCE" VARCHAR2(128), "EVENT_THREAD_ID" VARCHAR2(128), "EVENT_MESSAGE" VARCHAR2(1024), CONSTRAINT "EVENT_ID_PK" PRIMARY KEY ("EVENT_ID"))'
    

    I find doing it this way very useful, because it also verifies that the data source is configured correctly.

    7. In the database, create a sequence and a trigger.

    Sequence:

    create sequence events_seq start with 1 increment by 1;
    

    Trigger:

    create or replace trigger events_seq_tr
     before insert on events for each row
     when (new.event_id is null)
    begin
     select events_seq.nextval into :new.event_id from dual;
    END;
    

    Note the terminating semicolon in the trigger.
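
    Oracle Database 11g predates identity columns, which is why the sequence-plus-trigger pair is needed: together they auto-assign EVENT_ID whenever a row is inserted without one. Here is a minimal sketch of the same end behavior, using SQLite as a stand-in for Oracle (SQLite auto-assigns an INTEGER PRIMARY KEY when NULL is inserted, so no trigger is needed there):

```python
import sqlite3

# In-memory database standing in for the Oracle schema in this article.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        event_id      INTEGER PRIMARY KEY,  -- auto-assigned, like the Oracle sequence+trigger
        event_date    TEXT,
        event_level   TEXT,
        event_source  TEXT,
        event_message TEXT
    )
""")

# Insert with event_id left NULL, exactly as the Log4j2 JDBC appender does:
# the primary key is generated for us.
conn.execute(
    "INSERT INTO events (event_id, event_level, event_message) VALUES (NULL, ?, ?)",
    ("INFO", "hey test 123"),
)
conn.execute(
    "INSERT INTO events (event_id, event_level, event_message) VALUES (NULL, ?, ?)",
    ("INFO", "second event"),
)

ids = [row[0] for row in conn.execute("SELECT event_id FROM events ORDER BY event_id")]
```

    Without the trigger, the appender's inserts would fail on Oracle with a NOT NULL violation on EVENT_ID, since the appender never supplies that column itself.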

    8. In FUSE_HOME/etc/org.ops4j.pax.logging.cfg, set the following:

    log4j2.rootLogger.appenderRef.JdbcAppender.ref = JdbcAppender
    
    log4j2.appender.jdbc.type = JDBC
    log4j2.appender.jdbc.name = JdbcAppender
    log4j2.appender.jdbc.tableName = EVENTS
    log4j2.appender.jdbc.cs.type = DataSource
    log4j2.appender.jdbc.cs.jndiName = osgi:service/oracleds
    log4j2.appender.jdbc.c1.type = Column
    log4j2.appender.jdbc.c1.name = EVENT_DATE
    log4j2.appender.jdbc.c1.isEventTimestamp = true
    log4j2.appender.jdbc.c2.type = Column
    log4j2.appender.jdbc.c2.name = EVENT_LEVEL
    log4j2.appender.jdbc.c2.pattern = %level
    # setNString vs setString
    log4j2.appender.jdbc.c2.isUnicode = false
    log4j2.appender.jdbc.c3.type = Column
    log4j2.appender.jdbc.c3.name = EVENT_SOURCE
    log4j2.appender.jdbc.c3.pattern = %logger
    log4j2.appender.jdbc.c3.isUnicode = false
    log4j2.appender.jdbc.c4.type = Column
    log4j2.appender.jdbc.c4.name = EVENT_THREAD_ID
    log4j2.appender.jdbc.c4.pattern = %thread
    log4j2.appender.jdbc.c4.isUnicode = false
    log4j2.appender.jdbc.c5.type = Column
    log4j2.appender.jdbc.c5.name = EVENT_MESSAGE
    log4j2.appender.jdbc.c5.pattern = %message
    log4j2.appender.jdbc.c5.isUnicode = false
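
    Each Column entry maps one table column to either the event timestamp (isEventTimestamp) or a Log4j2 layout pattern. Conceptually, the appender turns this configuration into one parameterized INSERT per log event. A sketch of that idea (Python; the column list mirrors the configuration above, and the generated SQL illustrates the concept rather than the exact statement Log4j2 builds internally):

```python
# Column definitions mirroring the Log4j2 JDBC appender configuration above:
# (column name, value source), where the source is the event timestamp
# or a Log4j2 layout pattern.
columns = [
    ("EVENT_DATE", "<event timestamp>"),
    ("EVENT_LEVEL", "%level"),
    ("EVENT_SOURCE", "%logger"),
    ("EVENT_THREAD_ID", "%thread"),
    ("EVENT_MESSAGE", "%message"),
]

def build_insert(table, cols):
    """Build a parameterized INSERT like the one the appender issues per event."""
    names = ", ".join(name for name, _ in cols)
    placeholders = ", ".join("?" for _ in cols)
    return f"INSERT INTO {table} ({names}) VALUES ({placeholders})"

sql = build_insert("EVENTS", columns)
```

    Note that EVENT_ID is absent from the column list, which is exactly why the sequence and trigger from step 7 are required to fill it in.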
    

    9. Test using this command:

    karaf@root()> log:log "hey test 123"
    

    10. In the Red Hat Fuse log, you should see entries like these:

    2019-01-04 19:26:46,098 | INFO  | pipe-log:set INFO    | o.a.k.l.core                     | 135 - org.apache.karaf.log.core - 4.2.0.fuse-720061-redhat-00001 | hey csp 123
    2019-01-04 19:29:18,273 | INFO  | g:log "hey test 123" | o.a.k.l.core                     | 135 - org.apache.karaf.log.core - 4.2.0.fuse-720061-redhat-00001 | hey test 123
    

    11. In the database, execute select * from events:

    EVENT_ID  EVENT_DATE           EVENT_LEVEL  EVENT_SOURCE               EVENT_THREAD_ID              EVENT_MESSAGE
    199       2019-01-04 19:25:42  INFO         org.apache.karaf.log.core  pipe-log:set INFO            hey csp
    200       2019-01-04 19:26:08  INFO         org.apache.karaf.log.core  pipe-log:set INFO            hey csp123
    201       2019-01-04 19:26:46  INFO         org.apache.karaf.log.core  pipe-log:set INFO            hey csp 123
    202       2019-01-04 19:29:18  INFO         org.apache.karaf.log.core  pipe-log:log "hey test 123"  hey test 123
    

    That's it. I hope this article helps you configure a JDBC appender for persisting logs in Oracle Database 11g, configure a data source with Red Hat Fuse 7 in a Karaf environment, and gain some hands-on experience with Karaf commands.
