Red Hat Fuse

For some integration projects it is helpful to persist application logs in a relational database instead of a plain log file. In this article, I show how to configure a JDBC appender for Red Hat Fuse 7 running in an Apache Karaf environment. Instructions already exist for persisting log messages with PostgreSQL; here, I show how to set up a JDBC appender for Oracle Database 11g instead.

I tested this procedure with Oracle Database 11g Express Edition. The major differences from the PostgreSQL setup are the table creation syntax and the fact that Oracle Database 11g requires a sequence and a trigger to auto-generate the primary key. Users of Oracle Database 11g should therefore find this article useful.

1. Create an org.ops4j.datasource-oracleds.cfg file in the FUSE_HOME/etc folder with the following content:

databaseName=xe
user=SYSTEM
password=administrator
dataSourceName=oracleds
url=jdbc:oracle:thin:@192.168.1.10:1521/xe
osgi.jdbc.driver.class=oracle.jdbc.OracleDriver

2. Install the Oracle driver from the Karaf terminal using the following command:

bundle:install -s wrap:mvn:com.oracle/ojdbc6/11.2.0.3

3. Install these features:

feature:install jdbc jndi pax-jdbc-oracle
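To confirm the features were installed, you can filter the installed feature list from the Karaf shell (feature:list and grep are standard Karaf console commands):

```shell
# Show only installed features; filter for the jdbc-related ones
karaf@root()> feature:list -i | grep jdbc
```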

4. Check service:list for the DataSourceFactory object:

karaf@root()> service:list DataSourceFactory
[org.osgi.service.jdbc.DataSourceFactory]
-----------------------------------------
 osgi.jdbc.driver.class = oracle.jdbc.OracleDriver
 osgi.jdbc.driver.name = wrap_mvn_com.oracle_ojdbc6_11.2.0.3
 osgi.jdbc.driver.version = 0.0.0
 service.bundleid = 223
 service.id = 259
 service.scope = singleton
Provided by : 
 wrap_mvn_com.oracle_ojdbc6_11.2.0.3 (223)
Used by: 
 OPS4J Pax JDBC Config (239)

[org.osgi.service.jdbc.DataSourceFactory]
-----------------------------------------
 osgi.jdbc.driver.class = oracle.jdbc.OracleDriver
 osgi.jdbc.driver.name = oracle
 service.bundleid = 237
 service.id = 246
 service.scope = singleton
Provided by : 
 OPS4J Pax JDBC Oracle Driver Adapter (237)
Used by: 
 OPS4J Pax JDBC Config (239)

5. Check service:list for the DataSource object:

 karaf@root()> service:list DataSource
[javax.sql.DataSource]
----------------------
 databaseName = xe
 dataSourceName = oracleds
 felix.fileinstall.filename = file:/home/cpandey/NotBackedUp/Development/RedHat_Fuse_Folder/fuse7_2/fuse-karaf-7.2.0.fuse-720035-redhat-00001/etc/org.ops4j.datasource-oracleds.cfg
 osgi.jdbc.driver.class = oracle.jdbc.OracleDriver
 osgi.jndi.service.name = oracleds
 password = administrator
 pax.jdbc.managed = true
 service.bundleid = 239
 service.factoryPid = org.ops4j.datasource
 service.id = 261
 service.pid = org.ops4j.datasource.a6fadb91-16e5-43bb-8990-bbbaa553be8d
 service.scope = singleton
 url = jdbc:oracle:thin:@192.168.1.10:1521/xe
 user = SYSTEM
Provided by : 
 OPS4J Pax JDBC Config (239)
Used by: 
 OPS4J Pax Logging - Log4j v2 (8)
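At this point it is worth confirming that the data source can actually reach the database. One way is a throwaway query from the Karaf shell (a quick check; jdbc:query is provided by the jdbc feature installed earlier):

```shell
# A successful result proves the connection details in the .cfg file are correct
karaf@root()> jdbc:query oracleds 'select 1 from dual'
```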

6. Create the event table:

CREATE TABLE "SYSTEM"."EVENTS" (
    "EVENT_ID" NUMBER(10,0) NOT NULL ENABLE,
    "EVENT_DATE" TIMESTAMP (0),
    "EVENT_LEVEL" VARCHAR2(5),
    "EVENT_SOURCE" VARCHAR2(128),
    "EVENT_THREAD_ID" VARCHAR2(128),
    "EVENT_MESSAGE" VARCHAR2(1024),
    CONSTRAINT "EVENT_ID_PK" PRIMARY KEY ("EVENT_ID")
);

The table create statement can also be executed from the Karaf terminal:

karaf@root()> jdbc:execute oracleds 'CREATE TABLE "EVENTS" ("EVENT_ID" NUMBER(10,0) NOT NULL ENABLE, "EVENT_DATE" TIMESTAMP (0), "EVENT_LEVEL" VARCHAR2(5), "EVENT_SOURCE" VARCHAR2(128), "EVENT_THREAD_ID" VARCHAR2(128), "EVENT_MESSAGE" VARCHAR2(1024), CONSTRAINT "EVENT_ID_PK" PRIMARY KEY ("EVENT_ID"))'

I find this approach very useful because it also verifies that the data source is configured correctly.

7. In the database, create a sequence and a trigger to auto-generate EVENT_ID values.

Sequence:

create sequence events_seq start with 1 increment by 1;

Trigger:

create or replace trigger events_seq_tr
  before insert on events for each row
  when (new.event_id is null)
begin
  select events_seq.nextval into :new.event_id from dual;
end;

Note the terminating semicolon in the trigger.
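To verify that the sequence and trigger work before wiring up the appender, you can insert a row without an EVENT_ID and check that one is generated. This is an optional smoke test run through the same jdbc commands (the test message text is arbitrary):

```shell
# EVENT_ID is omitted on purpose; the trigger should populate it from events_seq
karaf@root()> jdbc:execute oracleds "insert into events (event_date, event_level, event_message) values (current_timestamp, 'INFO', 'trigger smoke test')"
karaf@root()> jdbc:query oracleds 'select event_id, event_message from events'
```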

8. In FUSE_HOME/etc/org.ops4j.pax.logging.cfg, add the following (keep the existing appender references on the root logger):

log4j2.rootLogger.appenderRef.JdbcAppender.ref = JdbcAppender

log4j2.appender.jdbc.type = JDBC
log4j2.appender.jdbc.name = JdbcAppender
log4j2.appender.jdbc.tableName = EVENTS
log4j2.appender.jdbc.cs.type = DataSource
log4j2.appender.jdbc.cs.jndiName = osgi:service/oracleds
log4j2.appender.jdbc.c1.type = Column
log4j2.appender.jdbc.c1.name = EVENT_DATE
log4j2.appender.jdbc.c1.isEventTimestamp = true
log4j2.appender.jdbc.c2.type = Column
log4j2.appender.jdbc.c2.name = EVENT_LEVEL
log4j2.appender.jdbc.c2.pattern = %level
# isUnicode = false makes the appender call setString instead of setNString
log4j2.appender.jdbc.c2.isUnicode = false
log4j2.appender.jdbc.c3.type = Column
log4j2.appender.jdbc.c3.name = EVENT_SOURCE
log4j2.appender.jdbc.c3.pattern = %logger
log4j2.appender.jdbc.c3.isUnicode = false
log4j2.appender.jdbc.c4.type = Column
log4j2.appender.jdbc.c4.name = EVENT_THREAD_ID
log4j2.appender.jdbc.c4.pattern = %thread
log4j2.appender.jdbc.c4.isUnicode = false
log4j2.appender.jdbc.c5.type = Column
log4j2.appender.jdbc.c5.name = EVENT_MESSAGE
log4j2.appender.jdbc.c5.pattern = %message
log4j2.appender.jdbc.c5.isUnicode = false
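Note that the cs.jndiName value uses Karaf's osgi:service JNDI scheme, which resolves the data source registered in step 1 by its osgi.jndi.service.name. You can confirm the name is resolvable from the shell (jndi:names comes with the jndi feature installed earlier):

```shell
# The oracleds entry should appear under osgi:service
karaf@root()> jndi:names
```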

9. Test using this command:

karaf@root()> log:log "hey test 123"

10. You should see log entries like the following in the Red Hat Fuse log:

2019-01-04 19:26:46,098 | INFO  | pipe-log:set INFO    | o.a.k.l.core                     | 135 - org.apache.karaf.log.core - 4.2.0.fuse-720061-redhat-00001 | hey csp 123
2019-01-04 19:29:18,273 | INFO  | g:log "hey test 123" | o.a.k.l.core                     | 135 - org.apache.karaf.log.core - 4.2.0.fuse-720061-redhat-00001 | hey test 123

11. In the database, run select * from events; to confirm the rows were written:

EVENT_ID  EVENT_DATE           EVENT_LEVEL  EVENT_SOURCE               EVENT_THREAD_ID              EVENT_MESSAGE
199       2019-01-04 19:25:42  INFO         org.apache.karaf.log.core  pipe-log:set INFO            hey csp
200       2019-01-04 19:26:08  INFO         org.apache.karaf.log.core  pipe-log:set INFO            hey csp123
201       2019-01-04 19:26:46  INFO         org.apache.karaf.log.core  pipe-log:set INFO            hey csp 123
202       2019-01-04 19:29:18  INFO         org.apache.karaf.log.core  pipe-log:log "hey test 123"  hey test 123

That's it. I hope this article helps you not only configure a JDBC appender for persisting logs in Oracle Database 11g, but also configure a data source with Red Hat Fuse 7 in a Karaf environment, and gain some hands-on experience with Karaf commands along the way.