Red Hat AMQ

How to integrate a remote Red Hat AMQ 7 cluster on Red Hat JBoss EAP 7

It is very common in an integration landscape to have different components connected through a messaging system such as Red Hat AMQ 7 (RHAMQ 7). This landscape usually includes Java EE application servers, such as Red Hat JBoss Enterprise Application Platform 7 (JBoss EAP 7), that deploy and run applications connected to the messaging system.

This article describes in detail how to integrate a remote RHAMQ 7 cluster on a JBoss EAP 7 server. It covers the different configurations and components, along with some tips to improve your message-driven bean (MDB) applications.
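
As a rough illustration of the consumer side (not taken from the article itself), an MDB on JBoss EAP 7 can be bound to a pooled connection factory that points at the remote cluster. The resource adapter name remote-artemis and the queue jms/queue/orders below are placeholders that would come from your own messaging-activemq configuration:

    import javax.ejb.ActivationConfigProperty;
    import javax.ejb.MessageDriven;
    import javax.jms.JMSException;
    import javax.jms.Message;
    import javax.jms.MessageListener;
    import javax.jms.TextMessage;

    import org.jboss.ejb3.annotation.ResourceAdapter;

    // "remote-artemis" is a hypothetical pooled-connection-factory/resource-adapter name
    // configured in the messaging-activemq subsystem to point at the remote RHAMQ 7 cluster.
    @MessageDriven(activationConfig = {
            @ActivationConfigProperty(propertyName = "destinationType", propertyValue = "javax.jms.Queue"),
            @ActivationConfigProperty(propertyName = "destination", propertyValue = "jms/queue/orders")
    })
    @ResourceAdapter("remote-artemis")
    public class OrdersMDB implements MessageListener {

        @Override
        public void onMessage(Message message) {
            try {
                if (message instanceof TextMessage) {
                    // Log the payload; real applications would do their business logic here.
                    System.out.println("Received: " + ((TextMessage) message).getText());
                }
            } catch (JMSException e) {
                throw new RuntimeException(e);
            }
        }
    }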

Continue reading “How to integrate a remote Red Hat AMQ 7 cluster on Red Hat JBoss EAP 7”

How to run Kafka on Openshift, the enterprise Kubernetes, with AMQ Streams

On October 25th, Red Hat announced the general availability of its AMQ Streams Kubernetes Operator for Apache Kafka. Red Hat AMQ Streams focuses on running Apache Kafka on OpenShift, providing a massively scalable, distributed, and high-performance data streaming platform. AMQ Streams, based on the Apache Kafka and Strimzi projects, offers a distributed backbone that allows microservices and other applications to share data with extremely high throughput. This backbone enables:

  • Publish and subscribe: Many-to-many dissemination in a fault-tolerant, durable manner.
  • Replayable events: Serves as a repository for microservices to build in-memory copies of source data, up to any point in time.
  • Long-term data retention: Efficiently stores data for immediate access, in a manner limited only by disk space.
  • Partition messages for more horizontal scalability: Allows messages to be organized for maximum concurrent access.

One of the most requested items from developers and architects is a simple deployment option to get started for testing purposes. In this guide, we will use the Red Hat Container Development Kit, based on Minishift, to start an Apache Kafka cluster on Kubernetes.
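
Once the cluster is running, any standard Kafka client can connect to it. As a minimal sanity check, a plain Java producer might look like the sketch below; the bootstrap address my-cluster-kafka-bootstrap:9092 and the topic name are assumptions, not values from the guide:

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class HelloKafka {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Assumed bootstrap address of the Strimzi/AMQ Streams managed cluster.
            props.put("bootstrap.servers", "my-cluster-kafka-bootstrap:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Send a single record to a placeholder topic and make sure it is flushed.
                producer.send(new ProducerRecord<>("my-topic", "key", "hello from AMQ Streams"));
                producer.flush();
            }
        }
    }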

Continue reading “How to run Kafka on Openshift, the enterprise Kubernetes, with AMQ Streams”

Logging incoming and outgoing messages for Red Hat AMQ 7

In this article, I will discuss how to capture incoming and outgoing messages for Red Hat AMQ 7 (RHAMQ 7). This can be useful when you need to log incoming or outgoing traffic or the messages from a broker, or during development and testing when you want to see all messages. There may also be a need to modify messages in transit. Using RHAMQ 7 interceptors, you can intercept traffic to and from the RHAMQ 7 broker, and you can also modify messages with an interceptor.
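
As a rough sketch of the idea, an interceptor for the broker's core protocol is a class implementing the Interceptor interface from the Artemis API; it is then packaged onto the broker's classpath and registered in broker.xml (for example under remoting-incoming-interceptors). This example only logs packets and is not taken from the article:

    import org.apache.activemq.artemis.api.core.ActiveMQException;
    import org.apache.activemq.artemis.api.core.Interceptor;
    import org.apache.activemq.artemis.core.protocol.core.Packet;
    import org.apache.activemq.artemis.spi.core.protocol.RemotingConnection;

    // Logs every core-protocol packet that reaches the broker.
    public class LoggingInterceptor implements Interceptor {

        @Override
        public boolean intercept(Packet packet, RemotingConnection connection) throws ActiveMQException {
            System.out.println("Packet " + packet + " from " + connection.getRemoteAddress());
            // Returning true lets the packet continue; returning false drops it.
            return true;
        }
    }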

Continue reading “Logging incoming and outgoing messages for Red Hat AMQ 7”

Welcome Apache Kafka to the Kubernetes Era!

We have pretty exciting news this week, as Red Hat is announcing the general availability of its Apache Kafka Kubernetes operator. Red Hat AMQ Streams delivers the mechanisms for managing Apache Kafka on top of OpenShift, our enterprise distribution for Kubernetes.

Everything started in May 2018, when David Ingham (@dingha) unveiled the Developer Preview as a new addition to the Red Hat AMQ offering. Red Hat AMQ Streams focuses on running Apache Kafka on OpenShift. In the microservices world, where several components need to rely on a high-throughput communication mechanism, Apache Kafka has made a name for itself as a leading real-time, distributed messaging platform for building data pipelines and streaming applications.

Continue reading “Welcome Apache Kafka to the Kubernetes Era!”

EventFlow: Event-driven microservices on OpenShift (Part 1)

This post is the first in a series of three related posts describing a lightweight, cloud-native, distributed microservices framework we have created called EventFlow. EventFlow can be used to develop streaming applications that process CloudEvents, an effort to standardize a data format for exchanging information about events generated by cloud platforms.

The EventFlow platform was created specifically to target Kubernetes/OpenShift, and it models event-processing applications as a connected flow or stream of components. These components can be developed with a simple SDK library, or they can be created as Docker images that are configured through environment variables to attach to Kafka topics and process event data directly.
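
As a hedged illustration of the second option (not the EventFlow SDK itself), a component packaged as a container image could read its Kafka connection details from environment variables and consume events directly; the variable and topic names here are hypothetical:

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    // Illustrative only: a container-image style processor that attaches to a Kafka topic
    // via environment variables. The variable names are hypothetical, not EventFlow's own.
    public class EnvConfiguredProcessor {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", System.getenv("KAFKA_BOOTSTRAP_SERVERS"));
            props.put("group.id", System.getenv().getOrDefault("GROUP_ID", "eventflow-demo"));
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList(System.getenv("INPUT_TOPIC")));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> record : records) {
                        // A real component would transform the event and forward it downstream.
                        System.out.println("Event payload: " + record.value());
                    }
                }
            }
        }
    }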

Continue reading “EventFlow: Event-driven microservices on OpenShift (Part 1)”

How to set up LDAP authentication for the Red Hat AMQ 7 message broker console

This post is a continuation of the series on Red Hat AMQ 7 security topics for developers and ops people started by Mary Cochran. We will see how to configure LDAP authentication on a Red Hat AMQ 7 broker instance. In order to do so, we will perform the following actions (a quick verification sketch follows the list):

  • Set up a simple LDAP server with a set of users and groups using Apache Directory Studio.
  • Connect Red Hat AMQ 7 to LDAP using authentication providers.
  • Enable custom LDAP authorization policies in Red Hat AMQ 7.
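
Before wiring the broker to the directory, it can help to confirm what the LDAP server actually contains. The JNDI sketch below lists users under an assumed base DN; the port, admin DN, credentials, and base DN are placeholders matching a default Apache Directory Studio setup rather than values from the article:

    import java.util.Hashtable;

    import javax.naming.Context;
    import javax.naming.NamingEnumeration;
    import javax.naming.directory.InitialDirContext;
    import javax.naming.directory.SearchControls;
    import javax.naming.directory.SearchResult;

    public class LdapLookup {
        public static void main(String[] args) throws Exception {
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldap://localhost:10389");   // assumed ApacheDS port
            env.put(Context.SECURITY_AUTHENTICATION, "simple");
            env.put(Context.SECURITY_PRINCIPAL, "uid=admin,ou=system"); // assumed admin DN
            env.put(Context.SECURITY_CREDENTIALS, "secret");            // assumed password

            InitialDirContext ctx = new InitialDirContext(env);
            SearchControls controls = new SearchControls();
            controls.setSearchScope(SearchControls.SUBTREE_SCOPE);

            // List the broker users under a hypothetical base DN.
            NamingEnumeration<SearchResult> results =
                    ctx.search("ou=users,dc=example,dc=com", "(objectClass=inetOrgPerson)", controls);
            while (results.hasMore()) {
                System.out.println(results.next().getNameInNamespace());
            }
            ctx.close();
        }
    }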

 

Continue reading “How to set up LDAP authentication for the Red Hat AMQ 7 message broker console”

Asynchronous communication between microservices using AMQP and Vert.x

Microservices are the go-to architecture in most new, modern software solutions. They are (mostly) designed to do one thing, and they must talk to each other to accomplish a business use case. All communication between microservices happens via network calls; this pattern avoids tight coupling between services and provides better separation between them.

There are basically two styles of communication: synchronous and asynchronous. These two styles applied properly are the foundation for request-reply and event-driven patterns. In the case of the request-reply pattern, a client initiates a request and typically waits synchronously for the reply. However, there are cases where the client could decide not to wait and register a callback with the other party, which is an example of the request-reply pattern in an asynchronous fashion.

In this article, I am showcasing the approach of asynchronous request-reply by having two services communicate with each other over Advanced Message Queuing Protocol (AMQP). AMQP is an open standard for passing business messages between applications or organizations. Although this article focuses on the request-reply pattern, the same code can be used to develop additional scenarios like event sourcing. Communicating using an asynchronous model can be very beneficial for implementing the aggregator pattern.

I will be using Apache Qpid Proton (or Red Hat AMQ Interconnect) as the message router and the Vert.x AMQP bridge for communication between the two services.
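
To give a flavor of the bridge API, the sketch below starts an AMQP bridge against a router assumed to listen on localhost:5672, consumes replies on one address, and sends a request on another. The addresses are placeholders, and a full request-reply flow would also carry a reply-to address:

    import io.vertx.amqpbridge.AmqpBridge;
    import io.vertx.core.Vertx;
    import io.vertx.core.eventbus.MessageConsumer;
    import io.vertx.core.eventbus.MessageProducer;
    import io.vertx.core.json.JsonObject;

    public class AmqpRequester {
        public static void main(String[] args) {
            Vertx vertx = Vertx.vertx();
            AmqpBridge bridge = AmqpBridge.create(vertx);

            // Assumed router endpoint: an AMQ Interconnect / Dispatch Router on port 5672.
            bridge.start("localhost", 5672, res -> {
                if (res.failed()) {
                    res.cause().printStackTrace();
                    return;
                }
                // Receive replies on one address and send requests on another (placeholder names).
                MessageConsumer<JsonObject> replies = bridge.createConsumer("service.replies");
                replies.handler(msg -> System.out.println("Reply: " + msg.body().getValue("body")));

                MessageProducer<JsonObject> requests = bridge.createProducer("service.requests");
                requests.send(new JsonObject().put("body", "what is the order status?"));
            });
        }
    }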

Continue reading “Asynchronous communication between microservices using AMQP and Vert.x”

How to integrate A-MQ 6.3 on Red Hat JBoss EAP 7

This article describes in detail how to integrate Red Hat A-MQ 6.3 with Red Hat JBoss Enterprise Application Platform (EAP) 7, covering the admin-object configuration and especially the pool-name attribute, whose explanation can lead to confusion. In this post, I will try to clarify many of the steps, give an overview of the components, and show how they fit together.

JBoss EAP requires the configuration of a resource adapter as the central component for integration with A-MQ 6.3. In addition, MDB configuration on EAP is required to enable the JMS consumers. On A-MQ 6.3, transport connectors must be configured to open the communication channel with EAP.
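
To make those pieces concrete, application code on EAP ends up injecting the connection factory defined by the resource adapter's connection definition and the queue exposed by the admin-object. A minimal sender sketch, with placeholder JNDI names, might look like this:

    import javax.annotation.Resource;
    import javax.ejb.Stateless;
    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.JMSException;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;

    @Stateless
    public class OrderSender {

        // JNDI names are placeholders: the connection factory comes from the A-MQ resource
        // adapter's connection-definition and the queue from its admin-object configuration.
        @Resource(lookup = "java:/jms/AmqConnectionFactory")
        private ConnectionFactory connectionFactory;

        @Resource(lookup = "java:/jms/queue/orders")
        private Queue ordersQueue;

        public void send(String text) throws JMSException {
            try (Connection connection = connectionFactory.createConnection()) {
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(ordersQueue);
                producer.send(session.createTextMessage(text));
            }
        }
    }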

All the steps required to configure EAP 7 to use A-MQ 6.3 as an external JMS broker are described here:

Continue reading “How to integrate A-MQ 6.3 on Red Hat JBoss EAP 7”

Setting up RBAC on Red Hat AMQ Broker

One thing that is common in the enterprise world, especially in highly regulated industries, is separation of duties. Role-based access control (RBAC) has built-in support for separation of duties: roles determine what operations a user can and cannot perform. This post provides an example of how to configure proper RBAC on top of Red Hat AMQ, a flexible, high-performance messaging platform based on the open source Apache ActiveMQ Artemis project.

In most cases, separation of duties on Red Hat AMQ can be divided into three primary roles:

  1. Administrator role, which will have all permissions.
  2. Application role, which will have permission to publish, consume, or produce messages to a specific address, subscribe to topics or queues, or create and delete addresses.
  3. Operation role, which will have read-only permission via the web console or supported protocols.

To implement those roles, Red Hat AMQ has several security features that need to be configured, as described in the following sections.
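
Once the roles and security settings are in place, a quick smoke test from a JMS client can confirm they behave as expected. In the sketch below, the broker URL, user names, and address are placeholders; an application-role user should be able to send, while an operation-role (read-only) user should be rejected:

    import javax.jms.Connection;
    import javax.jms.JMSSecurityException;
    import javax.jms.Session;

    import org.apache.activemq.artemis.jms.client.ActiveMQConnectionFactory;

    public class RbacSmokeTest {
        public static void main(String[] args) throws Exception {
            ActiveMQConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");

            // An application-role user should be able to produce to its address.
            try (Connection appConnection = factory.createConnection("app-user", "app-password")) {
                Session session = appConnection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                session.createProducer(session.createQueue("orders"))
                       .send(session.createTextMessage("permission check"));
                System.out.println("app-user can send to orders");
            }

            // An operation-role user is read-only, so sending should raise a security error.
            try (Connection opsConnection = factory.createConnection("ops-user", "ops-password")) {
                Session session = opsConnection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                session.createProducer(session.createQueue("orders"))
                       .send(session.createTextMessage("should fail"));
            } catch (JMSSecurityException expected) {
                System.out.println("ops-user correctly denied send permission");
            }
        }
    }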

Continue reading “Setting up RBAC on Red Hat AMQ Broker”

Smart-Meter Data Processing Using Apache Kafka on OpenShift

There is a major push in the United Kingdom to replace aging mechanical electricity meters with connected smart meters. New meters allow consumers to more closely monitor their energy usage and associated cost, and they enable the suppliers to automate the billing process because the meters automatically report fine-grained energy use.

This post describes an architecture for processing a stream of meter readings using Strimzi, which offers support for running Apache Kafka in a container environment (Red Hat OpenShift). The data was made available through a UK research project that collected readings from energy producers, distributors, and consumers from 2011 to 2014. The TC1a dataset used here contains data from 8,000 domestic customers, sampled at half-hour intervals.
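
Strimzi looks after running the Kafka cluster; the processing itself can be ordinary Kafka Streams code. The hedged sketch below sums readings per meter, assuming a recent Kafka Streams API (2.1+) and that messages arrive keyed by meter ID with a plain numeric value; the topic names and bootstrap address are placeholders:

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Grouped;
    import org.apache.kafka.streams.kstream.Materialized;
    import org.apache.kafka.streams.kstream.Produced;

    // Sums half-hourly readings per meter. Topic names, the bootstrap address,
    // and the plain numeric value format are assumptions for illustration.
    public class MeterAggregator {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "meter-aggregator");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "my-cluster-kafka-bootstrap:9092");

            StreamsBuilder builder = new StreamsBuilder();
            builder.stream("meter-readings", Consumed.with(Serdes.String(), Serdes.String()))
                   .mapValues(Double::parseDouble)
                   .groupByKey(Grouped.with(Serdes.String(), Serdes.Double()))
                   .reduce(Double::sum, Materialized.with(Serdes.String(), Serdes.Double()))
                   .toStream()
                   .to("energy-per-meter", Produced.with(Serdes.String(), Serdes.Double()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }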

Continue reading “Smart-Meter Data Processing Using Apache Kafka on OpenShift”
