Red Hat Integration

Understanding Red Hat AMQ Streams components for OpenShift and Kubernetes: Part 3

In the previous articles in this series, we first covered the basics of Red Hat AMQ Streams on OpenShift and then showed how to set up Kafka Connect, a Kafka Bridge, and Kafka Mirror Maker. Here are a few key points to keep in mind before we proceed:

  • AMQ Streams is based on Apache Kafka (a minimal client sketch follows this list).
  • AMQ Streams for the Red Hat OpenShift Container Platform is based on the Strimzi project.
  • AMQ Streams on containers has multiple components, such as the Cluster Operator, Entity Operator, Mirror Maker, Kafka Connect, and Kafka Bridge.
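
Because AMQ Streams is Apache Kafka under the hood, standard Kafka client applications work with it unchanged. The following minimal Java producer is a sketch rather than code from this series: the bootstrap address assumes the Cluster Operator's default <cluster-name>-kafka-bootstrap service, and the topic name my-topic is hypothetical.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class HelloAmqStreams {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Assumption: the Cluster Operator exposes a "my-cluster-kafka-bootstrap"
            // service inside the OpenShift project hosting the Kafka cluster.
            props.put("bootstrap.servers", "my-cluster-kafka-bootstrap:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Send a single record to a hypothetical topic named "my-topic".
                producer.send(new ProducerRecord<>("my-topic", "hello", "AMQ Streams"));
            }
        }
    }

Run this from a pod in the same OpenShift project so the bootstrap service name resolves; clients outside the cluster would instead connect through the route or load balancer configured for external listeners.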

Now that we have everything set up (or so we think), let’s look at monitoring and alerting for our new environment.

Continue reading “Understanding Red Hat AMQ Streams components for OpenShift and Kubernetes: Part 3”

Understanding Red Hat AMQ Streams components for OpenShift and Kubernetes: Part 1

Red Hat AMQ Streams is an enterprise-grade Apache Kafka (event streaming) solution, which enables systems to exchange data at high throughput and low latency. AMQ Streams is available as part of the Red Hat AMQ offering in two different flavors: one on the Red Hat Enterprise Linux platform and another on the OpenShift Container Platform. In this three-part article series, we will cover AMQ Streams on the OpenShift Container Platform.

To get the most out of these articles, it will help to be familiar with messaging concepts, Red Hat OpenShift, and Kubernetes.

Continue reading “Understanding Red Hat AMQ Streams components for OpenShift and Kubernetes: Part 1”

APIs as a Product: Get the value out of your APIs

APIs continue to spread, as seen in this 2019 report from ProgrammableWeb, which shows a 30% increase over last year’s growth rate. More regulations are enforcing the use of APIs to open up companies and foster innovation. Think of the Payment Services Directive version two (PSD2), open banking, and the public sector releasing open data APIs. With such an abundance of APIs, it becomes increasingly crucial to get the value out of your APIs and differentiate yourself from the growing competition. It’s time to design and manage your APIs as a Product.

Continue reading “APIs as a Product: Get the value out of your APIs”

Red Hat simplifies transition to open source Kafka with new service registry and HTTP bridge

Red Hat continues to increase the features available for users looking to implement a 100% open source, event-driven architecture (EDA) through running Apache Kafka on Red Hat OpenShift and Red Hat Enterprise Linux. The Red Hat Integration Q4 release provides new features and capabilities, including ones aimed at simplifying usage and deployment of the AMQ Streams distribution of Apache Kafka.
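
As a rough illustration of what the new HTTP bridge enables, the sketch below produces a record to a Kafka topic over plain HTTP from a Java 11 client. The bridge service name and port, the topic, the payload, and the vnd.kafka.json.v2+json content type are assumptions drawn from the upstream Strimzi bridge, not details stated in this announcement.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class BridgeProduceSketch {
        public static void main(String[] args) throws Exception {
            // Assumption: the bridge is reachable in-cluster at my-bridge-service:8080
            // and the target topic is named "orders".
            String body = "{\"records\":[{\"key\":\"order-1\",\"value\":{\"amount\":42}}]}";
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://my-bridge-service:8080/topics/orders"))
                    .header("Content-Type", "application/vnd.kafka.json.v2+json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            // The bridge's response reports where each accepted record was written.
            System.out.println(response.statusCode() + " " + response.body());
        }
    }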

Continue reading “Red Hat simplifies transition to open source Kafka with new service registry and HTTP bridge”

A look at development environments with specific tooling for Apache Camel Language

A growing set of editors and IDEs provides specific tooling for developing applications based on Apache Camel. Historically, there was only Eclipse Fuse Tooling, which was based on the Eclipse desktop IDE. Then, an IntelliJ plugin was created. Both of these tools are tightly coupled to their specific IDE APIs, with the drawback that the development effort cannot easily be shared between them.

Supported editors and IDEs

Thanks to the Language Server Protocol, a single core server plus a small amount of per-client configuration or development is enough for Apache Camel language support to be enjoyed in a growing set of environments:

Continue reading “A look at development environments with specific tooling for Apache Camel Language”

CDC pipeline with Red Hat AMQ Streams and Red Hat Fuse

Change Data Capture (CDC) is a pattern that enables database changes to be monitored and propagated to downstream systems. It is an effective way of enabling reliable microservices integration and solving typical challenges, such as gradually extracting microservices from existing monoliths.

With the release of Red Hat AMQ Streams 1.2, Red Hat Integration now includes a developer preview of CDC features based on the upstream Debezium project.

This article explains how to use Red Hat Integration to create a complete CDC pipeline. The idea is to enable applications to respond almost immediately whenever there is a data change. We capture the changes as they occur using Debezium and stream them using Red Hat AMQ Streams. We then filter and transform the data using Red Hat Fuse and send it to Elasticsearch, where it can be further analyzed or used by downstream systems.
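
As a rough sketch of the filter-and-transform step described above, the Camel route below (Red Hat Fuse is based on Apache Camel) consumes Debezium change events from an AMQ Streams topic, drops tombstone records, extracts the row’s new state from the change-event envelope, and indexes it in Elasticsearch over its REST API. The topic name, bootstrap address, Elasticsearch URL, and index name are illustrative assumptions, not values from the article, and the route assumes the JSON converter is configured without schemas.

    import org.apache.camel.Exchange;
    import org.apache.camel.builder.RouteBuilder;
    import org.apache.camel.model.dataformat.JsonLibrary;

    public class CdcToElasticsearchRoute extends RouteBuilder {
        @Override
        public void configure() {
            // Assumption: Debezium writes customer changes to this topic on the
            // AMQ Streams cluster's bootstrap service.
            from("kafka:dbserver1.inventory.customers?brokers=my-cluster-kafka-bootstrap:9092")
                // Deletes are followed by a null-valued tombstone record; skip those.
                .filter(body().isNotNull())
                // Keep only the new row state ("after") from the Debezium envelope.
                .transform().jsonpath("$.after")
                .marshal().json(JsonLibrary.Jackson)
                // Index the document via Elasticsearch's REST API.
                .setHeader(Exchange.HTTP_METHOD, constant("POST"))
                .setHeader(Exchange.CONTENT_TYPE, constant("application/json"))
                .to("http://elasticsearch:9200/customers/_doc");
        }
    }

The route relies on Camel’s Kafka, JSONPath, Jackson, and HTTP components being available on the classpath.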

Continue reading “CDC pipeline with Red Hat AMQ Streams and Red Hat Fuse”

Using the 3scale toolbox Jenkins Shared Library

In the previous article in this series, Deploy your API from a Jenkins Pipeline, we discovered how the 3scale toolbox can help you deploy your API from a Jenkins pipeline on Red Hat OpenShift/Kubernetes. In this article, we will improve that pipeline to make it more robust and less verbose, and to offer more features, by using the 3scale toolbox Jenkins Shared Library.

Continue reading “Using the 3scale toolbox Jenkins Shared Library”

Deploy your API from a Jenkins Pipeline

In a previous article, 5 principles for deploying your API from a CI/CD pipeline, we covered the main steps required to deploy your API from a CI/CD pipeline, which can prove to be a tremendous amount of work. Fortunately, the latest release of Red Hat Integration greatly improves this situation by adding new capabilities to the 3scale CLI. In 3scale toolbox: Deploy an API from the CLI, we discovered how the 3scale toolbox strives to automate the delivery of APIs. In this article, we will discuss how the 3scale toolbox can help you deploy your API from a Jenkins pipeline on Red Hat OpenShift/Kubernetes.

Continue reading “Deploy your API from a Jenkins Pipeline”

3scale toolbox: Deploy an API from the CLI

Deploying your API from a CI/CD pipeline can be a tremendous amount of work. The latest release of Red Hat Integration greatly improves this situation by adding new capabilities to the 3scale CLI. The 3scale CLI is named 3scale toolbox, and it strives to help API administrators operate their services as well as automate the delivery of their APIs through continuous delivery pipelines.

Having a standard CLI is a great advantage for our customers since they can use it in the CI/CD solution of their choice (Jenkins, GitLab CI, Ansible, Tekton, etc.). It is also a means for Red Hat to capture customer needs as much as possible and offer the same feature set to all our customers.

Continue reading “3scale toolbox: Deploy an API from the CLI”
