Red Hat OpenShift

MySQL for developers in Red Hat OpenShift

As a software developer, you often need access to a relational database, or any type of database, for that matter. If you've ever been stuck waiting for someone in operations to provision a database for you, this article will set you free. I'll show you how to spin up (and wipe out) a MySQL database in seconds using Red Hat OpenShift.
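For a feel of what the developer side looks like once the database is up, here is a minimal Python sketch that connects to a MySQL instance running in an OpenShift project. The service host name, the credential environment variables, and the PyMySQL library are assumptions for illustration, not details from the article; substitute whatever your provisioned database actually exposes.

```python
# Minimal sketch: connect to a MySQL service running in an OpenShift project.
# The host name "mysql" and the credential environment variables are assumptions;
# use the values your provisioned database actually exposes.
import os

import pymysql  # pip install pymysql

conn = pymysql.connect(
    host=os.getenv("MYSQL_SERVICE_HOST", "mysql"),   # service name inside the project
    user=os.getenv("MYSQL_USER", "devuser"),
    password=os.getenv("MYSQL_PASSWORD", "devpass"),
    database=os.getenv("MYSQL_DATABASE", "sampledb"),
)

try:
    with conn.cursor() as cursor:
        cursor.execute("SELECT VERSION()")
        print("Connected to MySQL", cursor.fetchone()[0])
finally:
    conn.close()
```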

Continue reading “MySQL for developers in Red Hat OpenShift”

An introduction to cloud-native CI/CD with Red Hat OpenShift Pipelines

Red Hat OpenShift 4.1 offers a developer preview of OpenShift Pipelines, which enables the creation of cloud-native, Kubernetes-style continuous integration and continuous delivery (CI/CD) pipelines based on the Tekton project. In a recent article on the Red Hat OpenShift blog, I provided an introduction to Tekton and pipeline concepts and described the benefits and features of OpenShift Pipelines.

Continue reading “An introduction to cloud-native CI/CD with Red Hat OpenShift Pipelines”

Using a custom builder image on Red Hat OpenShift with OpenShift Do

One of the things I enjoy most about using Red Hat OpenShift is the Developer Catalog. The Developer Catalog is a central location where a team working with Red Hat OpenShift can encapsulate and share how application components and services are deployed.

The Developer Catalog is often used to define an infrastructure pattern referred to as a builder image. A builder image is a container image that supports a particular language or framework, following best practices and Source-to-Image (S2I) specifications.

The OpenShift Developer Catalog provides several standard builder images for applications written in Node.js, Ruby, Python, and more. While these images make it easy to get started with the supported languages, the catalog is also flexible enough to let you add your own builder images to support an infrastructure pattern that is not preloaded in the catalog.

Continue reading “Using a custom builder image on Red Hat OpenShift with OpenShift Do”

Developer preview of Debezium Apache Kafka connectors for Change Data Capture (CDC)

With the release of Red Hat AMQ Streams 1.2, Red Hat Integration now includes a developer preview of Change Data Capture (CDC) capabilities that enable data integration for modern, cloud-native, microservices-based applications. The CDC features are based on the upstream Debezium project and are natively integrated with Apache Kafka and Strimzi to run on top of Red Hat OpenShift Container Platform, the enterprise Kubernetes platform, as part of the AMQ Streams release.
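To give a sense of what CDC looks like from the consuming side, here is a rough Python sketch that reads Debezium change events from Kafka using the kafka-python client. The topic name, bootstrap address, and envelope handling are placeholders and assumptions for illustration; consult the Debezium documentation for the exact topic naming and event format your connector produces.

```python
# Rough sketch: consume Debezium change events from Kafka with kafka-python.
# The topic name and bootstrap address below are placeholders for this example.
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "dbserver1.inventory.customers",                 # hypothetical topic: <server>.<schema>.<table>
    bootstrap_servers="my-cluster-kafka-bootstrap:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")) if v else None,
    auto_offset_reset="earliest",
)

for message in consumer:
    if message.value is None:                        # tombstone record after a delete
        continue
    # The envelope may be wrapped in a "payload" field depending on converter settings.
    event = message.value.get("payload", message.value)
    print(event.get("op"), "before:", event.get("before"), "after:", event.get("after"))
```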

Continue reading “Developer preview of Debezium Apache Kafka connectors for Change Data Capture (CDC)”

Accessing Apache Kafka in Strimzi: Part 5 – Ingress

In the fifth and final part of this series, we will look at exposing Apache Kafka in Strimzi using Kubernetes Ingress. This article explains how to use Ingress controllers on Kubernetes, how Ingress compares with Red Hat OpenShift routes, and how Ingress can be used with Strimzi and Kafka. Off-cluster access using Kubernetes Ingress is available only in Strimzi 0.12.0 and later. (Links to previous articles in the series can be found at the end.)
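As a hint of what a client connection through Ingress can look like, here is a minimal Python sketch using the kafka-python client. The host name, topic, and certificate path are placeholders; the sketch assumes the common Strimzi setup where the Ingress listener is TLS-only and external clients connect to the Ingress bootstrap host on port 443 with the cluster CA certificate.

```python
# Minimal sketch: consume from a Kafka cluster exposed through Kubernetes Ingress.
# Host names, the topic, and the CA file path are placeholders; the example assumes
# a TLS-only Ingress listener reached through the Ingress bootstrap host on port 443.
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "my-topic",                                        # hypothetical topic name
    bootstrap_servers="bootstrap.kafka.example.com:443",
    security_protocol="SSL",
    ssl_cafile="ca.crt",                               # cluster CA exported from the Strimzi secret
    auto_offset_reset="earliest",
)

for record in consumer:
    print(record.partition, record.offset, record.value)
```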

Continue reading “Accessing Apache Kafka in Strimzi: Part 5 – Ingress”

Accessing Apache Kafka in Strimzi: Part 4 – Load balancers

In this fourth article of our series on accessing Apache Kafka clusters in Strimzi, we will look at exposing Kafka brokers using load balancers. (See links to the previous articles at the end.) This article explains how load balancers work in public cloud environments and how they can be used with Apache Kafka.
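As a quick illustration, here is a minimal Python sketch that produces messages through a load balancer address using the kafka-python client. The bootstrap address and topic are placeholders, and the sketch assumes the external listener was configured without TLS; if TLS is enabled, add the SSL settings shown in the Ingress example above.

```python
# Minimal sketch: produce to a Kafka cluster exposed through a cloud load balancer.
# The bootstrap address and topic are placeholders, and the example assumes the
# external listener was configured without TLS.
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="abc123.elb.example.com:9094",   # hypothetical load balancer bootstrap address
)

producer.send("my-topic", b"hello from outside the cluster")
producer.flush()
producer.close()
```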

Continue reading “Accessing Apache Kafka in Strimzi: Part 4 – Load balancers”
