CI/CD

Building modern CI/CD workflows for serverless applications with Red Hat OpenShift Pipelines and Argo CD, Part 1

A recent article, The present and future of CI/CD with GitOps on Red Hat OpenShift, proposed Tekton as a framework for cloud-native CI/CD pipelines, and Argo CD as its perfect partner for GitOps. GitOps practices support continuous delivery in hybrid, multi-cluster Kubernetes environments.

Continue reading “Building modern CI/CD workflows for serverless applications with Red Hat OpenShift Pipelines and Argo CD, Part 1”

AI software stack inspection with Thoth and TensorFlow

Project Thoth develops open source tools that enhance the day-to-day life of developers and data scientists. Thoth uses machine-generated knowledge to boost the performance, security, and quality of your applications through artificial intelligence (AI), specifically reinforcement learning (RL). This machine-learning approach is implemented in Thoth's adviser component and is used by Thoth integrations to recommend a software stack based on user inputs.

Continue reading “AI software stack inspection with Thoth and TensorFlow”

Set up continuous integration for .NET Core with OpenShift Pipelines

Have you ever wanted to set up continuous integration (CI) for .NET Core in a cloud-native way, but you didn’t know where to start? This article provides an overview, examples, and suggestions for developers who want to get started setting up a functioning cloud-native CI system for .NET Core.

We will use the new Red Hat OpenShift Pipelines feature, which is based on the open source Tekton project, to implement .NET Core CI. OpenShift Pipelines provides a cloud-native way to define a pipeline that builds, tests, deploys, and rolls out your applications in a continuous integration workflow.

In this article, you will learn how to:

  1. Set up a simple .NET Core application.
  2. Install OpenShift Pipelines on Red Hat OpenShift.
  3. Create a simple pipeline manually.
  4. Create a Source-to-Image (S2I)-based pipeline.
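
Before diving in, here is a minimal sketch of what such a pipeline can look like, expressed with Tekton's Go API and printed as YAML (the resource is normally written directly as YAML; generating it from Go is just a convenient way to see its shape). The pipeline and Task names (dotnet-ci, s2i-dotnet, dotnet-test) are illustrative placeholders, not the article's exact resources:

    package main

    import (
        "fmt"

        pipelinev1beta1 "github.com/tektoncd/pipeline/pkg/apis/pipeline/v1beta1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "sigs.k8s.io/yaml"
    )

    func main() {
        // A minimal CI pipeline: an S2I build task followed by a test task.
        // The referenced Task names are illustrative placeholders.
        p := pipelinev1beta1.Pipeline{
            TypeMeta:   metav1.TypeMeta{APIVersion: "tekton.dev/v1beta1", Kind: "Pipeline"},
            ObjectMeta: metav1.ObjectMeta{Name: "dotnet-ci"},
            Spec: pipelinev1beta1.PipelineSpec{
                Tasks: []pipelinev1beta1.PipelineTask{
                    {
                        Name:    "build",
                        TaskRef: &pipelinev1beta1.TaskRef{Name: "s2i-dotnet"},
                    },
                    {
                        Name:     "test",
                        TaskRef:  &pipelinev1beta1.TaskRef{Name: "dotnet-test"},
                        RunAfter: []string{"build"}, // start only after "build" succeeds
                    },
                },
            },
        }

        // Emit YAML that could be applied with `oc apply -f -`.
        out, err := yaml.Marshal(p)
        if err != nil {
            panic(err)
        }
        fmt.Print(string(out))
    }

Piping the output to `oc apply -f -` on a cluster with OpenShift Pipelines installed would create the pipeline, assuming Tasks with the referenced names exist.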

Continue reading “Set up continuous integration for .NET Core with OpenShift Pipelines”

Kubeflow 1.0 monitoring and enhanced JupyterHub builds in Open Data Hub 0.8

The new Open Data Hub (ODH) 0.8 release includes many new features, continuous integration (CI) additions, and documentation updates. For this release, we focused on enhancing JupyterHub image builds, enabling more mixing of Open Data Hub and Kubeflow components, and designing our comprehensive end-to-end continuous integration and continuous delivery (CI/CD) process. In this article, we introduce the highlights of this newest release.

Note: Open Data Hub is an open source project and a community Operator for building an AI-as-a-Service (AIaaS) platform on Red Hat OpenShift.

Continue reading “Kubeflow 1.0 monitoring and enhanced JupyterHub builds in Open Data Hub 0.8”

From code to production with OpenShift Pipelines and Argo CD

Our team is responsible for a small Go application. The application's developers continuously push code changes to the main branch, so for the past two years, our team has used GitOps for continuous integration (CI). We started out using GitOps to deploy applications to our test clusters; then, we began using it to run day-two operations in our clusters.

Continue reading “From code to production with OpenShift Pipelines and Argo CD”

The present and future of CI/CD with GitOps on Red Hat OpenShift

The need to deliver applications faster is near-universal, even in organizations traditionally perceived as risk-averse. As the foundations of DevOps, continuous integration (CI) and continuous delivery (CD) are essential to application delivery in most organizations. Together, CI/CD tools and processes automate building and testing applications on every code or configuration change, then trigger a sequence of workflows that deliver the application to production.
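
To make that loop concrete, here is a toy sketch in Go, using only the standard library rather than any particular CI product's API: it accepts a repository push webhook and kicks off a hypothetical build-and-test script (the endpoint path and script name are assumptions for illustration):

    package main

    import (
        "log"
        "net/http"
        "os/exec"
    )

    func main() {
        // On every push event, run a (hypothetical) build-and-test script,
        // then hand off to a delivery workflow if it passes.
        http.HandleFunc("/webhook", func(w http.ResponseWriter, r *http.Request) {
            // A real system would verify the webhook signature and parse the payload.
            go func() {
                out, err := exec.Command("./build-and-test.sh").CombinedOutput()
                if err != nil {
                    log.Printf("build failed: %v\n%s", err, out)
                    return
                }
                log.Println("build passed; triggering delivery workflow")
            }()
            w.WriteHeader(http.StatusAccepted)
        })
        log.Fatal(http.ListenAndServe(":8080", nil))
    }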

Continue reading “The present and future of CI/CD with GitOps on Red Hat OpenShift”

Introduction to Tekton and Argo CD for multicluster development

Over the last two years, my coworkers and I have worked on developing a multicluster project for Kubernetes and Red Hat OpenShift. We needed a way to efficiently deploy applications, oversee access and authorization, and manage application placement across clusters. This need led us to develop with Argo CD and GitOps.

Recently, I switched to another team that also focuses on multicluster development. During my interviews, I promised to help create a catalog of our projects and develop a process to deploy them rapidly. Together, the catalog and process would let the team focus on the projects themselves rather than on figuring out how to get them operational. However, I quickly hit a wall: With Argo CD, I couldn't control when, and in what order, cluster objects were deployed onto new or existing clusters. Eventually, I discovered Tekton, a powerful addition to my development toolset.

In this article, I briefly describe how I developed the catalog and the deployment process. I'll introduce the components involved, explain a little about how Tekton Pipelines works, and leave you with a tool that you can share with your organization and teams.
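
To make the ordering point concrete: a Tekton pipeline can state "run this task only after that one finishes" through each task's runAfter field. Below is a minimal sketch, assuming a hypothetical apply-manifests Task that applies a directory of cluster objects to the target cluster:

    package catalog

    import (
        pipelinev1beta1 "github.com/tektoncd/pipeline/pkg/apis/pipeline/v1beta1"
    )

    // DeploySpec bootstraps a cluster in order: operators first, then the
    // applications that depend on them. "apply-manifests" stands in for a
    // hypothetical Task that applies a directory of manifests.
    var DeploySpec = pipelinev1beta1.PipelineSpec{
        Tasks: []pipelinev1beta1.PipelineTask{
            {
                Name:    "deploy-operators",
                TaskRef: &pipelinev1beta1.TaskRef{Name: "apply-manifests"},
            },
            {
                Name:     "deploy-apps",
                TaskRef:  &pipelinev1beta1.TaskRef{Name: "apply-manifests"},
                RunAfter: []string{"deploy-operators"}, // the ordering guarantee we were missing
            },
        },
    }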

Continue reading “Introduction to Tekton and Argo CD for multicluster development”

Kubernetes-native Apache Kafka with Strimzi, Debezium, and Apache Camel (Kafka Summit 2020)

Apache Kafka has become the leading platform for building real-time data pipelines. Today, Kafka is heavily used for developing event-driven applications, where it lets services communicate with each other through events. Using Kubernetes for this type of workload requires adding specialized components such as Kubernetes Operators and connectors to bridge the rest of your systems and applications to the Kafka ecosystem.
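
As a small taste of the producing side of such an event-driven service, here is a minimal Go sketch using the segmentio/kafka-go client. The broker address and topic name are placeholder assumptions; with Strimzi, you would point the writer at your Kafka cluster's bootstrap service:

    package main

    import (
        "context"
        "log"

        "github.com/segmentio/kafka-go"
    )

    func main() {
        // Placeholder broker address and topic; with Strimzi, the address
        // would be your cluster's bootstrap service.
        w := &kafka.Writer{
            Addr:     kafka.TCP("my-cluster-kafka-bootstrap:9092"),
            Topic:    "orders",
            Balancer: &kafka.LeastBytes{},
        }
        defer w.Close()

        // Publish one event; downstream services react by consuming the topic.
        err := w.WriteMessages(context.Background(),
            kafka.Message{
                Key:   []byte("order-1001"),
                Value: []byte(`{"status":"created"}`),
            },
        )
        if err != nil {
            log.Fatalf("failed to write message: %v", err)
        }
        log.Println("event published")
    }

Keying each message (here by order ID) keeps events for the same entity on the same partition, preserving their order for consumers.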

In this article, we’ll look at how the open source projects Strimzi, Debezium, and Apache Camel integrate with Kafka to speed up critical areas of Kubernetes-native development.

Note: Red Hat is sponsoring the Kafka Summit 2020 virtual conference, held August 24-25, 2020. See the end of this article for details.

Continue reading “Kubernetes-native Apache Kafka with Strimzi, Debezium, and Apache Camel (Kafka Summit 2020)”

OpenShift 4.5: Bringing developers joy with Kubernetes 1.18 and so much more

Since the first Red Hat OpenShift release in 2015, Red Hat has put out numerous releases based on Kubernetes. Five years later, Kubernetes is celebrating its sixth birthday, and last month, we announced the general availability of Red Hat OpenShift Container Platform 4.5. In this article, I offer a high-level view of the latest OpenShift release and its technology and feature updates based on Kubernetes 1.18.

Continue reading “OpenShift 4.5: Bringing developers joy with Kubernetes 1.18 and so much more”
