Red Hat OpenShift Container Platform

What’s new in the OpenShift 4.3 console developer experience

The developer experience is significantly improved in the Red Hat OpenShift 4.3 web console. If you have used the Developer perspective, introduced in the OpenShift 4.2 console, you are probably familiar with our streamlined user flows for deploying applications, the new Topology view, and the enhanced experience around OpenShift Pipelines (powered by Tekton) and OpenShift Serverless (powered by Knative). This release continues to improve on the features introduced in 4.2 and adds new flows and features for developers.

Continue reading “What’s new in the OpenShift 4.3 console developer experience”

How to install Red Hat OpenShift 3.11 on OpenStack 13

Red Hat OpenShift Container Platform is a platform-as-a-service (PaaS) that orchestrates and manages containerized applications through Kubernetes. It supports cloud-native applications as well as custom-built applications, and it can run in a hybrid cloud configuration, providing the flexibility to expand and grow.

Red Hat OpenStack Platform is an infrastructure-as-a-service (IaaS): a cloud-based platform that provides virtual servers and other resources. Users manage it through a web-based dashboard, command-line tools, or RESTful web services.

If you are considering Red Hat OpenShift Container Platform on OpenStack Platform, there are several advantages, including the ability to easily increase the number of compute nodes and to use dynamic storage.

In this article, I will outline the main points required to successfully install Red Hat OpenShift Container Platform on OpenStack Platform. Because my OpenStack knowledge is limited, I reached out to colleagues for help, and I will not go into many OpenStack technical details here.

Continue reading “How to install Red Hat OpenShift 3.11 on OpenStack 13”

Full API lifecycle management: A primer

APIs are the cornerstone of so many recent breakthroughs, from mobile applications to the Internet of Things to cloud computing. All of those technologies expose, consume, and are built on APIs, and those APIs are a key driver for generating new revenue: Salesforce generates 50% of its revenue through APIs, Expedia 90%, and eBay 60%. With APIs becoming so central, full API lifecycle management becomes essential. The success of your digital transformation project depends on it!

This article describes a set of full API lifecycle management activities that can guide you from idea to realization, from the inception of an API program to management at scale across your whole company.

Continue reading “Full API lifecycle management: A primer”

Integration of storage services (part 6)

In Part 5 of this series, we looked into details that determine how your integration becomes the key to transforming your customer experience.

That part started by laying out how I approached the use case: researching successful customer portfolio solutions as the basis for a generic architectural blueprint. Now it’s time to cover various blueprint details.

This article covers the final elements in the blueprint, storage services, which are fundamental to the generic architectural overview.

Continue reading “Integration of storage services (part 6)”

Using a local NuGet server with Red Hat OpenShift

NuGet is the .NET package manager. By default, the .NET Core SDK will use packages from the nuget.org website.

In this article, you’ll learn how to deploy a NuGet server on Red Hat OpenShift Container Platform (RHOCP). We’ll use it as a caching server and see that it speeds up our builds. Before we get to that, we’ll explore some general NuGet concepts and see why it makes sense to use a local NuGet server.

Continue reading “Using a local NuGet server with Red Hat OpenShift”

Integration of container platform essentials (Part 5)

In Part 4 of this series, we looked into details that determine how your integration becomes the key to transforming your omnichannel customer experience.

That part started by laying out how I approached the use case: researching successful customer portfolio solutions as the basis for a generic architectural blueprint. Now it’s time to cover more blueprint details.

This article discusses the core elements in the blueprint (container platform and microservices) that are crucial to the generic architectural overview.

Continue reading “Integration of container platform essentials (Part 5)”

Building .NET Core container images using S2I

Red Hat OpenShift implements .NET Core support via a source-to-image (S2I) builder. In this article, we’ll take a closer look at how you can use that builder directly. Using S2I, you can build .NET Core application images without having to write custom build scripts or Dockerfiles. This can be useful on your development machine or as part of a CI/CD pipeline.

Continue reading “Building .NET Core container images using S2I”

How to run Kafka on OpenShift, the enterprise Kubernetes, with AMQ Streams

On October 25th, Red Hat announced the general availability of its AMQ Streams Kubernetes Operator for Apache Kafka. Red Hat AMQ Streams focuses on running Apache Kafka on OpenShift, providing a massively scalable, distributed, and high-performance data streaming platform. AMQ Streams, based on the Apache Kafka and Strimzi projects, offers a distributed backbone that allows microservices and other applications to share data with extremely high throughput. This backbone enables the following capabilities (a minimal code sketch follows the list):

  • Publish and subscribe: Many-to-many dissemination in a fault-tolerant, durable manner.
  • Replayable events: Serves as a repository from which microservices can rebuild in-memory copies of source data, up to any point in time.
  • Long-term data retention: Efficiently stores data for immediate access, limited only by disk space.
  • Partitioned messages for more horizontal scalability: Allows messages to be organized for maximum concurrent access.

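To make the publish/subscribe and replay ideas above concrete, here is a minimal sketch using the plain Apache Kafka Java client. It is not taken from the article; the bootstrap address, topic name, and consumer group are illustrative placeholders.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class PubSubSketch {
        public static void main(String[] args) {
            // Placeholder broker address and topic name, for illustration only.
            String bootstrap = "my-cluster-kafka-bootstrap:9092";
            String topic = "orders";

            // Publish: any number of producers can write to the same topic.
            Properties p = new Properties();
            p.put("bootstrap.servers", bootstrap);
            p.put("key.serializer", StringSerializer.class.getName());
            p.put("value.serializer", StringSerializer.class.getName());
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(p)) {
                producer.send(new ProducerRecord<>(topic, "order-1", "created"));
            }

            // Subscribe and replay: a new consumer group with
            // auto.offset.reset=earliest rereads the retained log from the
            // beginning, e.g. to rebuild an in-memory copy of the source data.
            Properties c = new Properties();
            c.put("bootstrap.servers", bootstrap);
            c.put("group.id", "replay-example");
            c.put("auto.offset.reset", "earliest");
            c.put("key.deserializer", StringDeserializer.class.getName());
            c.put("value.deserializer", StringDeserializer.class.getName());
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(c)) {
                consumer.subscribe(Collections.singletonList(topic));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> r : records) {
                    System.out.printf("%s -> %s%n", r.key(), r.value());
                }
            }
        }
    }

Because each consumer group tracks its own offsets, additional services can subscribe to the same topic, or replay it from the beginning, without affecting existing consumers.
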
One of the most frequent requests from developers and architects is a simple deployment option for getting started with testing. In this guide, we will use the Red Hat Container Development Kit, based on Minishift, to start an Apache Kafka cluster on Kubernetes.

Continue reading “How to run Kafka on OpenShift, the enterprise Kubernetes, with AMQ Streams”
