Shane Boulden

Shane is a Red Hat Solutions Architect, specialising in application development, automation and DevOps practices. He enjoys good coffee and a well-developed RESTful API.

Recent Posts

Use the Kubernetes Python client from your running Red Hat OpenShift pods

Red Hat OpenShift is part of the Cloud Native Computing Foundation (CNCF) Certified Kubernetes program, ensuring portability and interoperability for your container workloads. This also allows you to use Kubernetes tools, such as kubectl, to interact with an OpenShift cluster, and you can rest assured that all the APIs you know and love are right there at your fingertips.

The Kubernetes Python client is another great tool for interacting with an OpenShift cluster, allowing you to perform actions on Kubernetes resources with Python code. It also has uses inside the cluster: we can configure a Python application running on OpenShift to consume the OpenShift API and list and create resources, then use it to create containerized batch jobs or a custom service monitor, for example. It sounds a bit like “OpenShift inception,” using the OpenShift API from services created using the OpenShift API.

In this article, we’ll create a Flask application running on OpenShift. This application will use the Kubernetes Python client to interact with the OpenShift API, list other pods in the project, and display them back to the user.
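To give a flavor of what the post covers, here's a minimal sketch (not the article's exact code) of a Flask route that uses the Kubernetes Python client's in-cluster configuration to list pods; the route and namespace are placeholders:

```python
from flask import Flask, jsonify
from kubernetes import client, config

app = Flask(__name__)

@app.route("/pods")
def list_pods():
    # Use the service account token and CA certificate mounted into the pod.
    config.load_incluster_config()
    v1 = client.CoreV1Api()
    # "myproject" is a placeholder; in practice the namespace can be read
    # from the service account mount or an environment variable.
    pods = v1.list_namespaced_pod(namespace="myproject")
    return jsonify([p.metadata.name for p in pods.items])

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

Note that the pod's service account also needs a role that allows it to list pods in the project.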

Continue reading “Use the Kubernetes Python client from your running Red Hat OpenShift pods”

Creating a containerized Python/Flask development environment with Red Hat CodeReady Workspaces

Red Hat CodeReady Workspaces provide developers with containerized development environments hosted on OpenShift/Kubernetes. DevOps teams can now use a hosted development environment that’s pre-built for their chosen stack and customized for their project.

CodeReady Workspaces can help you rapidly onboard developers for your project as everything they need to develop is running in a containerized workspace. In this post, we’re going to use CodeReady Workspaces to get up and running quickly with an existing open source project, Peak. Peak is a multi-container Kubernetes application for performance testing web services, and it allows you to create distributed performance tests using the Kubernetes Batch API for test orchestration. We’ll make some modifications to Peak’s Flask front end, a stateless web interface that interacts with a Falcon RESTful API to return data about performance tests. You won’t need the complete Peak application deployed, though if you like, you can find steps to deploy it to OpenShift here.
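As a rough sketch of that pattern (the endpoint, environment variable, and field names here are assumptions rather than Peak's actual code), the Flask front end simply calls the Falcon API and renders the results:

```python
import os

import requests
from flask import Flask, render_template_string

app = Flask(__name__)

# Assumed environment variable; Peak's real configuration may differ.
API_URL = os.environ.get("PEAK_API_URL", "http://peak-api:8080")

@app.route("/")
def index():
    # Fetch performance test results from the Falcon API and render them.
    tests = requests.get(f"{API_URL}/tests").json()
    return render_template_string(
        "<ul>{% for t in tests %}<li>{{ t }}</li>{% endfor %}</ul>", tests=tests
    )
```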

To follow along, you’ll need a Red Hat OpenShift Container Platform 3.11 environment. You can use the Red Hat Container Development Kit on your Windows, macOS, or Linux laptop, or a hosted Red Hat OpenShift instance online.

Continue reading “Creating a containerized Python/Flask development environment with Red Hat CodeReady Workspaces”

Create a scalable REST API with Falcon and RHSCL

APIs are critical to automation, integration, and developing cloud-native applications, and it’s vital that they can scale to meet the demands of your user base. In this article, we’ll create a database-backed REST API based on the Python Falcon framework using Red Hat Software Collections (RHSCL), test how it performs, and scale it out in response to a growing user base.
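As a taste of the framework (an illustrative sketch, not the article's code), a Falcon resource maps HTTP verbs onto on_* responder methods:

```python
import falcon

class TestsResource:
    """Illustrative resource; the article's API is backed by a database."""

    def on_get(self, req, resp):
        # Return a static payload to keep the sketch self-contained.
        resp.media = {"tests": [{"id": 1, "status": "complete"}]}

# falcon.App() on Falcon 3.x; older releases use falcon.API().
app = falcon.App()
app.add_route("/tests", TestsResource())
```

Run it under any WSGI server, for example `gunicorn app:app`, and scale out by adding workers or replicas.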

Continue reading “Create a scalable REST API with Falcon and RHSCL”

Using API keys securely in your OpenShift microservices and applications

In the microservices landscape, APIs provide an essential form of communication between components. To allow secure communication between microservices components, as well as with third-party applications, it’s important to be able to consume API keys and other sensitive data in a manner that doesn’t place the data at risk. Kubernetes Secret objects are designed specifically to hold sensitive information, and OpenShift makes it easy to expose this information to the applications that need it.

In this post, I’ll demonstrate securely consuming API keys in OpenShift Enterprise 3. We’ll deploy a Sinatra application that uses environment variables to interact with the Twitter API; create a Kubernetes Secret object to store the API keys; expose the secret to the application via environment variables; and then perform a rolling update of the environment variables across our pods.
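The post uses Sinatra, but the pattern is language-agnostic. As a hedged Python illustration (the variable and key names here are hypothetical), the application only ever reads the keys from its environment:

```python
import os

# These environment variables are assumed to be populated from a
# Kubernetes Secret via the deployment configuration, e.g.:
#   env:
#     - name: TWITTER_CONSUMER_KEY
#       valueFrom:
#         secretKeyRef:
#           name: twitter-api
#           key: consumer-key
consumer_key = os.environ["TWITTER_CONSUMER_KEY"]
consumer_secret = os.environ["TWITTER_CONSUMER_SECRET"]

# The application can use the keys without them ever appearing in the
# container image or the source repository.
print("Consumer key loaded:", bool(consumer_key))
```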

Continue reading “Using API keys securely in your OpenShift microservices and applications”
