Background
Last year I was lucky enough to be given the opportunity to speak at Red Hat Summit about Software Collections. As I was doing research for my presentation, it became abundantly clear that my life as a system admin would have been light years better if this tool set had been available earlier in my career.
Besides the benefits already explained in a couple of other blog posts on this site, namely the ability to install and run multiple versions of widely available applications and programming interpreters/compilers, the software collections utilities have many other use cases. I feel this tool set fixes another major issue faced by RHEL system admins: managing and upgrading your internally developed applications.
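As a quick illustration of that first benefit, here is a minimal sketch of running two interpreter versions side by side on a system where the python27 and rh-python36 collections happen to be installed (the collection names are only examples; use whatever is available on your system):

    # List the collections installed on this system
    # (newer versions of scl-utils use 'scl list-collections' instead)
    scl --list

    # Run a command inside a collection's environment; the system
    # python in /usr/bin is left completely untouched
    scl enable python27 'python --version'
    scl enable rh-python36 'python --version'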
Before coming to Red Hat I worked in a couple of different roles as a Linux system admin. In those roles, one of my most dreaded days on the job was when a change request ticket came across my desk with the words "Application Upgrade" somewhere in the title. That undoubtedly meant a couple of planning meetings, a lot of staging work, and at least one late night trip back into the office.
The one common thread between all my roles, and all these application upgrades, was that each was unique and each involved a lot of manual work. As much as I tried to talk my coworkers into developing RPM-based deployments of their applications with me, there were always objections, and most of the time the objections made sense.
The first of the two most common concerns was that the upgrade would be a single atomic action: once the RPM was created, there wasn't an easy way to test parts of the upgrade process without doing a full install or upgrade of the package. The other concern was fail back: once we installed or upgraded the RPM package, there was always a worry that rolling back wouldn't be as clean as expected. As much as I pushed toward RPM-based packages, there was always someone touting the ease and stability of the manual process.
Enter Software Collections
During my research on what makes up a software collection, I stumbled upon the building blocks of the technology and got really excited. Much to my surprise, the software collections tool set addressed the concerns that had plagued me before, and the more I researched, the more enthusiastic I became about this great technology.
First, software collections build their own mock RHEL file system under /opt/<provider>/<collection_name>. If you dive into a software collection already installed on one of your existing systems, you will see what I am talking about. Each install of an application version can be totally separated from the next, which removes the concern that an RPM upgrade process will step all over an existing, working version of the application. File system structures can be installed side by side without affecting each other.
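A quick look at any installed collection shows this layout. The rh-python36 collection is used below purely as an example; the provider directory (rh here) and the collection name will vary on your system:

    # Each collection gets its own root under /opt/<provider>/<collection>
    ls /opt/rh/
    # rh-python36  rh-ruby25  ...

    # Inside that root is a private copy of the usual file system hierarchy
    ls /opt/rh/rh-python36/root/
    # etc  usr  var  ...

    # The collection's packages install entirely under that root
    rpm -ql rh-python36-runtime | head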
Second is the ability to install and stage a new version of the application. A software collection can be installed side by side with the currently running version of the application, which allows the developer and system admin to do a lot of the upgrade work during normal business hours. The work done during the change window, usually at odd hours of the night, can then be as easy as shutting down the current version of the software collection, invoking the new software collection that is already installed, and starting up the app. If a fail back is needed, the steps are repeated in reverse. There is no need to uninstall or reinstall a particular version of the application and its dependencies, because they are already there in the previous version of the software collection.
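As a sketch of what that change window could look like, assume the application has been packaged as two internally built collections, myapp1 (current) and myapp2 (new); the collection names and the myapp-ctl control script are hypothetical stand-ins for whatever your collection ships:

    # Days before the change window: stage the new version next to the old one.
    # It lands under its own /opt/<provider>/myapp2 root; myapp1 is untouched.
    sudo yum install myapp2

    # During the change window: stop the old version, start the new one.
    scl enable myapp1 'myapp-ctl stop'
    scl enable myapp2 'myapp-ctl start'

    # Fail back, if needed, is simply the reverse -- nothing to reinstall.
    scl enable myapp2 'myapp-ctl stop'
    scl enable myapp1 'myapp-ctl start'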
Docker is cool and Software Collections aren't old school
When I talk with Red Hat users these days, the conversation inevitably steers toward Docker/Linux containers. It is the latest and greatest technology and quickly seems to be taking the lead spot from "cloud" in technical conversations. A lot of people I talk to believe that software collections live in that old-school world of IT and that Linux containers already solve all the problems software collections try to address. That couldn't be farther from the truth.
A lot of people I talk with are in the process of evaluating Linux containers for use in their environment. Everyone seems to be in varying stages of adopting containers as the next major iteration of maintaining their software life cycle, but I am finding that not a whole lot of teams are ready to use Linux containers in production. This is a great place for software collections to assist. The software collection utilities are the perfect solution to address the needs of today and then help an IT team migrate into the brave new world of Linux container deployments.
The software collections framework described above fits into a lot of IT departments' current processes and procedures and doesn't require a huge amount of change to technology or culture inside an organization. A software collection deployment for your application will start to break down the walls between the developers and the operations team. In this brave new world of software collection deployments, the developer and operations team members need to work together to develop a successful application software collection: the developer brings a list of all the software and libraries that are needed, and the operations team brings an understanding of how to deploy that application within the constraints of change management, internal and external regulations, and ongoing monitoring/alerting. By putting both groups' knowledge into an application's software collection, there is a stronger guarantee that the deployment will be successful and the ongoing support burden on both teams will be minimized. It is also a great way to start your organization down the path of DevOps that is all the buzz. A simple example of how that shared knowledge can be captured is sketched below.
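For instance, a small start-up wrapper that the existing init script or systemd unit calls can encode the developers' choice of collection while staying under the operations team's change control. Everything below is a minimal sketch: the acme provider, the myapp2 collection, and the application command are hypothetical, and scl_source ships with recent versions of the scl-utils package.

    #!/bin/bash
    # start-myapp.sh -- hypothetical wrapper invoked by the existing init script
    # or systemd unit. Developers pick the collection (and thus the dependency
    # set); operations controls how and when this script runs.

    # Load the collection's environment (PATH, LD_LIBRARY_PATH, and so on)
    # into the current shell. scl_source is part of recent scl-utils releases.
    source scl_source enable myapp2

    # Start the application using the binaries and libraries inside the
    # collection's private root.
    exec /opt/acme/myapp2/root/usr/bin/myapp --config /etc/myapp/app.conf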
Summary
The hope is that after reading this article you can start to see how to use the software collections utilities to manage the life cycle of your internally developed applications in a new way. This approach not only helps you address the problems that plague organizations today but also builds a strong foundation for the inevitable culture and technology changes of tomorrow. These topics have been banging around in my head for a long time, and I am happy to finally have some of them posted in this article. My plan is to follow up with articles that provide some concrete examples of the process described above. In the meantime, please share your thoughts and feedback.