In a previous post, where-have-all-my-subscriptions-gone, I mentioned that you can access the Red Hat Content Delivery Network (CDN) using its API, which allows you to query the CDN for subscriptions and their usage, registered hosts, and more, as well as unregister hosts.

I wanted to do some analysis for my own subscription usage, so I wrote some scripts that let me more easily tell where my subscriptions are being used.

Since Python scripting is still fairly new to me, and I wanted to learn something new, I decided I would write the primary script using Python 3.

For my use, I needed the scripts to:

  • Tell me which systems are using my subscriptions and pool IDs.
  • Report when each system last checked in. A system that no longer checks in may no longer exist.
  • List any systems with duplicate names. Duplicates indicate that a system was re-installed without first being unregistered.
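The core of that analysis can be sketched roughly like this, assuming the systems have already been pulled from the CDN API into a list of dictionaries (the field names here are illustrative, not the API's real schema):

```python
from collections import defaultdict

# Illustrative records; in the real script this data comes from the CDN API.
systems = [
    {"name": "system-1", "uuid": "12345678-abcd-1a2b3c4d", "checkin_days": 50},
    {"name": "system-1", "uuid": "1b2b32b3-1234-867ab210", "checkin_days": 60},
    {"name": "system-2", "uuid": "12345678-abcd-1a2eeeee", "checkin_days": 0},
]

# Group systems by name; any name registered under more than one UUID is a
# duplicate, usually a box re-installed without being unregistered first.
by_name = defaultdict(list)
for s in systems:
    by_name[s["name"]].append(s)

duplicates = {name: group for name, group in by_name.items() if len(group) > 1}
for name, group in duplicates.items():
    print(name, [s["uuid"] for s in group])
```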

After a little work, I had a script that could give me what I wanted. The script can generate three basic reports for me.

  • A pool/subscription usage report
  • A duplicate systems report
  • A report of when systems last checked in, which can be limited to systems that have not checked in for longer than a specified number of days

Here are some sample reports generated by my Python script. The script accepts the --help option to give details on its options.

Pool Usage Report

            Name                 | Pool ID      | Quantity | Consumed | Exported
Subscription 1                   | 123456abcdef | 300      | 101      | 30
     Attached Systems: | System ID              | Name
                       | 12345678-abcd-1a2b3c4d | system-1 Last Checkin (days): 50
                       | 12345678-abcd-1a2eeeee | system-2 Last Checkin (days): 0

Subscription 2                   | aabbcc1122dd | 10       | 8        | 0
     Attached Systems: | System ID              | Name
                       | bcdef678-af5d-1a2cfd4d | system-3 Last Checkin (days): 9
                       | 12346fde-aeed-1a2abdce | system-4 Last Checkin (days): 5

Duplicate Systems Report

Hosts with duplicates: 197
Duplicate systems: 276
Freeable systems: 79
Count | Name         Last Checkin (EPOCH) |    ID 
# 11  system-1         1466648032           12345678-abcd-1a2b3c4d
                       1466639184           1b2b32b3-1234-867ab210
                       1466132041           3b2ds525-abdd-a1b1c1d1
                       1465339439           3232bb32-43bc-abcdabcd
                       1464219749           423443dd-7652-12341234
# 10  system-2         1466649410           12345678-abcd-1a2eeeee
                       1466638967           3421dd11-abcd-bdcdeeed
                       1465339174           787dbb8a-42dc-abcdef11
# 10  system-3         1466649256           bcdef678-af5d-1a2cfd4d
                       1466638709           678acb26-6421-bcccad12
                       1464196357           5673ffff-ab12-123bcddd
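The check-in times in this report are raw epoch seconds. Turning an epoch timestamp into an age in days, as the other reports display it, is a small calculation:

```python
import datetime as dt

def days_since_checkin(epoch_seconds, now=None):
    """Age, in whole days, of an epoch timestamp (interpreted as UTC)."""
    now = now or dt.datetime.now(dt.timezone.utc)
    checkin = dt.datetime.fromtimestamp(epoch_seconds, dt.timezone.utc)
    return (now - checkin).days

# Example with a fixed "now" so the result is stable:
fixed_now = dt.datetime.fromtimestamp(1466648032 + 3 * 86400, dt.timezone.utc)
print(days_since_checkin(1466648032, now=fixed_now))  # 3
```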


Last Check-In Report

Host     | ID                     | Last Check-In (Days) | Entitlements Consumed
system-1 | 12345678-abcd-1a2b3c4d | 162                  | 1
system-2 | 12345678-abcd-1a2eeeee | 156                  | 1
system-3 | bcdef678-af5d-1a2cfd4d | 156                  | 0
system-4 | 23422323-1234-11223344 | 155                  | 0
system-5 | ababaaba-4321-aabbccdd | 153                  | 1
system-2 | 3421dd11-abcd-bdcdeeed | 153                  | 0

Now that I have my reports, I can see that I have hundreds of systems to remove. That is a lot of pointing and clicking in the CDN portal.

API to the rescue again! It allows me to remove systems and free up the subscriptions as well. This is awesome, life is good.

I wrote a bash script to unregister systems. Sorry, no Python here; I am being a bit lazy, and I know I can get a bash script done more quickly than another Python script. The script takes a list of system UUIDs from standard input and uses the following curl command to unregister each host from the CDN:

curl --silent -X DELETE -u CDN_USER:CDN_PASSWORD -k ""

I went back and added an option to the Python script to suppress the header information. I also modified the duplicate systems report to print a hash mark in front of the most recently checked-in system in each set of duplicates. This lets me pass the -v option to grep so it drops the most recently checked-in system from each set. With a little help from awk, piping the duplicates report through grep and then awk gives me a list of system UUIDs that can be fed to the removal script.
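The author does this with grep and awk; a rough Python equivalent of that filtering step, assuming the duplicate-report layout shown above (hash mark on the most recent system in each set), looks like:

```python
# Sample duplicate-report lines; '#' marks the most recently checked-in system.
report = """\
# 11  system-1         1466648032           12345678-abcd-1a2b3c4d
                       1466639184           1b2b32b3-1234-867ab210
                       1466132041           3b2ds525-abdd-a1b1c1d1
"""

# Keep only lines without a hash mark (like `grep -v '#'`), then take the
# last whitespace-separated field, the UUID (like `awk '{print $NF}'`).
uuids = [line.split()[-1] for line in report.splitlines() if "#" not in line]
print(uuids)
```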

The Last Check-In report can either display the last check-in for all registered systems (the --checkin option alone) or list only the systems that have not checked in within a certain number of days (the --checkin and --days options together).
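A sketch of that filter (the field names and helper are illustrative, not the script's actual code):

```python
# Illustrative records; the real data comes from the CDN API.
systems = [
    {"name": "system-1", "uuid": "12345678-abcd-1a2b3c4d", "checkin_days": 162},
    {"name": "system-4", "uuid": "23422323-1234-11223344", "checkin_days": 5},
]

def stale_systems(systems, days=None):
    """With days=None, report everything (--checkin alone); otherwise keep
    only systems that have not checked in within `days` days (--checkin --days)."""
    if days is None:
        return systems
    return [s for s in systems if s["checkin_days"] > days]

print([s["name"] for s in stale_systems(systems, days=30)])  # ['system-1']
```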

Now I can easily remove all my duplicate systems, except the most recently registered one, and I can remove all systems that have not checked in recently.

I imagine others might find these scripts useful as well, so I am making them available via GitHub.

The README file should be fairly self-explanatory on how to use the scripts, so I am not repeating that information here; it would make this blog post very long.

Please be careful using these scripts; they have only been tested when I needed to clean up my own systems in the CDN.

Enjoy and I hope others find them useful as well. Better yet, make something even better from them.

Last updated: September 5, 2023