
Inspecting containerized Python applications in a cluster

February 24, 2022
Fridolin Pokorny
Related topics:
Automation and managementKubernetesPython
Related products:
Red Hat OpenShift

    Container technologies that are easy to maintain, extend, ship, and run are the new de facto standard for large-scale application deployments. Thanks to cluster orchestrators such as Kubernetes and Red Hat OpenShift, these runnable units are deployed to clusters to provide the desired functionality on a large scale.

    To ensure the application is shipped in a healthy state, it is often up to developers to confirm that each runnable unit behaves as expected in the environment where it will be deployed. This article introduces Amun, a tool created and used by Project Thoth to inspect containerized Python applications. Running such inspections before deployment can reveal problems up and down the stack—including incompatibilities with dependencies, the operating system, or other parts of the environment.

    Containerized application inspection with Amun

    If you've been reading articles in this series, you might have already seen Amun mentioned in the article Resolve Python dependencies with Thoth Dependency Monkey. We introduced it there as part of Thoth's Dependency Monkey, a service for validating software packages and software stacks while respecting the resolution of different Python libraries. Amun can also be used as a standalone tool to test an application in a runtime environment that follows a deployment specification.

    Amun combines Argo Workflows with a service based on OpenShift and exposed as an API to developers who want to test their applications. The open source object storage system Ceph is used to store computed results. Amun's API accepts a specification that lists information about what to test and how.

    For example, a request might ask Amun to test a Thoth-based source-to-image (S2I) Python application, or one of the predictable stacks provided by the Thoth team. The specification also lists the Python libraries the application requires (by providing a lock file). All the dependencies are installed in the base container image environment together with a script that tests the application. Optionally, users can specify additional input, such as RPM packages that should be installed. Users can also supply requirements for the cluster orchestrator to respect when deploying the application in the cluster; for instance, the features a node should provide to the application, such as a particular CPU or GPU type.
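    The parts of a specification described above can be sketched as a Python dict. The field names below are illustrative, not the actual Amun API schema; consult the amun-api repository for the real format.

```python
import json

# A hypothetical inspection specification: field names are illustrative
# placeholders, not the real amun-api schema.
specification = {
    # Base container image to build on (illustrative image reference)
    "base": "quay.io/thoth-station/s2i-thoth-ubi8-py38",
    # Extra RPM packages to install into the image
    "packages": ["which"],
    # The application's pinned dependencies, e.g. a lock file's contents
    "python": {"requirements_locked": "<Pipfile.lock contents>"},
    # Script that exercises the application during the inspection run
    "script": "#!/usr/bin/env python3\nprint('inspection run')",
    # Node features requested from the cluster orchestrator
    "hardware": {"cpu_family": 6, "cpu_model": 94},
    # How many separate inspection runs to perform
    "batch_size": 1,
}

# The specification is submitted to the Amun API as JSON.
payload = json.dumps(specification, indent=2)
print(payload[:60])
```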

    The specification is in JSON format and is accepted on the Amun API as shown in Figure 1. After validating the specification, Amun instruments OpenShift and Argo Workflows to trigger a so-called inspection of the application. At its core, the inspection consists of two steps: building and testing the containerized application. Both steps are done in the cluster.

    Figure 1. An input specification tells the Amun API to trigger an inspection build followed by an inspection run.
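    Submitting a specification amounts to a JSON POST against the Amun API. The endpoint URL and path below are placeholders, assumed for illustration; only the request is constructed here, nothing is sent.

```python
import json
import urllib.request

# Placeholder endpoint; a real deployment would expose its own Amun API URL.
AMUN_API = "https://amun.example.com/api/v1"

def build_inspection_request(specification: dict) -> urllib.request.Request:
    """Prepare (but do not send) a POST submitting an inspection spec."""
    body = json.dumps(specification).encode("utf-8")
    return urllib.request.Request(
        f"{AMUN_API}/inspect",          # hypothetical path
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_inspection_request({"batch_size": 1})
print(req.method, req.full_url)
```

    Sending the request (e.g. with `urllib.request.urlopen(req)`) would trigger the inspection build and run described above.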

    The build step happens through OpenShift. Once the build is done, the application is run in the cluster conforming to the requirements supplied in the specification. Figure 2 shows the flow of events.

    Figure 2. Amun runs a build and then runs the application, and aggregates the information about each run.

    The output of the inspection consists of JSON reports containing information about the application's build and run. See Thoth's amun-api repository for an example of inspection output in a JSON report.

    The report captures the specification supplied, generated files (such as a Dockerfile), and logs from containers that were run during the build or application run. Reports from the application run also capture information about hardware as reported by the operating system; the actual run results computed by the supplied script; and additional aggregated metadata such as process information from the Linux kernel process control block.
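    A consumer of these JSON reports might pull out a few fields like this. The report layout below is a simplified mock, assumed for illustration; the real structure is documented by the examples in the amun-api repository.

```python
# A simplified mock of an inspection report; real reports are richer.
sample_report = {
    "specification": {"batch_size": 1},
    "build_log": "STEP 1: FROM ...",
    "hwinfo": {"cpu_info": {"model_name": "Intel(R) Xeon(R)"}},
    "stdout": {"duration": 1.23},
}

def summarize(report: dict) -> str:
    """Return a one-line summary of where a run executed and what it produced."""
    cpu = report.get("hwinfo", {}).get("cpu_info", {}).get("model_name", "unknown")
    return f"ran on {cpu}, result: {report.get('stdout')}"

print(summarize(sample_report))
```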

    The specification can request a given number of separate inspection runs. You can follow their progress in the Argo Workflows user interface (UI), as shown in Figure 3.

    Figure 3. Using the Argo Workflows UI, you can watch the progress of the build and runs.

    Multiple runs can be especially useful when running microbenchmarks in a cluster, where they help average out interference from the platform or environment. The amun-api example repository includes three reports.
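    Averaging across runs is plain descriptive statistics. A minimal sketch, with made-up run durations standing in for the timing results a batch of inspections would report:

```python
import statistics

# Durations (seconds) reported by several inspection runs of the same
# specification -- illustrative values, not real measurements.
run_durations = [1.21, 1.19, 1.25, 1.22, 1.20]

mean = statistics.mean(run_durations)
stdev = statistics.stdev(run_durations)
print(f"mean={mean:.3f}s stdev={stdev:.3f}s")
```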

    Amun and Dependency Monkey

    Amun has many possible uses, but it was developed first as a platform for validating resolutions of dependencies made by Thoth's Dependency Monkey. In this case, Thoth's resolver uses pre-aggregated dependency data stored in Thoth's knowledge database to resolve Python application dependencies (see Figure 4). Once a valid resolution is found, Amun is instrumented to verify that the resolution leads to a healthy application. See the previously mentioned article Resolve Python dependencies with Thoth Dependency Monkey for more information. You can also watch our video tutorial on inspecting Python dependencies with Dependency Monkey.

    Figure 4. Thoth's resolver in Dependency Monkey uses a database with dependency information to run the application in Amun and validate dependency resolution.

    Conclusion

    Amun was successfully used to produce some of the Thoth datasets, which are also available on Kaggle. If you wish to use Amun to inspect the behavior of your application, or to run Dependency Monkey to check the quality of your application with respect to its dependencies, feel free to reach out to the Thoth team via the thoth-station/support repository or the @ThothStation Twitter handle.

    As part of Project Thoth, we are accumulating knowledge to help Python developers create healthy applications. If you would like to follow updates, please subscribe to our YouTube channel or follow us on the @ThothStation Twitter handle.

    Last updated: September 20, 2023

    Related Posts

    • Resolve Python dependencies with Thoth Dependency Monkey

    • Build and extend containerized applications with Project Thoth

    • Customize Python dependency resolution with machine learning

    • Extracting dependencies from Python packages

    • Microbenchmarks for AI applications using Red Hat OpenShift on PSI in project Thoth

