Apply generative AI to app modernization with Konveyor AI

May 7, 2024
Syed M Shaaf, John Matthews
Related topics:
Artificial intelligence, IDEs, Java, Application modernization, Quarkus
Related products:
Red Hat build of Quarkus


    The Konveyor community has developed Konveyor AI (Kai), a tool that uses generative AI to accelerate application modernization. Kai integrates large language models (LLMs) with static code analysis to facilitate code modifications within a developer's IDE, helping developers transition applications to modern programming languages and frameworks efficiently.

    Using a retrieval-augmented generation (RAG) approach, Kai enhances LLM outputs with historical code changes and analysis data, ensuring context-specific guidance. This method is model agnostic and does not require model fine-tuning, making Kai a versatile tool for large-scale modernization projects; we demonstrate it here by updating a Java EE application to Quarkus using a Visual Studio Code (VS Code) plug-in. This approach enables organizations to reduce the time and cost of modernization at scale. A developer can see a list of issues in their IDE that need to be addressed to migrate to a new technology, and Kai will work with an LLM to generate the required source code changes.

    What is Konveyor?

    Konveyor is an upstream Cloud Native Computing Foundation (CNCF) sandbox project that helps organizations manage large-scale modernization engagements for their entire portfolio of applications. Much of the work in Konveyor is centered on surfacing information about legacy applications to empower enterprise architects to make better-informed decisions (see Figure 1).  

    Figure 1: The Konveyor project helps modernize applications by providing tools to rehost, replatform, and refactor applications to Kubernetes and cloud-native technologies. Source: konveyor.io.

    One way Konveyor surfaces information is through static code analysis using a tool called analyzer-lsp. This tool enables Konveyor to analyze code written in various languages and technologies. It does so by using the Language Server Protocol (LSP), language providers, and analysis rules, which include both community-contributed rules and custom rules an organization creates for its own components.
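
    To give a sense of what such a rule looks like, below is a minimal sketch of a custom rule in analyzer-lsp's YAML rule format. The rule ID, labels, message, and matched pattern are illustrative only, and the exact schema may vary between analyzer-lsp versions; see the community rulesets in the Konveyor project for authoritative examples.

      # Illustrative custom rule (hypothetical ruleID and pattern); this follows the
      # general shape of analyzer-lsp YAML rules but may differ in detail.
      - ruleID: acme-jms-to-reactive-00001
        category: mandatory
        effort: 3
        labels:
          - konveyor.io/source=java-ee
          - konveyor.io/target=quarkus
        description: JMS Topics should move to reactive messaging
        message: Replace the JMS `Topic` with an `Emitter` feeding a Channel.
        when:
          java.referenced:
            pattern: javax.jms.Topic
            location: TYPE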

    Through static code analysis, an organization can identify potential areas of concern in a given application. An example analysis report examines issues that should be considered when migrating a sample Java EE application (coolstore) to Quarkus.

    What is Konveyor AI (Kai)?

    Konveyor AI is a Konveyor component that implements a retrieval-augmented generation (RAG) pattern applied to the application modernization domain. Kai enables an organization to use an LLM of their choice and augment the model's knowledge by gathering related information to aid in modernization tasks, thereby avoiding fine-tuning or training the model.

    The typical RAG pattern involves gathering relevant information about a given question and bundling it with the request to an LLM. This way, an LLM can potentially answer questions it hasn’t seen in training. It is a popular approach to making an LLM more powerful without fine-tuning or training on new data.  

    Kai uses this RAG approach to provide the LLM with two types of information found within a Konveyor instance, as shown in Figure 2:

    • Analysis information: Static code analysis information with hints of how to solve an issue.
    • Solved issues: Code snippets of how similar problems were solved in the past by this organization.
    Figure 2: Diagram illustrating Konveyor's RAG approach.
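
    To make this bundling concrete, here is a rough sketch of how such a request could be assembled. The payload layout and field values below are hypothetical; they are not Kai's actual prompt format, which is adapted to the model in use.

      # Hypothetical request bundling retrieved context with the question (illustrative only).
      model: <LLM of your choice>
      messages:
        - role: system
          content: You are assisting with migrating a Java EE application to Quarkus.
        - role: user
          content: |
            Update the affected code to resolve the incident below.
            Incident (from static code analysis): <file, line number, and rule message>
            Solved issue (optional): <snippet showing how a similar issue was fixed before>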

    RAG approach: Analysis information

    Analysis information from analyzer-lsp identifies specific issues that should be considered when migrating an application to a new technology.  

    This information is organized into what we call "incidents." An incident includes:

    • The file path of the impacted file.
    • The line number of the incident.
    • A message that describes the issue and potentially hints at how to address it.

    Konveyor provides a web UI-formatted report that is easier for end users to consume, and it also provides the raw data in a YAML file.

    We can look at the raw data below for an example of a Quarkus rule that informs us we are using a JMS topic and need to change it. When we include this information in the prompt to an LLM, it helps the LLM know precisely what we want to change, thereby improving the results over the more naive approach of simply asking the LLM “Help me migrate this file to Quarkus” with no context. This technique gets more potent as we consider the custom rules an organization creates for its custom frameworks.

    The following incident comes from the rule “JMS Topic must be replaced with an Emitter” (ruleID: jms-to-reactive-quarkus-00040):

      incidents:
          - uri: file:///src/main/....../service/InventoryNotificationMDB.java
            message: "JMS `Topic`s should be replaced with Micrometer `Emitter`s feeding a Channel. See the following example of migrating\n a Topic to an Emitter:\n \n Before:\n ```\n @Resource(lookup = \"java:/topic/HELLOWORLDMDBTopic\")\n private Topic topic;\n ```\n \n After:\n ```\n @Inject\n @Channel(\"HELLOWORLDMDBTopic\")\n Emitter<String> topicEmitter;\n ```"
            lineNumber: 60

    RAG approach: Solved issues

    The second type of information in Kai's RAG approach is "solved issues": code snippets showing how similar problems were solved in the past.

    Konveyor’s Hub component provides a view of an organization's entire application portfolio, including a history of analysis information gathered over time as applications have been migrated and problems have been solved. Kai can tap into this information, in addition to looking at commits in each application’s git repo, to form code snippets we call "solved issues," which give the LLM an additional set of examples showing how the organization has addressed a given analysis issue in the past.
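
    As a purely illustrative sketch (not Kai's actual data format), a solved issue can be pictured as an incident paired with the before-and-after code the organization used to resolve it:

      # Hypothetical structure, names, and paths; real solved issues are derived from
      # Konveyor Hub analysis history and the application's git commits.
      solved_issue:
        ruleID: jms-to-reactive-quarkus-00040
        uri: file:///src/main/java/com/example/service/OrderTopicMDB.java
        before: |
          @Resource(lookup = "java:/topic/ORDERS")
          private Topic ordersTopic;
        after: |
          @Inject
          @Channel("ORDERS")
          Emitter<String> ordersEmitter;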

    The inclusion of "solved issues" is optional and not required to use Kai. However, it becomes a powerful addition to the approach as organizations proceed with large-scale modernization engagements involving hundreds to thousands of applications being migrated to new technologies.

    Kai demonstration

    We have a demonstration of using Kai with an IDE plug-in for VS Code that aids in migrating a traditional Java EE application, coolstore, to run on Quarkus. In the IDE, we can run Konveyor’s static code analysis and view identified issues, then send requests to Kai to help generate fixes for the problems identified (see Figure 3).

    Figure 3: Using Konveyor AI with the VS Code IDE plug-in.

    Recap of Konveyor AI (Kai)

    Kai:

    • Works with an LLM to help update a source code file to a new technology.
      • In the above demonstration, we focused on migrating a legacy Java EE application to Quarkus, yet this approach is independent of any specific technology. It is compatible with any languages and rules Konveyor knows about, assuming the LLM in use also knows that language.
    • Uses a RAG approach based on:
      • Static code analysis information to identify incidents to fix, along with potential hints of how to address them.
      • [Optional] Solved-issue code snippets that show how the organization has solved similar problems in the past.
    • Is model agnostic.
      • Users can bring a model of their choice to run against.
      • Kai can tweak prompts for various models to conform to recommended model-specific prompting strategies.
    • Does not require fine-tuning a model.
    • Is an early project; we are working towards an alpha release in the summer.
      • Back end code
      • VS Code IDE plug-in

    Next steps

    If you would like to learn more about Kai, check out our technical deep dive at konveyor.io.

    Consider joining the Konveyor community.

    We host bi-weekly community calls that are available on YouTube; we would love to see you there!
