
Creating an AI-powered service for detecting fraudulent card transactions

Train and deploy an AI model using OpenShift AI, then integrate it into an application running on OpenShift

July 29, 2024
Yashwanth Maheshwaram
Related topics: Artificial intelligence, Kubernetes, Python
Related products: Red Hat OpenShift AI


    As AI and machine learning become integral to modern applications, developers need robust platforms that simplify the development and deployment process. Red Hat OpenShift AI offers a powerful and flexible environment tailored for these needs, making it easier for AI engineers and DevOps teams to collaborate and deliver high-quality AI solutions. In this blog post, we’ll explore how to set up a fraud detection system using Red Hat OpenShift AI, complete with a practical example and a demo API.

    Developer experience with OpenShift

    OpenShift AI simplifies the journey for teams building AI applications. Using OpenShift AI together with Red Hat OpenShift, developers get a streamlined, efficient workflow. Here is an overview of the process:

    OpenShift Developer Experience for Intelligent Applications
    1. Access Jupyter Notebook: Begin by accessing the Jupyter Notebook on OpenShift AI. This environment is ideal for AI engineers to experiment and develop their models.
    2. Gather & Prepare Data: Collect and preprocess the data necessary for training the fraud detection model. Ensuring high-quality data is crucial for building an accurate model.
    3. Build the Model: Develop the fraud detection model using machine learning algorithms. The Jupyter Notebook provides a convenient interface for this step.
    4. Train the Model: Train your model using the prepared dataset. This involves adjusting the model parameters to improve its accuracy in detecting fraudulent activities.
    5. Save the Model: Once the model is trained, save it for future deployment. This step ensures that the model can be easily accessed and utilized in the application.
    6. Deploy the Model: Deploy the trained model using OpenShift AI. This platform streamlines the deployment process, making it simple for AI engineers and DevOps teams to collaborate.
    7. Integrate the Model: Integrate the deployed model into your application locally. This step involves ensuring that the application can interact with the model to perform real-time fraud detection.
    8. Deploy the Application: Finally, deploy the application, ensuring that the fraud detection model is fully functional within the operational environment.

    Red Hat Developer Sandbox, along with Red Hat OpenShift and Red Hat OpenShift AI, is free to use and readily available, so you can try out the developer experience with these products. Sign up for the Red Hat Developer Sandbox now!

    Build, train, store, and serve the model using OpenShift AI Sandbox

    Red Hat OpenShift AI streamlines the entire lifecycle of AI models. You can build and train your machine learning models within a Jupyter Notebook environment on OpenShift AI, ensuring a seamless development process. Once trained, models are easily stored using integrated solutions like MinIO, providing secure and scalable storage. Deploying these models for serving is straightforward with OpenShift AI, enabling efficient and reliable inference. This end-to-end support simplifies AI workflows, allowing teams to focus on innovation.
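    The notebook code itself lives in the tutorial linked below, but as a rough illustration of steps 2 through 5 (gather and prepare data, build, train, and save the model), a minimal scikit-learn sketch might look like the following. The dataset file, feature names, and model choice here are illustrative assumptions, not the tutorial's exact code:

    # Minimal sketch of preparing data, training a classifier, and saving the artifacts.
    # The CSV name, columns, and model are placeholders for illustration only.
    import joblib
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler

    df = pd.read_csv("card_transactions.csv")  # hypothetical labeled dataset
    features = ["distance", "ratio_to_median", "pin", "chip", "online"]
    X, y = df[features], df["fraud"]

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    # Fit a scaler on the training data; the serving code reuses this same scaler.
    scaler = StandardScaler().fit(X_train)

    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(scaler.transform(X_train), y_train)
    print("test accuracy:", model.score(scaler.transform(X_test), y_test))

    # Persist both artifacts: the scaler for the API service and the model for the model server.
    joblib.dump(scaler, "scaler.pkl")
    joblib.dump(model, "model.pkl")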

    For a detailed example of implementing a fraud detection system, you can refer to the OpenShift AI tutorial - Fraud detection example. This tutorial provides a comprehensive guide to setting up and deploying a fraud detection model using OpenShift AI. 

    By the end of section 4.2 of that tutorial, you will have a model deployed on OpenShift AI. Once you have that, you will need a service that handles connecting to the model.

    Model Serving on OpenShift AI

    Create a service that talks to the model

    Integrate the fraud detection AI model into your API to identify fraudulent transactions. The provided Python application uses Flask to set up an API endpoint that processes transaction data. It loads a pre-trained scaler for data normalization and constructs a request to the model server. Depending on the environment, the app uses a different URL for the model server, ensuring flexibility. When a transaction is submitted, the app preprocesses the data, sends it to the model server for inference, and returns a response indicating if the transaction is fraudulent. For more details, see app.py.

    Here’s a brief overview of what's going on:

    1. Receives data via POST request
    2. Scales the data using a pre-loaded scaler
    3. Sends the scaled data to a model server
    4. Processes the model’s response to classify the data as ‘fraud’ or ‘not fraud’
    5. Responds with the classification result

    This internal platform service enables other applications on your cluster to interface with the AI model. The service accepts an array in the format:

    [distance, ratio_to_median, pin, chip, online]

    It responds with a message indicating whether the transaction is fraudulent:

    {
       "message": "fraud/not fraud"
    }
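    The full implementation is in app.py. As a minimal sketch of the flow above, here is roughly what such a service can look like; the inference URL, the dense_input tensor name, the request/response payload shape, and the 0.5 threshold are illustrative assumptions rather than values taken from the repository:

    # Sketch of a Flask service that scales incoming data and queries a model server.
    # URL, payload field names, and threshold are assumptions; see app.py for the real code.
    import os

    import joblib
    import requests
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # Pre-trained scaler saved during training, used to normalize incoming data.
    scaler = joblib.load("scaler.pkl")

    # The model server URL differs between local development and the cluster.
    MODEL_SERVER_URL = os.getenv(
        "MODEL_SERVER_URL", "http://localhost:8080/v2/models/fraud/infer"
    )

    @app.route("/", methods=["POST"])
    def predict():
        # 1. Receive [distance, ratio_to_median, pin, chip, online] via POST.
        data = request.get_json()["data"]

        # 2. Scale the data using the pre-loaded scaler.
        scaled = scaler.transform([data]).tolist()[0]

        # 3. Send the scaled data to the model server for inference.
        payload = {
            "inputs": [
                {"name": "dense_input", "shape": [1, 5], "datatype": "FP32", "data": scaled}
            ]
        }
        response = requests.post(MODEL_SERVER_URL, json=payload)
        prediction = response.json()["outputs"][0]["data"][0]

        # 4. and 5. Classify the result and respond.
        result = "fraud" if prediction >= 0.5 else "not fraud"
        return jsonify({"message": result})

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=5000)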

    Deploying the service to the Developer Sandbox

    OpenShift simplifies application deployment through its Source-to-Image (S2I) feature, which automates the creation of container images directly from source code. This makes it an ideal choice for developers who want to deploy applications quickly without diving into the complexities of container management, letting them focus on code rather than on building and managing images in a cloud-native environment.

    1. Copy the Repository URL: Start by copying the URL of your repository.
    2. Log in to OpenShift Developer Sandbox: Access your OpenShift Developer Sandbox and log in.
    3. Navigate to the Developer View: Once logged in, switch to the Developer view.
    4. Add a New Application: Click on the “+Add” button.
    5. Import from Git: Select the “Import from Git” option.
    6. Paste Repository URL: Paste the URL of your repository into the provided field.
    7. Configure Port and Create: Set the port to 5000 and click on the “Create” button. 
    8. Monitor Build Progress: Wait for the build process to complete. Once finished, you should see your application in the topology view.
    9. Test Your Service: To verify the deployment, execute a POST request in your terminal (a Python equivalent follows after this step):
      curl -X POST http://<ENTER YOUR ENDPOINT> -H "Content-Type: application/json" -d '{"data": [100, 1.2, 0.0, 0.0, 1.0]}'
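    If you prefer Python over curl, here is an equivalent quick test; the endpoint URL is a placeholder for your application's route:

    # Quick test of the deployed service; replace the URL with your route's endpoint.
    import requests

    resp = requests.post(
        "http://<ENTER YOUR ENDPOINT>",
        json={"data": [100, 1.2, 0.0, 0.0, 1.0]},
    )
    print(resp.json())  # for example: {"message": "fraud"}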

    Test a bit more

    Case 1: Not a fraudulent transaction

    In this example, the user is buying a coffee. The parameters given to the model are:
    • same location as the last transaction (distance=0)
    • same median price as the last transaction (ratio_to_median=1)
    • using a pin number (pin=1)
    • using the credit card chip (chip=1)
    • not an online transaction (online=0)
    [0.0, 1.0, 1.0, 1.0, 0.0]
    curl -X POST http://<ENTER YOUR ENDPOINT> -H "Content-Type: application/json" -d '{"data": [0.0, 1.0, 1.0, 1.0, 0.0]}'

    Case 2: Fraudulent transaction

    In this example, someone stole the user's credit card and is buying something online. The parameters given to the model are:
    • very far away from the last transaction (distance=100)
    • median price similar to the last transaction (ratio_to_median=1.2)
    • not using a pin number (pin=0)
    • not using the credit card chip (chip=0)
    • is an online transaction (online=1)
    [100, 1.2, 0.0, 0.0, 1.0]
    curl -X POST http://<ENTER YOUR ENDPOINT> -H "Content-Type: application/json" -d '{"data": [100, 1.2, 0.0, 0.0, 1.0]}'
    
    Try your own requests and have some fun :)

    By leveraging Red Hat OpenShift AI, businesses can significantly enhance the developer experience for AI engineers, platform engineers, and application developers. Together, Red Hat OpenShift and OpenShift AI simplify the development and deployment process, providing a seamless experience for AI engineers and DevOps teams. By following the steps outlined in this blog post and using the provided resources, you can build, deploy, and serve intelligent applications that leverage AI models.

    Last updated: August 1, 2024
