Implement AI-driven edge to core data pipelines

May 24, 2024
Bruno Meseguer
Related topics:
Artificial intelligenceData ScienceEdge computingIntegration
Related products:
Red Hat build of Apache CamelRed Hat OpenShiftRed Hat OpenShift AIRed Hat OpenShift Data FoundationRed Hat Service Interconnect

    AI/ML and edge computing intersect in a variety of businesses. In the retail industry, for example, customer behavior and purchase patterns can be used to determine promotions. In a manufacturing facility, likewise, inference on production data can flag failing machines, products that fall short of quality standards, and so on. The basic premise is the same: collect data, run inference on it, and make decisions based on the results.
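    The collect-infer-decide loop described above can be sketched in a few lines of Python. This is purely illustrative; the purchase data and the promotion rule are invented for this example, not taken from the demo:

```python
from collections import Counter

def decide_promotion(purchases, threshold=3):
    """Collect purchase events, infer the most popular product,
    and decide whether it warrants a promotion."""
    counts = Counter(p["product"] for p in purchases)   # collect
    product, hits = counts.most_common(1)[0]            # infer
    return product if hits >= threshold else None       # decide

purchases = [
    {"product": "coffee"}, {"product": "tea"},
    {"product": "coffee"}, {"product": "coffee"},
]
print(decide_promotion(purchases))  # coffee appears 3 times -> promote it
```

    In a real deployment the "infer" step would call a trained model served by the platform rather than a simple counter, but the overall shape of the loop is the same.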

    In this retail example, a superstore company has branches spread across the country. Each branch decides which products it wants to sell and can customize an intelligent AI/ML app for its customers.

    This article presents a video demonstration of an opinionated solution that automates a continuous cycle of releasing and deploying new AI/ML models.

    About the solution pattern

    The solution goes well beyond the core topic of creating AI/ML models and running inference on them. It provides all the supporting systems and data pipelines, working in concert, to execute end-to-end workflows that ensure the most up-to-date information is delivered to clients.

    As the business evolves and grows, new products are offered to customers. When adding new products to the platform, upgrade cycles are initiated by acquiring and moving product information over data bridges between the near-edge and core data centers. Figure 1 depicts this process.

    Figure 1: Transporting product information over data bridges between the near-edge and core data centers as the platform evolves.
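    One way such a data bridge might look as an integration flow is a small Camel route that picks up new product records at the near edge and publishes them toward the core. The YAML DSL fragment below is a hedged sketch, not the demo's actual route; the file path, topic name, and broker property are assumptions:

```yaml
# Illustrative Camel route (YAML DSL). Endpoint URIs are invented
# for this sketch and would differ in the real solution pattern.
- route:
    id: product-bridge
    from:
      uri: "file:/data/new-products?include=.*\\.json"
      steps:
        - log: "Bridging product file ${header.CamelFileName}"
        - to: "kafka:product-updates?brokers={{core.kafka.brokers}}"
```

    With Service Interconnect in place, the Kafka brokers at the core site can be reached from the near edge as if they were local, so the route itself needs no knowledge of the network topology.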

    The ecosystem combines several capabilities on top of Red Hat OpenShift: highly dynamic connectivity with Red Hat Service Interconnect, integration flows with the Red Hat build of Apache Camel, in-sync platform automation with Tekton and Red Hat OpenShift AI, and object storage with OpenShift Data Foundation. Together, they form a closed loop that allows the platform to evolve with agility.
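    The automation side of that loop is typically expressed as a Tekton pipeline. The fragment below is a hypothetical sketch of what a model-refresh pipeline could look like; the task names and parameters are invented to illustrate the shape of the automation, not copied from the demo:

```yaml
# Hypothetical Tekton pipeline: fetch new product data, retrain the
# model with OpenShift AI, and publish it to object storage.
apiVersion: tekton.dev/v1
kind: Pipeline
metadata:
  name: model-refresh
spec:
  params:
    - name: product-dataset
      type: string
  tasks:
    - name: fetch-data
      taskRef:
        name: fetch-product-data
      params:
        - name: dataset
          value: $(params.product-dataset)
    - name: train-model
      runAfter: ["fetch-data"]
      taskRef:
        name: train-in-openshift-ai
    - name: publish-model
      runAfter: ["train-model"]
      taskRef:
        name: push-to-object-storage
```

    Triggering such a pipeline whenever new product information arrives over the data bridge is what keeps the deployed models in sync with the evolving catalog.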

    Video demo

    Watch the demonstration in the embedded video below:

    Additional resources

    The solution is documented in detail in the Solution Patterns portal, where you can read more about the use case and its architecture, and find guided instructions for provisioning and running it yourself.

    You can explore more solution patterns that show how different Red Hat technologies can be combined to solve business needs elegantly.

    Keep learning by following the resources listed below:

    • Find detailed information about this article’s demo in the Solution Pattern portal.
    • Explore other solution patterns.
    • Read the Apache Camel page on Red Hat Developer to learn more about the capabilities of Apache Camel.
    • Use Service Interconnect to connect applications and microservices. 
    • Create and manage your AI/ML models using Red Hat OpenShift AI.
