Principal Technical Marketing Manager

Ortwin Schneider

Ortwin has more than 15 years of professional experience developing custom software solutions for various industries. His focus has been on solutions based on Java EE, Angular, and mobile applications, as well as on MDD/MDA (Model-Driven Development / Model-Driven Architecture) approaches. He is highly interested in software architectures that leverage the hybrid cloud and cloud-native model. Over the past 9 years he led an agile development team, helping customers deliver business value and sustain agility. Today Ortwin helps Red Hat customers adopt the cloud-native path with Red Hat's comprehensive middleware portfolio.

Ortwin Schneider's contributions

Video

Bobbycar, a Red Hat Connected Vehicle Architecture Solution Pattern - Part 1: Automotive Use Cases

Ortwin Schneider

This Red Hat solution pattern implements key aspects of a modern IoT/edge architecture as a working example. It uses Red Hat OpenShift Container Platform and various middleware components optimized for cloud-native use. This enterprise architecture can serve as a foundation for an IoT/edge hybrid cloud environment supporting use cases such as over-the-air (OTA) deployments, driver monitoring, AI/ML, and others. Bobbycar aims to showcase an end-to-end workflow: connecting in-vehicle components to a cloud back end, processing telemetry data in batches or as a stream, training AI/ML models, and deploying containers to the edge through a DevSecOps pipeline and GitOps.
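To make the telemetry side of such a workflow concrete, here is a minimal sketch of a simulated vehicle publishing JSON telemetry over MQTT to a cloud-side broker. The broker host, topic layout, and payload fields are illustrative assumptions, not details taken from the Bobbycar pattern itself.

```python
# Sketch: a simulated vehicle publishing telemetry over MQTT.
# Broker host, topic, and payload fields are illustrative assumptions.
import json
import random
import time

import paho.mqtt.publish as publish

BROKER_HOST = "mqtt-broker.example.com"   # hypothetical cloud-side broker
TOPIC = "bobbycar/telemetry/vehicle-001"  # hypothetical topic layout

while True:
    payload = {
        "vin": "WBA-000001",
        "speed_kmh": round(random.uniform(0, 130), 1),
        "engine_temp_c": round(random.uniform(70, 110), 1),
        "timestamp": int(time.time() * 1000),
    }
    # Publish one telemetry message per second
    publish.single(TOPIC, json.dumps(payload), hostname=BROKER_HOST, port=1883, qos=1)
    time.sleep(1)
```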

Video

Red Hat OpenShift Serverless - Part 1: Knative Serving

Ortwin Schneider

Red Hat OpenShift Serverless is based on the Knative open source project. It provides a uniform and tightly integrated interface across the hybrid cloud ecosystem. This video covers how to deploy and update applications, route traffic to them, and configure autoscaling.
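As a rough illustration of the Serving model described above, the sketch below creates a Knative Service with the Kubernetes Python client. The container image, namespace, and autoscaling annotation values are placeholder assumptions, not taken from the video; the same resource could equally be expressed as a YAML manifest or a `kn service create` command.

```python
# Sketch: creating a Knative Service (serving.knative.dev/v1) with the
# Kubernetes Python client. Image, namespace, and annotation values are
# placeholder assumptions.
from kubernetes import client, config

config.load_kube_config()  # assumes a local kubeconfig with cluster access

knative_service = {
    "apiVersion": "serving.knative.dev/v1",
    "kind": "Service",
    "metadata": {"name": "hello"},
    "spec": {
        "template": {
            "metadata": {
                "annotations": {
                    # Knative autoscaling bounds (scale to zero when idle)
                    "autoscaling.knative.dev/min-scale": "0",
                    "autoscaling.knative.dev/max-scale": "5",
                }
            },
            "spec": {
                "containers": [
                    {"image": "quay.io/example/hello:latest"}  # placeholder image
                ]
            },
        },
        # Send all traffic to the latest ready revision
        "traffic": [{"latestRevision": True, "percent": 100}],
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="serving.knative.dev",
    version="v1",
    namespace="demo",  # placeholder namespace
    plural="services",
    body=knative_service,
)
```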

Video

Processing IoT data and serving AI/ML models with OpenShift Serverless

Ortwin Schneider

Explore Knative Serving, Eventing, and Functions through an example use case. You’ll see how to collect telemetry data from simulated vehicles, process the data with OpenShift Serverless, and use the data to train a machine learning model with Red Hat OpenShift AI, Red Hat's MLOps platform. The model will then be deployed as a Knative Service, providing the inference endpoint for our business application.
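A hedged sketch of that last step, calling the deployed model's inference endpoint from an application, might look like the following. The endpoint URL, feature names, and request/response shape are assumptions for illustration rather than details from the video.

```python
# Sketch: calling a model served as a Knative Service over HTTP.
# The endpoint URL, feature names, and payload format are assumptions.
import requests

# Hypothetical route of the Knative Service serving the model
INFERENCE_URL = "https://model-demo.apps.example.com/v2/models/telemetry/infer"

# Hypothetical telemetry features for one vehicle
payload = {
    "inputs": [
        {
            "name": "features",
            "shape": [1, 3],
            "datatype": "FP32",
            "data": [92.5, 103.2, 0.87],  # e.g. speed, engine temp, battery level
        }
    ]
}

response = requests.post(INFERENCE_URL, json=payload, timeout=10)
response.raise_for_status()
print(response.json())  # model prediction returned by the inference endpoint
```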