Principal Technical Marketing Manager

Ortwin Schneider

Ortwin has more than 20 years of professional experience developing custom software solutions for a range of industries. He previously led an agile development team, helping customers deliver business value and sustain agility. Today he is passionate about helping customers adopt cloud-native technologies and successfully move their applications and platforms to hybrid cloud and cloud-native models. He works as a Principal Technical Marketing Manager in Red Hat's Hybrid Platforms Business Unit, focusing on cloud-native solutions that enable customers to build, modernize, and operate scalable, resilient applications.

Ortwin Schneider's contributions

Video

Bobbycar, a Red Hat Connected Vehicle Architecture Solution Pattern - Part 1: Automotive Use Cases

Ortwin Schneider

This Red Hat solution pattern implements key aspects of a modern IoT/edge architecture as a working example. It uses Red Hat OpenShift Container Platform and various middleware components optimized for cloud-native use. The architecture can serve as a foundation for an IoT/edge hybrid cloud environment supporting use cases such as over-the-air (OTA) deployments, driver monitoring, and AI/ML. Bobbycar showcases an end-to-end workflow: connecting in-vehicle components to a cloud back end, processing telemetry data in batches or as a stream, training AI/ML models, and deploying containers to the edge through a DevSecOps pipeline and GitOps.
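To give a feel for the vehicle-to-cloud telemetry flow described above, here is a minimal Python sketch of a simulated vehicle publishing records to an MQTT broker. It is an illustration only, not code from the Bobbycar pattern itself; the broker address, topic layout, and payload fields are assumptions.

```python
# Sketch: a simulated vehicle pushing telemetry to a cloud-side MQTT broker.
# Broker address, topic layout, and payload fields are illustrative assumptions
# and are not taken from the Bobbycar sources.
import json
import random
import time
import uuid

import paho.mqtt.publish as publish

BROKER_HOST = "mqtt-broker.example.com"     # hypothetical MQTT endpoint of the cloud back end
VEHICLE_ID = str(uuid.uuid4())
TOPIC = f"bobbycar/telemetry/{VEHICLE_ID}"  # assumed topic naming scheme

while True:
    record = {
        "vehicleId": VEHICLE_ID,
        "timestamp": int(time.time() * 1000),
        "speedKmh": round(random.uniform(0, 130), 1),
        "batteryPercent": round(random.uniform(20, 100), 1),
    }
    # One telemetry record per second; QoS 1 so the broker acknowledges delivery.
    publish.single(TOPIC, json.dumps(record), qos=1, hostname=BROKER_HOST, port=1883)
    time.sleep(1)
```

From there, the cloud side can consume the topic for batch or stream processing, as outlined in the workflow above.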

Video

Red Hat OpenShift Serverless - Part 1: Knative Serving

Ortwin Schneider

Red Hat OpenShift Serverless is based on the open source Knative project and provides a uniform, tightly integrated interface across the hybrid cloud ecosystem. This video covers how to deploy and update applications, route traffic to them, and configure autoscaling.
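As a rough sketch of the concepts the video walks through (deployment, traffic routing, autoscaling), the snippet below uses the Kubernetes Python client to create a Knative Service custom resource. The namespace, service name, container image, and autoscaling target are placeholder assumptions.

```python
# Sketch: create a Knative Service with an autoscaling hint and a traffic rule
# via the Kubernetes Python client. Names and image are placeholders.
from kubernetes import client, config

config.load_kube_config()  # assumes a kubeconfig with access to an OpenShift Serverless cluster

knative_service = {
    "apiVersion": "serving.knative.dev/v1",
    "kind": "Service",
    "metadata": {"name": "hello", "namespace": "demo"},
    "spec": {
        "template": {
            "metadata": {
                # Autoscaler hint: target roughly 10 concurrent requests per pod.
                "annotations": {"autoscaling.knative.dev/target": "10"}
            },
            "spec": {
                "containers": [
                    {
                        "image": "quay.io/example/hello:latest",  # placeholder image
                        "ports": [{"containerPort": 8080}],
                    }
                ]
            },
        },
        # Send all traffic to the latest ready revision; percentages can instead be
        # split across named revisions for blue/green or canary rollouts.
        "traffic": [{"latestRevision": True, "percent": 100}],
    },
}

api = client.CustomObjectsApi()
api.create_namespaced_custom_object(
    group="serving.knative.dev",
    version="v1",
    namespace="demo",
    plural="services",
    body=knative_service,
)
```

Updating the template spec (for example, with a new image tag) creates a new revision, and the traffic block controls how requests are routed between revisions.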

Video

Processing IoT data and serving AI/ML models with OpenShift Serverless

Ortwin Schneider

Explore Knative Serving, Eventing, and Functions through an example use case. You’ll see how to collect telemetry data from simulated vehicles, process it with OpenShift Serverless, and use it to train a machine learning model with Red Hat OpenShift AI, Red Hat's MLOps platform. The model is then deployed as a Knative Service that provides the inference endpoint for the business application.
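As a minimal sketch of what calling such an inference endpoint might look like from a business application, the snippet below posts one telemetry record and reads back a prediction. The URL, request schema, and response shape are assumptions for illustration, not the actual Bobbycar or OpenShift AI interfaces.

```python
# Sketch: send a telemetry record to a model served behind a Knative Service
# and print the prediction. Endpoint URL and payload schema are hypothetical.
import requests

INFERENCE_URL = "https://driver-model.demo.example.com/predict"  # hypothetical Knative route

telemetry = {
    "vehicleId": "demo-vehicle-1",
    "speedKmh": 87.5,
    "batteryPercent": 64.0,
}

response = requests.post(INFERENCE_URL, json=telemetry, timeout=10)
response.raise_for_status()
print(response.json())  # e.g. a predicted label or score returned by the model
```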