Overview: Integrate a private AI coding assistant into your CDE using Ollama, Continue, and OpenShift Dev Spaces
Unsurprisingly, developers are looking for ways to adopt powerful new technologies like AI assistants to improve their workflow and productivity. However, many companies are reluctant to allow such technology due to privacy, security, and IP law concerns.
This activity addresses those privacy and security concerns by showing how to deploy and integrate a private AI assistant in an emulated air-gapped, on-premises environment. We will guide you through setting up a cloud development environment (CDE) using Ollama, Continue, and the Llama3 and Starcoder2 large language models (LLMs) with Red Hat OpenShift Dev Spaces, empowering you to code faster and more efficiently.
Ready to streamline your cloud development workflow and bring some AI into it? Grab your favorite beverage, and let's embark on this journey to unlock the full potential of the cloud development experience!
Prerequisites

To get the full benefit from this lesson, you need to:
In this learning path, you will:
- Set up a CDE using Ollama, Continue, Llama3, and Starcoder2 large language models (LLMs) with Red Hat OpenShift Dev Spaces.
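To give a feel for how the pieces fit together, here is a minimal sketch of a Continue `config.json` that points the assistant at a private Ollama server inside the cluster. The hostname `ollama.example.svc` and the `starcoder2:3b` model tag are placeholder assumptions; substitute the service address and model tags used in your own deployment.

```json
{
  "models": [
    {
      "title": "Llama 3",
      "provider": "ollama",
      "model": "llama3",
      "apiBase": "http://ollama.example.svc:11434"
    }
  ],
  "tabAutocompleteModel": {
    "title": "StarCoder2",
    "provider": "ollama",
    "model": "starcoder2:3b",
    "apiBase": "http://ollama.example.svc:11434"
  }
}
```

Because the `apiBase` resolves to a service inside your own cluster, prompts and code never leave the air-gapped environment, which is the core of the privacy story this learning path builds on.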