
OpenCode: A model-neutral AI coding assistant for OpenShift Dev Spaces

April 22, 2026
Rohan Kumar
Related topics:
Artificial intelligence, Developer tools, IDEs
Related products:
Developer Sandbox, Red Hat OpenShift Dev Spaces

    AI coding assistants have shifted from novelty to necessity, changing how developers work. However, most current tools come with a hidden constraint: they are tightly coupled to a primary model provider.

    What if your development environment could switch between GPT, Claude, Gemini, or even local models without changing tools?

    That's where OpenCode comes in. When combined with cloud development platforms like Red Hat OpenShift Dev Spaces (based on Eclipse Che), it unlocks a flexible and effective new way to build with AI.

    OpenCode: A model-neutral AI coding assistant

    OpenCode CLI is part of a new wave of agentic tools designed to act as intelligent coding assistants. It is available as a terminal interface, desktop application, and IDE extension, making it flexible for different development workflows.

    While it sits alongside tools like Codex CLI, Claude Code, and Gemini CLI, OpenCode CLI takes a fundamentally different approach by being model-neutral. Instead of being tied to a single provider, OpenCode supports a wide range of providers (more than 75, per the current documentation), including OpenAI, Anthropic Claude, Google Gemini, and local large language models (LLMs) via Ollama.

    This allows developers to switch models on demand, compare outputs, avoid vendor lock-in, and even run fully offline when configured with local models (for example, Ollama), subject to hardware constraints.
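    That flexibility is driven by OpenCode's JSON configuration. As a sketch only (the field names and model IDs below are illustrative and should be checked against the OpenCode configuration documentation), an opencode.json that registers a local Ollama endpoint alongside a cloud default might look like:

    ```json
    {
      "$schema": "https://opencode.ai/config.json",
      "model": "anthropic/claude-sonnet-4",
      "provider": {
        "ollama": {
          "npm": "@ai-sdk/openai-compatible",
          "options": {
            "baseURL": "http://localhost:11434/v1"
          },
          "models": {
            "llama3.2": {}
          }
        }
      }
    }
    ```

    With a configuration like this in place, you can move between the cloud default and the local model from OpenCode's model picker without changing tools or leaving the session.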

    How OpenCode differs from other CLI agents

    While it might seem like just another agentic CLI, OpenCode distinguishes itself by supporting a wide range of model providers. The following table shows a high-level comparison of OpenCode CLI versus other agentic CLIs.

    | Capability                | OpenCode CLI      | Typical agentic CLIs |
    |---------------------------|-------------------|----------------------|
    | License                   | Open source (MIT) | Proprietary          |
    | Model support             | Multi-model (75+) | Single provider      |
    | Vendor lock-in            | No                | Yes                  |
    | Local model support       | Yes               | Limited or none      |
    | Cost optimization         | Flexible          | Restricted           |
    | Air-gapped compatibility  | Possible          | Rare                 |
    | Enterprise flexibility    | High              | Moderate to low      |
    | Workflow changes required | Minimal           | Often required       |

    Run OpenCode in Red Hat OpenShift Dev Spaces

    We will run OpenCode inside our cloud development environment, powered by OpenShift Dev Spaces, as illustrated in Figure 1. This setup lets developers use AI assistance in their workspace and switch models during development without interruption.

    Figure 1: Using OpenCode inside Red Hat OpenShift Dev Spaces to connect to on-premises or cloud-hosted LLM providers.

    How to set up your environment

    Before you begin, ensure you have a Developer Sandbox account. After you create an account, you can access OpenShift Dev Spaces.

    Follow these steps to set up your cloud development environment:

    1. To access Red Hat OpenShift Dev Spaces, navigate to the user dashboard (Figure 2).

      Figure 2: Red Hat OpenShift Dev Spaces user dashboard.
    2. To access OpenCode, ensure it is available in your development environment. We will use this repository, which contains the OpenCode CLI and other AI tools.
    3. In the user dashboard, go to the Create Workspace tab and enter the repository URL for this activity: https://github.com/che-incubator/cli-ai-tools/. Then select Create & Open (Figure 3).

      Figure 3: Starting the cloud development environment from the GitHub URL.
    4. During initialization, you might see a prompt asking if you trust the authors of the files in this workspace (Figure 4). To proceed, select Yes, I trust the authors.

      Figure 4: VS Code - Open Source ("Code - OSS") warning pop-up.
    5. On the left sidebar menu, choose Terminal and open it (Figure 5).

      Figure 5: Opening the terminal in the OpenShift Dev Spaces cloud IDE.
    6. When the terminal comes up, type opencode and wait for OpenCode to launch, as shown in Figure 6.

      Figure 6: OpenCode launched in the terminal.

    Install OpenCode from the VS Code Marketplace

    Alternatively, you can install the OpenCode extension from the VS Code Marketplace (Figure 7). This still requires you to have the opencode binary in your environment.
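    If the binary is not already present in your image, the OpenCode documentation describes an install script and an npm package. The commands below are a sketch based on those docs at the time of writing; verify them against the current installation instructions:

    ```shell
    # Option 1: install script (downloads the opencode binary onto your PATH)
    curl -fsSL https://opencode.ai/install | bash

    # Option 2: npm package (requires Node.js)
    npm install -g opencode-ai

    # Verify the installation
    opencode --version
    ```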

    Figure 7: OpenCode extension in the VS Code Marketplace.

    Once OpenCode is ready, it usually defaults to OpenCode Zen. Most popular model providers are preloaded. You can add the credentials for the model provider through the /connect command. Figure 8 shows the configuration for the Google Gemini model.

    Figure 8: Choosing a large language model provider using the OpenCode /connect command.

    Configure authentication with environment variables

    You can also preconfigure credentials using environment variables (see the OpenCode documentation). Follow these steps to configure Google Vertex AI authentication:

    1. On your local machine, run the following commands. The first opens a browser for authentication and saves credentials to ~/.config/gcloud/application_default_credentials.json:

      gcloud auth application-default login
      gcloud auth application-default set-quota-project <your-gcp-project-name>
    2. Create a Kubernetes secret from application_default_credentials.json (ensure you have appropriate permissions in the namespace):

      kubectl create secret generic gcloud-adc -n <your-namespace> \
        --from-file=adc.json=$HOME/.config/gcloud/application_default_credentials.json
    3. Label and annotate the secret so that OpenShift Dev Spaces mounts it automatically:

      kubectl label secret gcloud-adc -n <your-namespace>          \
          controller.devfile.io/mount-to-devworkspace=true         \
          controller.devfile.io/watch-secret=true --overwrite
      kubectl annotate secret gcloud-adc -n <your-namespace>       \
          controller.devfile.io/mount-path=/credentials            \
          controller.devfile.io/mount-as=file --overwrite
    4. Create a ConfigMap to mount environment variables automatically for your workspaces:

      kubectl create configmap ai-tools-env  -n <your-namespace>             \
        --from-literal=GOOGLE_APPLICATION_CREDENTIALS=/credentials/adc.json  \
        --from-literal=GOOGLE_CLOUD_PROJECT=<your-gcp-project-name>          \
        --dry-run=client -o yaml | kubectl apply -f -
      kubectl label configmap ai-tools-env -n <your-namespace>     \
        controller.devfile.io/mount-to-devworkspace=true           \
        controller.devfile.io/watch-configmap=true  --overwrite
      kubectl annotate configmap ai-tools-env -n <your-namespace>   \
        controller.devfile.io/mount-as=env --overwrite
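    After restarting the workspace, you can confirm that the mounts took effect from the workspace terminal. This is a sanity-check sketch; the paths follow from the mount annotations above:

    ```shell
    # The secret should be mounted as a file under /credentials
    ls -l /credentials/adc.json

    # The ConfigMap entries should appear as environment variables
    echo "$GOOGLE_APPLICATION_CREDENTIALS"   # expect: /credentials/adc.json
    echo "$GOOGLE_CLOUD_PROJECT"
    ```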

    Interact with OpenCode

    Once you configure the model, you can issue commands to OpenCode. Figure 9 shows an example prompt for reviewing changes from the last commit.

    Figure 9: Giving a sample prompt to OpenCode to analyze changes from the previous commit.

    Conclusion

    This blog post explored how the OpenCode CLI supports a wide range of model providers for AI-powered development. By integrating it with OpenShift Dev Spaces, you can use a single CLI to access multiple model providers or run on-premises models in your development environment.

    This setup gives teams the flexibility to choose the right model for each task while maintaining control over their infrastructure, costs, and data. Whether you use cloud-based providers or local LLMs, OpenCode enables a consistent, portable, AI-assisted development experience.

    If you want to learn more, check out:

    • Red Hat OpenShift Dev Spaces
    • OpenCode CLI
    • Eclipse Che
    • OpenCode GitHub repository
