OpenCode: A model-neutral AI coding assistant for OpenShift Dev Spaces

April 22, 2026
Rohan Kumar
Related topics:
Artificial intelligence, Developer tools, IDEs
Related products:
Developer Sandbox, Red Hat OpenShift Dev Spaces

    AI coding assistants have shifted from novelty to necessity, changing how developers work. However, most current tools come with a hidden constraint: they are tightly coupled to a primary model provider.

    What if your development environment could switch between GPT, Claude, Gemini, or even local models without changing tools?

    That's where OpenCode comes in. When combined with cloud development platforms like Red Hat OpenShift Dev Spaces (based on Eclipse Che), it unlocks a flexible and effective new way to build with AI.

    OpenCode: A model-neutral AI coding assistant

    OpenCode CLI is part of a new wave of agentic tools designed to act as intelligent coding assistants. It is available as a terminal interface, desktop application, and IDE extension, making it flexible for different development workflows.

    While it sits alongside tools like Codex CLI, Claude Code, and Gemini CLI, OpenCode CLI takes a fundamentally different approach by being model-neutral. Instead of being tied to a single provider, OpenCode supports a wide range of providers (over 75 as per current documentation), including OpenAI, Anthropic Claude, Google Gemini, and local large language models (LLMs) via Ollama.

    This allows developers to switch models on demand, compare outputs, avoid vendor lock-in, and even run fully offline when configured with local models (for example, Ollama), subject to hardware constraints.
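    Beyond switching interactively, a default model can be pinned ahead of time in OpenCode's JSON configuration file. A minimal sketch, assuming the global config path and `model` key described in the OpenCode documentation; the model ID shown is a placeholder example, not a recommendation:

    ```shell
    # Write a minimal global OpenCode config that pins a default model.
    # Path and keys follow the OpenCode docs; the model ID is a
    # hypothetical example in the documented provider/model form.
    mkdir -p "$HOME/.config/opencode"
    cat > "$HOME/.config/opencode/opencode.json" <<'EOF'
    {
      "$schema": "https://opencode.ai/config.json",
      "model": "anthropic/claude-sonnet-4-5"
    }
    EOF
    ```

    Switching providers is then a one-line change to the `model` value, with no change to the rest of the workflow.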

    How OpenCode differs from other CLI agents

    While it might seem like just another agentic CLI, OpenCode distinguishes itself by supporting a wide range of model providers. The following table shows a high-level comparison of OpenCode CLI versus other agentic CLIs.

    Capability                | OpenCode CLI      | Typical agentic CLIs
    License                   | Open source (MIT) | Proprietary
    Model support             | Multi-model (75+) | Single provider
    Vendor lock-in            | No                | Yes
    Local model support       | Yes               | Limited or none
    Cost optimization         | Flexible          | Restricted
    Air-gapped compatibility  | Possible          | Rare
    Enterprise flexibility    | High              | Moderate to low
    Workflow changes required | Minimal           | Often required

    Run OpenCode in Red Hat OpenShift Dev Spaces

    We will run OpenCode inside our cloud development environment, powered by OpenShift Dev Spaces, as illustrated in Figure 1. This setup lets developers use AI assistance in their workspace and switch models during development without interruption.

    Figure 1: Using OpenCode inside Red Hat OpenShift Dev Spaces to connect to on-premises or cloud-hosted LLM providers.

    How to set up your environment

    Before you begin, ensure you have a Developer Sandbox account. After you create an account, you should be able to access OpenShift Dev Spaces. 

    Follow these steps to set up your cloud development environment:

    1. To access Red Hat OpenShift Dev Spaces, navigate to the user dashboard (Figure 2).

      Figure 2: Red Hat OpenShift Dev Spaces user dashboard.
    2. To access OpenCode, ensure it is available in your development environment. We will use this repository, which contains the OpenCode CLI and other AI tools.
    3. In the user dashboard, go to the Create Workspace tab and enter the repository URL for this activity: https://github.com/che-incubator/cli-ai-tools/. Then select Create & Open (Figure 3).

      Figure 3: Starting the cloud development environment from the GitHub URL.
    4. During initialization, you might see a prompt asking if you trust the authors of the files in this workspace (Figure 4). To proceed, select Yes, I trust the authors.

      Figure 4: VS Code - Open Source ("Code - OSS") warning pop-up.
    5. On the left sidebar menu, choose Terminal and open it (Figure 5).

      Figure 5: Opening the terminal in the OpenShift Dev Spaces cloud IDE.
    6. When the terminal comes up, type opencode and wait for OpenCode to launch, as shown in Figure 6.
      Figure 6: OpenCode launched in the terminal.

    Install OpenCode from the VS Code Marketplace

    Alternatively, you can install the OpenCode extension from the VS Code Marketplace (Figure 7). This still requires you to have the opencode binary in your environment.

    Figure 7: OpenCode extension in the VS Code Marketplace.

    Once OpenCode is ready, it usually defaults to OpenCode Zen. Most popular model providers are preloaded. You can add the credentials for the model provider through the /connect command. Figure 8 shows the configuration for the Google Gemini model.

    Figure 8: Choosing a large language model provider using the OpenCode /connect command.

    Configure authentication with environment variables

    You can also preconfigure credentials using environment variables (see the OpenCode documentation). Follow these steps to configure Google Vertex AI authentication:

    1. On your local machine, run the following commands (this opens a browser for authentication and saves credentials to ~/.config/gcloud/application_default_credentials.json):

      gcloud auth application-default login
      gcloud auth application-default set-quota-project cloudability-it-gemini
    2. Create a Kubernetes secret from application_default_credentials.json (ensure you have appropriate permissions in the namespace):

      kubectl create secret generic gcloud-adc -n <your-namespace> \
        --from-file=adc.json=$HOME/.config/gcloud/application_default_credentials.json
    3. Label and annotate the secret so that OpenShift Dev Spaces mounts it automatically:

      kubectl label secret gcloud-adc -n <your-namespace> \
        controller.devfile.io/mount-to-devworkspace=true \
        controller.devfile.io/watch-secret=true \
        --overwrite
      kubectl annotate secret gcloud-adc -n <your-namespace> \
        controller.devfile.io/mount-path=/credentials \
        controller.devfile.io/mount-as=file \
        --overwrite
    4. Create a ConfigMap to mount environment variables automatically for your workspaces:

      kubectl create configmap ai-tools-env -n <your-namespace> \
        --from-literal=GOOGLE_APPLICATION_CREDENTIALS=/credentials/adc.json \
        --from-literal=GOOGLE_CLOUD_PROJECT=<your-gcp-project-name> \
        --dry-run=client -o yaml | kubectl apply -f -
      kubectl label configmap ai-tools-env -n <your-namespace> \
        controller.devfile.io/mount-to-devworkspace=true \
        controller.devfile.io/watch-configmap=true \
        --overwrite
      kubectl annotate configmap ai-tools-env -n <your-namespace> \
        controller.devfile.io/mount-as=env \
        --overwrite
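    Before restarting the workspace, it can help to confirm that the mount metadata actually landed on both objects; a sketch against the same namespace (cluster-dependent, so output will vary):

    ```shell
    # Sketch: verify the Dev Spaces controller labels and annotations on
    # the secret and ConfigMap created above. <your-namespace> is the
    # same placeholder namespace used in the previous steps.
    kubectl get secret gcloud-adc -n <your-namespace> --show-labels
    kubectl get configmap ai-tools-env -n <your-namespace> \
      -o jsonpath='{.metadata.annotations.controller\.devfile\.io/mount-as}{"\n"}'
    ```

    If the labels or the mount-as annotation are missing, the workspace will start without the credentials mounted.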

    Interact with OpenCode

    Once you configure the model, you can issue commands to OpenCode. Figure 9 shows an example prompt for reviewing changes from the last commit.

    Figure 9: Giving a sample prompt to OpenCode to analyze changes from the previous commit.
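    The same kind of prompt can also be issued non-interactively, which is handy in scripts or CI; a sketch, assuming the `opencode run` subcommand from the OpenCode documentation:

    ```shell
    # Sketch: one-shot, non-interactive prompt (assumes the `opencode run`
    # subcommand documented by OpenCode). Falls back with a hint when the
    # binary is not on PATH.
    if command -v opencode >/dev/null 2>&1; then
      opencode run "Summarize the changes introduced by the last commit"
    else
      echo "opencode not found on PATH"
    fi
    ```

    Because the configured model applies here too, the same script works unchanged whether the prompt is answered by a cloud provider or a local LLM.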

    Conclusion

    This blog post explored how the OpenCode CLI supports a wide range of model providers for AI-powered development. By integrating it with OpenShift Dev Spaces, you can use a single CLI to access multiple model providers or run on-premises models in your development environment.

    This setup gives teams the flexibility to choose the right model for each task while maintaining control over their infrastructure, costs, and data. Whether you use cloud-based providers or local LLMs, OpenCode enables a consistent, portable, AI-assisted development experience.

    If you want to learn more, check out:

    • Red Hat OpenShift Dev Spaces
    • OpenCode CLI
    • Eclipse Che
    • OpenCode GitHub repository
