Rohan Kumar's contributions
Article
OpenCode: A model-neutral AI coding assistant for OpenShift Dev Spaces
Rohan Kumar
Discover OpenCode, a model-neutral AI coding assistant that supports over 75 providers, including OpenAI, Anthropic Claude, Google Gemini, and local large language models (LLMs) via Ollama. Switch models on demand, compare outputs, avoid vendor lock-in, and even run fully offline with local models. Learn how to set up your environment in Red Hat OpenShift Dev Spaces.
Learning path
How to run AI models in cloud development environments
Rohan Kumar
This learning path explores running AI models, specifically large language models (LLMs), in cloud development environments.
Article
How to run AI models in cloud development environments
Rohan Kumar
Explore using RamaLama to run private AI inference in cloud development environments and improve productivity. Follow this tutorial to get started.
Article
How to generate code using Fabric8 Kubernetes Client
Rohan Kumar
Learn how to generate code using tools provided by Fabric8 Kubernetes Client, including Fabric8 CRD Generator and Fabric8 Java Generator. (Part 4 of 5)
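To give a feel for what "generating code" means here, the following is a minimal, hypothetical sketch in plain Java: it emits POJO source text from a CRD-like map of field names to types. The real Fabric8 Java Generator reads CRD YAML and produces far richer, annotated classes; the class and field names below (`PojoGenerator`, `CronTabSpec`) are illustrative only.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of CRD-to-Java code generation: emit a plain POJO
// source file from a map of field names to Java types. The Fabric8 Java
// Generator does this for real CRD schemas, with annotations and builders.
public class PojoGenerator {
    static String generate(String className, Map<String, String> fields) {
        StringBuilder src = new StringBuilder("public class " + className + " {\n");
        fields.forEach((name, type) ->
                src.append("    private ").append(type).append(' ')
                   .append(name).append(";\n"));
        src.append("}\n");
        return src.toString();
    }

    public static void main(String[] args) {
        // Fields loosely modeled on the CronTab example from the Kubernetes docs
        Map<String, String> spec = new LinkedHashMap<>();
        spec.put("cronSpec", "String");
        spec.put("replicas", "int");
        System.out.println(generate("CronTabSpec", spec));
    }
}
```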
Article
How to write tests with Fabric8 Kubernetes Client
Rohan Kumar
Learn how to write effective tests for Kubernetes applications in Java using the Fabric8 Kubernetes Mock Server and the Fabric8 Kubernetes JUnit Jupiter extensions.
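The idea behind mock-server testing can be sketched with the JDK alone: stand up a throwaway local HTTP server with a canned Kubernetes API response, point the code under test at it, and assert on what comes back. This is only an illustration of the principle, assuming no cluster and no Fabric8 dependency; the Fabric8 Kubernetes Mock Server automates exactly this kind of stubbing.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Stdlib sketch of mock-server testing: serve a canned PodList locally and
// query it the way code under test would query a real API server.
public class MockApiServerSketch {
    static String fetchPods() {
        try {
            String canned = "{\"kind\":\"PodList\",\"items\":[]}";
            HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
            server.createContext("/api/v1/namespaces/default/pods", exchange -> {
                byte[] body = canned.getBytes();
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
            });
            server.start();
            try {
                URI uri = URI.create("http://localhost:"
                        + server.getAddress().getPort()
                        + "/api/v1/namespaces/default/pods");
                HttpResponse<String> resp = HttpClient.newHttpClient().send(
                        HttpRequest.newBuilder(uri).build(),
                        HttpResponse.BodyHandlers.ofString());
                return resp.body();
            } finally {
                server.stop(0);
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(fetchPods());
    }
}
```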
Article
How to use Kubernetes dynamic client with Fabric8
Rohan Kumar
Learn how to use the Fabric8 Kubernetes dynamic client to interact with the Kubernetes API easily via the GenericKubernetesResource API. (Part 3 of 5)
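What a "generic" resource type buys you can be sketched without the library: a holder that keeps unknown fields in a map and lets you reach into them by path, rather than through typed getters. The class below (`GenericResourceSketch`) is hypothetical; Fabric8's real GenericKubernetesResource follows the same idea, exposing arbitrary fields through an additional-properties map.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a dynamic, untyped resource holder: any Kubernetes
// resource can be represented without a dedicated Java class by keeping its
// fields in nested maps and reading them by dotted path.
public class GenericResourceSketch {
    final String apiVersion;
    final String kind;
    final Map<String, Object> additionalProperties = new HashMap<>();

    GenericResourceSketch(String apiVersion, String kind) {
        this.apiVersion = apiVersion;
        this.kind = kind;
    }

    // Walk nested maps with a dotted path, e.g. "spec.cronSpec".
    Object get(String dottedPath) {
        Object current = additionalProperties;
        for (String key : dottedPath.split("\\.")) {
            if (!(current instanceof Map)) return null;
            current = ((Map<?, ?>) current).get(key);
        }
        return current;
    }

    public static void main(String[] args) {
        GenericResourceSketch cronTab =
                new GenericResourceSketch("stable.example.com/v1", "CronTab");
        cronTab.additionalProperties.put("spec", Map.of("cronSpec", "* * * * */5"));
        System.out.println(cronTab.get("spec.cronSpec"));
    }
}
```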
Article
Programming Kubernetes custom resources in Java
Rohan Kumar
Learn how to interact with Kubernetes custom resources through the Kubernetes REST API in Java, using the Fabric8 Kubernetes Java client. (Part 2 of 5)
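Under the hood, talking to custom resources is plain HTTP against a predictable path scheme. This sketch, using only the JDK, shows the collection path namespaced custom resources live under (the group, version, and plural below come from the CronTab example in the Kubernetes docs); the Fabric8 client wraps this plumbing for you.

```java
// Custom resources are served under a predictable REST path:
// /apis/{group}/{version}/namespaces/{namespace}/{plural}
public class CustomResourcePaths {
    static String collectionPath(String group, String version,
                                 String namespace, String plural) {
        return "/apis/" + group + "/" + version
                + "/namespaces/" + namespace + "/" + plural;
    }

    public static void main(String[] args) {
        // e.g. the CronTab example from the Kubernetes documentation
        System.out.println(
                collectionPath("stable.example.com", "v1", "default", "crontabs"));
        // → /apis/stable.example.com/v1/namespaces/default/crontabs
    }
}
```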
Article
How to use Fabric8 Java Client with Kubernetes
Rohan Kumar
Learn how to program against the Kubernetes REST API in Java using the Fabric8 Kubernetes client in this quick demonstration. (Part 1 of 5)
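Getting started with the series typically means adding the Fabric8 client to your build. A minimal Maven fragment might look like the following; the version shown is an example, so check for the latest release:

```xml
<!-- Fabric8 Kubernetes client; pin to the latest release in practice -->
<dependency>
    <groupId>io.fabric8</groupId>
    <artifactId>kubernetes-client</artifactId>
    <version>6.9.2</version>
</dependency>
```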