Dive deeper into large language models and Node.js

Explore how to use large language models (LLMs) with Node.js by observing Ollama, LlamaIndex, function calling, and agents.

Overview: Dive deeper into large language models and Node.js

AI and large language models (LLMs) are becoming increasingly important tools for web applications. As a JavaScript/Node.js developer, it's crucial to understand how you fit into this growing space. This learning path is a follow-up to How to get started with large language models and Node.js. If you have not already completed that learning path, you might want to do so first, though it is not strictly necessary.

In the first learning path, we looked at LangChain.js and retrieval-augmented generation (RAG), as well as how to easily switch your Node.js application to access an LLM running under Red Hat OpenShift AI. To avoid playing favorites, this time we will dive into Ollama, LlamaIndex, and function calling, focusing on using them with JavaScript/Node.js. Like LangChain, both Ollama and LlamaIndex support TypeScript/JavaScript as their second language after Python.

Prerequisites:

  • A GitHub account
  • A Git client
  • Node.js 18.x or later
  • A Linux, Windows, or macOS computer, optionally with GPU support

In this learning path, you will:

  • Learn about Ollama.
  • Learn about LlamaIndex.ts and run a Node.js program.
  • Run a Node.js application.
  • Generate traces from a Node.js program built with LlamaIndex.ts.
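As a preview of the kind of code this learning path covers, the sketch below shows one way a Node.js program can talk to a locally running Ollama server through its REST API. It assumes Ollama is listening on its default port (11434) and that you have already pulled a model named `mistral`; both the port and the model name are assumptions, not requirements of this learning path.

```javascript
// Sketch: calling a local Ollama server's /api/generate endpoint
// from Node.js 18+ (which ships with a global fetch).
// Assumes Ollama is running locally and the "mistral" model is pulled.
const payload = {
  model: "mistral",
  prompt: "Why is the sky blue? Answer in one sentence.",
  stream: false, // ask for a single JSON response instead of a token stream
};

async function generate() {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  const data = await res.json();
  return data.response; // the generated text
}

// Uncomment to run against a live Ollama server:
// generate().then(console.log).catch(console.error);
```

Later sections build on this idea, replacing the raw HTTP call with higher-level abstractions from LlamaIndex.ts.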