Demystify AI agents by building them yourself. Local LLMs, no black boxes, real understanding of function calling, memory, and ReAct patterns.
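The function-calling piece of such an agent maps directly onto node-llama-cpp. Below is a minimal sketch, assuming the v3 getLlama / LlamaChatSession / defineChatSessionFunction API and a placeholder .gguf model path; the getCurrentTime tool is illustrative only, not part of any of the listed projects.

```js
import {getLlama, LlamaChatSession, defineChatSessionFunction} from "node-llama-cpp";

// Load a local model (placeholder path — point it at any chat-tuned .gguf file).
const llama = await getLlama();
const model = await llama.loadModel({modelPath: "./models/model.gguf"});
const context = await model.createContext();
const session = new LlamaChatSession({contextSequence: context.getSequence()});

// A tool the model may decide to call. The handler runs locally and its return
// value is fed back into the conversation before the model writes its answer.
const functions = {
    getCurrentTime: defineChatSessionFunction({
        description: "Returns the current date and time as an ISO string",
        handler() {
            return new Date().toISOString();
        }
    })
};

const answer = await session.prompt("What time is it right now?", {functions});
console.log(answer);
```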
Demystify RAG by building it from scratch. Local LLMs, no black boxes - real understanding of embeddings, vector search, retrieval, and context-augmented generation.
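The retrieval half of that pipeline reduces to embedding texts and ranking them by similarity. A rough sketch follows, assuming node-llama-cpp's embedding context (createEmbeddingContext / getEmbeddingFor, with the result exposing a vector array), an embedding-capable .gguf model at a placeholder path, and a hand-rolled cosine similarity standing in for a real vector store.

```js
import {getLlama} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({modelPath: "./models/embedding-model.gguf"});
const embeddingContext = await model.createEmbeddingContext();

// A tiny in-memory "document store" and a query to retrieve against.
const documents = [
    "node-llama-cpp runs GGUF models locally via llama.cpp bindings",
    "The capital of France is Paris"
];
const query = "How do I run a local model in Node.js?";

// Plain cosine similarity between two equal-length vectors.
const cosine = (a, b) => {
    let dot = 0, normA = 0, normB = 0;
    for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return dot / (Math.sqrt(normA) * Math.sqrt(normB));
};

// Embed every document and the query, then rank documents by similarity.
const docEmbeddings = await Promise.all(
    documents.map(async (text) => (await embeddingContext.getEmbeddingFor(text)).vector)
);
const queryEmbedding = (await embeddingContext.getEmbeddingFor(query)).vector;

// The top-ranked document is what gets prepended to the prompt as retrieved context.
const ranked = documents
    .map((text, i) => ({text, score: cosine(queryEmbedding, docEmbeddings[i])}))
    .sort((a, b) => b.score - a.score);
console.log(ranked);
```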
Run LLMs and SLMs on your own hardware and in the browser
The friendly and powerful desktop AI chatbot supporting both local and cloud AI models
A comprehensive Next.js application for running and exploring .gguf open-source LLM models locally.
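Under the hood, apps like these build on the same minimal node-llama-cpp loop: load a .gguf file, create a context, and prompt a chat session. A minimal sketch, assuming the v3 API and a placeholder model path:

```js
import {getLlama, LlamaChatSession} from "node-llama-cpp";

// Minimal local-inference flow — no remote API involved.
const llama = await getLlama();
const model = await llama.loadModel({modelPath: "./models/model.gguf"});
const context = await model.createContext();
const session = new LlamaChatSession({contextSequence: context.getSequence()});

const reply = await session.prompt("Explain what a GGUF file is in one sentence.");
console.log(reply);
```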