NodeLLM introduces native support for the Model Context Protocol (MCP), providing a standardized interface for discovering and using tools, resources, and prompt templates on any compliant server.
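The MCP flow this standardizes is a two-step boundary: list what a server exposes, then invoke it. The sketch below illustrates that flow with a simplified, self-contained stand-in; the `McpServer` interface and the `listTools`/`callTool` names here mirror the spec's `tools/list` and `tools/call` methods but are not NodeLLM's actual API.

```typescript
// Simplified stand-in for an MCP server boundary (not NodeLLM's real API).
interface McpTool {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>;
}

interface McpServer {
  listTools(): Promise<McpTool[]>;
  callTool(name: string, args: Record<string, unknown>): Promise<string>;
}

// A mock compliant server exposing a single tool.
const weatherServer: McpServer = {
  async listTools() {
    return [{
      name: "get_weather",
      description: "Return current weather for a city",
      inputSchema: { type: "object", properties: { city: { type: "string" } } },
    }];
  },
  async callTool(name, args) {
    if (name !== "get_weather") throw new Error(`Unknown tool: ${name}`);
    return `Sunny in ${args.city}`;
  },
};

// Discovery first, then execution: the same boundary for any compliant server.
async function demo(): Promise<string> {
  const tools = await weatherServer.listTools();
  console.log(tools.map((t) => t.name)); // ["get_weather"]
  return weatherServer.callTool("get_weather", { city: "Lisbon" });
}

const result = demo();
result.then(console.log); // "Sunny in Lisbon"
```

Because the boundary is just "list, then call", the same client code works against any server that speaks the protocol.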
NodeLLM 1.15 introduces automated schema self-correction, enabling agents to recover from validation failures without manual intervention. This release also brings fine-grained middleware lifecycle control and hardened type safety.
NodeLLM 1.14 reinforces our philosophy that agents don't require complex orchestration frameworks: they are just LLMs armed with tools. This release also brings first-class support for xAI (Grok) and the complete Mistral suite.
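The "LLMs armed with tools" claim can be sketched in a few lines: call the model, execute any tool it requests, feed the result back, and repeat until it produces a final answer. The loop below uses a scripted mock in place of a real model; `runAgent`, `mockLlm`, and the turn format are illustrative, not NodeLLM's actual API.

```typescript
// A model turn is either a tool request or a final answer.
type ModelTurn =
  | { type: "tool_call"; tool: string; args: Record<string, unknown> }
  | { type: "final"; answer: string };

// The agent's toolbox: plain functions keyed by name.
const tools: Record<string, (args: Record<string, unknown>) => string> = {
  add: (args) => String(Number(args.a) + Number(args.b)),
};

// Scripted "model": first requests the add tool, then answers with the
// most recent tool result from the conversation history.
let turn = 0;
function mockLlm(history: string[]): ModelTurn {
  turn += 1;
  if (turn === 1) return { type: "tool_call", tool: "add", args: { a: 2, b: 3 } };
  return { type: "final", answer: history[history.length - 1] };
}

// The entire agent: a loop around model calls and tool execution.
function runAgent(task: string): string {
  const history = [task];
  for (;;) {
    const move = mockLlm(history);
    if (move.type === "final") return move.answer;
    history.push(tools[move.tool](move.args)); // execute tool, feed result back
  }
}

const answer = runAgent("What is 2 + 3?");
console.log(answer); // "5"
```

Everything an orchestration framework adds sits outside this loop; the loop itself is the agent.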
The Vercel AI SDK is the industry standard for shipping quickly; NodeLLM, by contrast, is a backend-first LLM runtime with a middleware architecture, 540+ models across 7 providers, and enterprise-grade features.
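A "middleware architecture" for an LLM runtime typically means composable interceptors wrapped around the model call, onion-style, as in Koa or Express. The sketch below shows that composition pattern with a logging layer and a secret-redaction layer; the `compose` helper, the context shape, and the middleware names are assumptions for illustration, not NodeLLM's actual API.

```typescript
// A context object flows through the pipeline; each middleware can act
// before and after the layers inside it.
type Ctx = { prompt: string; response?: string };
type Next = () => Promise<void>;
type Middleware = (ctx: Ctx, next: Next) => Promise<void>;

// Onion-style composition: middleware[0] wraps middleware[1], and so on.
function compose(middlewares: Middleware[]): (ctx: Ctx) => Promise<void> {
  return async (ctx) => {
    let called = -1;
    const dispatch = async (idx: number): Promise<void> => {
      if (idx <= called) throw new Error("next() called twice");
      called = idx;
      const fn = middlewares[idx];
      if (fn) await fn(ctx, () => dispatch(idx + 1));
    };
    await dispatch(0);
  };
}

const logging: Middleware = async (ctx, next) => {
  console.log("request:", ctx.prompt);
  await next(); // everything inside runs here
  console.log("response:", ctx.response);
};

const redactSecrets: Middleware = async (ctx, next) => {
  ctx.prompt = ctx.prompt.replace(/sk-\w+/g, "[REDACTED]");
  await next();
};

// Innermost layer stands in for the actual provider call.
const callModel: Middleware = async (ctx) => {
  ctx.response = `echo: ${ctx.prompt}`;
};

const pipeline = compose([logging, redactSecrets, callModel]);
const ctx: Ctx = { prompt: "summarize sk-abc123" };
const done = pipeline(ctx).then(() => console.log(ctx.response)); // "echo: summarize [REDACTED]"
```

The payoff of this shape is that cross-cutting concerns like logging, redaction, caching, or retries stay independent of each other and of the provider call they wrap.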