TensorZero is an open-source stack for industrial-grade LLM applications. It unifies an LLM gateway, observability, optimization, evaluation, and experimentation.
Updated Feb 27, 2026 - Rust
A high-performance LLM inference API and Chat UI that integrates DeepSeek R1's CoT reasoning traces with Anthropic Claude models.
Every Code - push frontier AI to its limits. A fork of the Codex CLI with validation, automation, browser integration, multi-agents, theming, and much more. Orchestrate agents from OpenAI, Claude, Gemini, or any provider.
CLI proxy that reduces LLM token consumption by 60-90% on common dev commands. Single Rust binary, zero dependencies.
Debug your AI agents
Desktop app to control your computer with AI using your terminal, browser, mouse & keyboard
High-performance AI routing proxy built in Rust with automatic failover, priority-based routing, and support for 15+ providers (Anthropic, OpenAI, Cerebras, Minimax, Kimi, etc.)
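The failover behavior described above can be sketched as follows. This is a minimal illustration of priority-based routing with automatic failover, not the proxy's actual implementation; the `Provider` struct, the `route` function, and the hard-coded health flags are all hypothetical.

```rust
// Hypothetical sketch of priority-based routing with failover.
// In a real proxy, `healthy` would come from live health checks.

#[derive(Debug)]
struct Provider {
    name: &'static str,
    priority: u8, // lower value = tried first
    healthy: bool,
}

/// Pick the first healthy provider in priority order, failing over
/// to the next one when a higher-priority provider is down.
fn route(providers: &mut [Provider]) -> Option<&Provider> {
    providers.sort_by_key(|p| p.priority);
    providers.iter().find(|p| p.healthy)
}

fn main() {
    let mut providers = vec![
        Provider { name: "anthropic", priority: 0, healthy: false },
        Provider { name: "openai", priority: 1, healthy: true },
        Provider { name: "cerebras", priority: 2, healthy: true },
    ];
    // The top-priority provider is down, so the router fails over.
    let chosen = route(&mut providers).expect("no healthy provider");
    println!("{}", chosen.name); // prints "openai"
}
```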
Smart Tree: not just a tree, a philosophy. A context-aware, AI-crafted replacement for 20+ tools with MEM8 quantum compression, semantic search, AST-smart editing, and partnership memory. Crafted with care by human + AI—accept no knock-offs.
Use multiple LLM backends in a single crate, simple builder-based configuration, and built-in prompt chaining & templating.
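A builder-based configuration like the one described might look roughly like this. Every type and method name here (`LlmClient`, `LlmClientBuilder`, `render`, the `{input}` placeholder) is an illustrative assumption, not the crate's real API.

```rust
// Hypothetical sketch of builder-based client configuration
// with a toy prompt-templating step.

#[derive(Debug, Clone)]
struct LlmClient {
    backend: String,
    model: String,
    template: String,
}

#[derive(Default)]
struct LlmClientBuilder {
    backend: Option<String>,
    model: Option<String>,
    template: Option<String>,
}

impl LlmClientBuilder {
    fn backend(mut self, b: &str) -> Self { self.backend = Some(b.into()); self }
    fn model(mut self, m: &str) -> Self { self.model = Some(m.into()); self }
    fn template(mut self, t: &str) -> Self { self.template = Some(t.into()); self }
    fn build(self) -> LlmClient {
        LlmClient {
            backend: self.backend.unwrap_or_else(|| "anthropic".into()),
            model: self.model.unwrap_or_else(|| "default".into()),
            template: self.template.unwrap_or_else(|| "{input}".into()),
        }
    }
}

impl LlmClient {
    // Fill the prompt template; a stand-in for real templating support.
    fn render(&self, input: &str) -> String {
        self.template.replace("{input}", input)
    }
}

fn main() {
    let client = LlmClientBuilder::default()
        .backend("anthropic")
        .model("claude-3")
        .template("Summarize: {input}")
        .build();
    println!("{}", client.render("the release notes")); // prints "Summarize: the release notes"
}
```

The builder pattern keeps each backend-specific option optional with sensible defaults, which is why it is a common choice for multi-backend client crates.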
Your AI coworker for any folder: local-first, secure by design, cross-platform, and built for supervised automation.
Shepherd Model Gateway
Ultra-fast token & cost tracker for LLM Token Usage (e.g. Claude Code)
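The core accounting such a tracker performs is simple per-million-token arithmetic. A back-of-the-envelope sketch, where the prices are placeholders rather than any provider's real rates:

```rust
// Toy cost calculation: tokens are billed per million, with separate
// input and output rates. The $3 / $15 rates below are placeholders.

fn cost_usd(input_tokens: u64, output_tokens: u64,
            in_price_per_m: f64, out_price_per_m: f64) -> f64 {
    (input_tokens as f64 / 1_000_000.0) * in_price_per_m
        + (output_tokens as f64 / 1_000_000.0) * out_price_per_m
}

fn main() {
    // e.g. 120k input and 8k output tokens:
    let c = cost_usd(120_000, 8_000, 3.0, 15.0);
    println!("${c:.2}"); // 0.36 + 0.12 = $0.48
}
```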