What are the principles we can use to build LLM-powered software that is actually good enough to put in the hands of production customers?
The Context Optimization Layer for LLM Applications
[ICML'24 Spotlight] LLM Maybe LongLM: Self-Extend LLM Context Window Without Tuning
Give Claude Code photographic memory in ONE portable file. No database, no SQLite, no ChromaDB - just a single .mv2 file you can git commit, scp, or share. Native Rust core with sub-ms operations.
Open Source Context infrastructure for AI agents. Auto-capture and share your agents' context everywhere.
Supercharge AI Agents, Safely
Config-driven CLI tool that compresses command output before it reaches an LLM context
Find the ghost tokens. Fix them. Survive compaction. Avoid context quality decay.
A discovery and compression tool for your Python codebase. Creates a knowledge graph for an LLM context window, efficiently outlining your project | Code structure visualization | LLM Context Window Efficiency | Static analysis for AI | Large Language Model tooling #LLM #AI #Python #CodeAnalysis #ContextWindow #DeveloperTools
Make your OpenClaw AI agent faster, smarter, and cheaper. Speed optimization, memory architecture, context management, model selection, and one-shot development guide.
A local-first memory layer for AI (Cursor, Zed, Claude). Persistent architectural context via semantic search.
Documentation snippets for LLM context injection
Transform and optimize your markdown documentation for Large Language Models (LLMs) and RAG systems. Generate llms.txt automatically.
The Context Engineering Engine. Your AI sees 5% of your codebase — Entroly shows it everything. 78% fewer tokens. Works with Cursor, Claude Code, Copilot, OpenClaw.
Building Agents with LLM structured generation (BAML), MCP Tools, and 12-Factor Agents principles
Optimize AI workflows with Arachne. Automatically assembles the perfect code context (Tree, Target, Deps, Semantic) to fit context windows without noise. Built for efficiency and scale.
Grab, filter, and bundle your codebase for Claude and ChatGPT right from your terminal.
A lightweight tool to optimize your C# project for LLM context windows by using a knowledge graph | Code structure visualization | Static analysis for AI | Large Language Model tooling | .NET ecosystem support #LLM #AI #CSharp #DotNet #CodeAnalysis #ContextWindow #DeveloperTools
[ICLR 2025] Official code repository for "TULIP: Token-length Upgraded CLIP"
High-performance repository context generator for LLMs - Transform codebases into optimized formats for Claude, GPT-4/5, Gemini, and other LLMs