
Ax: Build Reliable AI Apps in TypeScript

Stop wrestling with prompts. Start shipping AI features.

Ax brings DSPy's revolutionary approach to TypeScript – just describe what you want, and let the framework handle the rest. Production-ready, type-safe, and works with all major LLMs.


Transform Your AI Development in 30 Seconds

import { ai, ax } from "@ax-llm/ax";

// 1. Pick any LLM
const llm = ai({ name: "openai", apiKey: process.env.OPENAI_APIKEY! });

// 2. Say what you want
const classifier = ax(
  'review:string -> sentiment:class "positive, negative, neutral"',
);

// 3. Get type-safe results
const result = await classifier.forward(llm, {
  review: "This product is amazing!",
});
console.log(result.sentiment); // "positive" ✨

That's it. No prompt engineering. No trial and error. It works with GPT-4, Claude, Gemini, or any LLM.

Why Thousands of Developers Choose Ax

🎯 Define Once, Run Anywhere

Write your logic once. Switch between OpenAI, Anthropic, Google, and 15+ other providers by changing one line. No rewrites needed.
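
A minimal sketch of that switch, reusing the sentiment classifier from above (the "anthropic" provider name and ANTHROPIC_APIKEY variable are assumptions; substitute whichever provider you use):

import { ai, ax } from "@ax-llm/ax";

const classifier = ax(
  'review:string -> sentiment:class "positive, negative, neutral"',
);

// Two providers, same program: only the ai() call changes
const openai = ai({ name: "openai", apiKey: process.env.OPENAI_APIKEY! });
const claude = ai({ name: "anthropic", apiKey: process.env.ANTHROPIC_APIKEY! });

await classifier.forward(openai, { review: "Great battery life!" });
await classifier.forward(claude, { review: "Great battery life!" });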

Ship 10x Faster

Stop tweaking prompts. Define inputs → outputs. The framework generates optimal prompts automatically.

🛡️ Production-Ready from Day One

Built-in streaming, validation, error handling, observability. Used by startups in production handling millions of requests.

🚀 Gets Smarter Over Time

Train your programs with examples. Watch accuracy improve automatically. No ML expertise needed.
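
A hedged sketch of the idea, assuming the generator exposes a setExamples method for DSPy-style few-shot demonstrations (the method name is an assumption; the MiPRO optimizer listed under features handles tuning automatically):

import { ai, ax } from "@ax-llm/ax";

const llm = ai({ name: "openai", apiKey: process.env.OPENAI_APIKEY! });

const classifier = ax(
  'review:string -> sentiment:class "positive, negative, neutral"',
);

// Assumed API: seed the program with labeled examples so generated prompts
// include demonstrations; optimizers can then refine them automatically.
classifier.setExamples([
  { review: "Absolutely love it, works perfectly", sentiment: "positive" },
  { review: "Broke after two days, very disappointed", sentiment: "negative" },
  { review: "Does the job, nothing special", sentiment: "neutral" },
]);

const result = await classifier.forward(llm, { review: "Pretty solid overall" });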

Real Apps, Real Simple

Extract Structured Data from Customer Emails

const llm = ai({ name: "openai", apiKey: process.env.OPENAI_APIKEY! });

const extractor = ax(`
  customerEmail:string, currentDate:datetime -> 
  priority:class "high, normal, low",
  sentiment:class "positive, negative, neutral",
  ticketNumber?:number,
  nextSteps:string[],
  estimatedResponseTime:string
`);

const result = await extractor.forward(llm, {
  customerEmail: "Order #12345 hasn't arrived. Need this resolved immediately!",
  currentDate: new Date(),
});
// Automatically extracts all fields with proper types and validation

Complex Structured Outputs (New!)

Define deeply nested objects with full type safety using the fluent API:

import { f, ax } from "@ax-llm/ax";

const productExtractor = f()
  .input("productPage", f.string())
  .output("product", f.object({
    name: f.string(),
    price: f.number(),
    specs: f.object({
      dimensions: f.object({
        width: f.number(),
        height: f.number()
      }),
      materials: f.array(f.string())
    }),
    reviews: f.array(f.object({
      rating: f.number(),
      comment: f.string()
    }))
  }))
  .build();

const generator = ax(productExtractor);
const result = await generator.forward(llm, { productPage: "..." });

// Full TypeScript inference for nested fields
console.log(result.product.specs.dimensions.width); // number
console.log(result.product.reviews[0].comment);     // string

Validation & Constraints (New!)

Add Zod-like validation constraints to ensure data quality and format:

import { f, ax } from "@ax-llm/ax";

const userRegistration = f()
  .input("userData", f.string())
  .output("user", f.object({
    username: f.string().min(3).max(20),
    email: f.string().email(),
    age: f.number().min(18).max(120),
    password: f.string().min(8).regex("^(?=.*[A-Za-z])(?=.*\\d)", "Must contain at least one letter and one digit"),
    bio: f.string().max(500).optional(),
    website: f.string().url().optional(),
    tags: f.string().min(2).max(30).array()
  }))
  .build();

const generator = ax(userRegistration);
const result = await generator.forward(llm, {
  userData: "Name: John, Email: john@example.com, Age: 25..."
});

// All fields are automatically validated:
// - username: 3-20 characters
// - email: valid email format
// - age: between 18-120
// - password: min 8 chars with letter and number
// - website: valid URL format if provided
// - tags: each 2-30 characters

Available Constraints:

  • .min(n) / .max(n) - String length or number range
  • .email() - Email format validation (or use f.email())
  • .url() - URL format validation (or use f.url())
  • .date() - Date format validation (or use f.date())
  • .datetime() - Datetime format validation (or use f.datetime())
  • .regex(pattern, description) - Custom regex pattern with human-readable description
  • .optional() - Make field optional

Note: For email, url, date, and datetime, you can use either the validator syntax (f.string().email()) or the dedicated type syntax (f.email()). Both work consistently in all contexts!
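
For example, both spellings below declare the same email constraint, and .optional() composes with either:

import { f } from "@ax-llm/ax";

const contactSig = f()
  .input("rawText", f.string())
  .output("contactEmail", f.string().email()) // validator syntax
  .output("backupEmail", f.email().optional()) // dedicated type syntax
  .build();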

Automatic Features:

  • ✅ Input validation before sending to LLM
  • ✅ Output validation after LLM response
  • ✅ JSON Schema constraints in structured outputs
  • ✅ Automatic retry with corrections on validation errors
  • ✅ TypeScript compile-time protection

Build Agents That Use Tools (ReAct Pattern)

const llm = ai({ name: "openai", apiKey: process.env.OPENAI_APIKEY! });

const assistant = ax(
  "question:string -> answer:string",
  {
    functions: [
      { name: "getCurrentWeather", func: weatherAPI },
      { name: "searchNews", func: newsAPI },
    ],
  },
);

const result = await assistant.forward(llm, {
  question: "What's the weather in Tokyo and any news about it?",
});
// AI automatically calls both functions and combines results
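
The snippet above assumes weatherAPI and newsAPI already exist. A minimal sketch of what those handlers might look like (the argument shapes and return strings are placeholders, not a real API):

// Placeholder tool handlers; the model decides when to call them and the
// framework feeds the return values back into the final answer.
const weatherAPI = async ({ location }: Readonly<{ location: string }>) =>
  `Sunny and 22°C in ${location}`; // swap in a real weather service call

const newsAPI = async ({ query }: Readonly<{ query: string }>) =>
  `Top headline about ${query}: ...`; // swap in a real news search call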

Multi-Modal Analysis with Images

const analyzer = ax(`
  image:image, question:string ->
  description:string,
  mainColors:string[],
  category:class "electronics, clothing, food, other",
  estimatedPrice:string
`);
// Process images and text together seamlessly
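
A usage sketch that reuses llm from the earlier examples; the { mimeType, data } payload shape for the image field is an assumption (base64-encoded bytes), so check the docs for the exact format:

import fs from "node:fs";

const imageData = fs.readFileSync("./product.jpg").toString("base64");

const result = await analyzer.forward(llm, {
  image: { mimeType: "image/jpeg", data: imageData }, // assumed payload shape
  question: "What is this product and roughly what does it cost?",
});

console.log(result.category); // "electronics" | "clothing" | "food" | "other"
console.log(result.mainColors); // string[]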

Quick Start

Install

npm install @ax-llm/ax

Your First AI Feature (2 minutes)

import { ai, ax } from "@ax-llm/ax";

const llm = ai({ name: "openai", apiKey: process.env.OPENAI_APIKEY! });

const translator = ax(`
  text:string, 
  language:string -> 
  translation:string
`);

const result = await translator.forward(llm, {
  text: "Hello world",
  language: "Spanish",
});
console.log(result.translation); // "Hola mundo"

Fluent Signature API

import { ai, ax, f } from "@ax-llm/ax";

const llm = ai({ name: "openai", apiKey: process.env.OPENAI_APIKEY! });

const signature = f()
  .input("userQuestion", f.string("User question"))
  .output("responseText", f.string("AI response"))
  .output("confidenceScore", f.number("Confidence 0-1"))
  .build();

const generator = ax(signature.toString());
const result = await generator.forward(llm, { userQuestion: "What is Ax?" });
console.log(result.responseText, result.confidenceScore);

Powerful Features, Zero Complexity

  • 15+ LLM Providers - OpenAI, Anthropic, Google, Mistral, Ollama, and more
  • Type-Safe Everything - Full TypeScript support with auto-completion
  • Streaming First - Real-time responses with validation
  • Multi-Modal - Images, audio, text in the same signature
  • Smart Optimization - Automatic prompt tuning with MiPRO
  • Agentic Context Engineering - ACE generator → reflector → curator loops
  • Multi-Objective Optimization - GEPA and GEPA-Flow (Pareto frontier)
  • Production Observability - OpenTelemetry tracing built-in
  • Advanced Workflows - Compose complex pipelines with AxFlow
  • Enterprise RAG - Multi-hop retrieval with quality loops
  • Agent Framework - Agents that can use tools and call other agents
  • Zero Dependencies - Lightweight, fast, reliable

Learn More

🚀 Quick Wins

📚 Deep Dives

Examples

Run any example:

OPENAI_APIKEY=your-key npm run tsx ./src/examples/[example-name].ts

Core Examples

Production Patterns

📚 View Full Examples Guide
View All 70+ Examples →

Join the Community

Production Ready

  • Battle-tested - Used by startups in production
  • No breaking changes - Stable minor versions
  • Comprehensive tests - Large test coverage
  • OpenTelemetry - Built-in observability
  • TypeScript first - Type-safe by design

Contributors

License

Apache 2.0 - Use it anywhere, build anything.


Ready to build the future? Stop fighting with prompts. Start shipping with signatures.

npm install @ax-llm/ax

Built with ❤️ by developers, for developers.