LLM-powered search router that chooses direct answers or live web retrieval, returning concise, validated responses with optional source citations via a modular Express + LangChain backend.

Arunkoo/SearchAgent


LCEL SearchAgent

A small Express + TypeScript service that answers questions using an LLM.
It automatically chooses between two paths:

  • Direct mode: respond from the model only (no browsing)
  • Web mode: search the web, open top pages, summarize them, then answer with source URLs

The goal is to keep answers short and beginner-friendly while returning structured JSON output.
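The two-path choice can be sketched as a small keyword classifier. This is a minimal sketch of the idea; the trigger words below are illustrative assumptions, not the project's actual heuristic:

```typescript
// Sketch of a keyword-based router: classify a query as "direct" or "web".
// The trigger words are illustrative assumptions, not the repo's exact list.
type Mode = "direct" | "web";

const WEB_TRIGGERS = /\b(latest|today|current|price|stock|news|trending|20\d{2})\b/i;

function routeQuery(q: string): Mode {
  return WEB_TRIGGERS.test(q) ? "web" : "direct";
}

console.log(routeQuery("What is a closure in JavaScript?")); // "direct"
console.log(routeQuery("What is the latest Bitcoin price?")); // "web"
```

In the real service this decision feeds a LangChain RunnableBranch, which selects the direct or web chain at runtime.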

App preview

What this API does

High-level design

When you send a query to the API:

  1. The request is validated with Zod.
  2. A router step decides the mode:
    • Uses direct for general questions with stable answers
    • Switches to web if the query looks time-sensitive, numeric, price-related, trending, “latest”, etc.
  3. Direct mode
    • Calls the chat model once
    • Returns { answer, sources: [] }
  4. Web mode
    • Uses Tavily Search to get results
    • Opens the top pages (HTTP fetch)
    • Converts HTML to text and caps content length
    • Summarizes each page with the model
    • Composes a final answer using only those summaries
    • Returns { answer, sources: [url, ...] }
  5. Final output is validated again with Zod, and repaired once if needed.

Tech stack

  • Node.js + TypeScript
  • Express
  • LangChain (RunnableSequence, RunnableBranch)
  • Zod (input/output validation)
  • Tavily Search API (web search)
  • html-to-text (cleaning page HTML)
  • CORS configured for a single allowed frontend origin

API

POST /api/search

Request body

{
  "q": "your question here"
}

Success response

{
  "answer": "string",
  "sources": ["https://example.com", "https://example2.com"]
}
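A hypothetical client call could be assembled like this; the base URL and port are assumptions, and `buildSearchRequest` is an illustrative helper, not part of the repo:

```typescript
// Hypothetical helper that builds the POST /api/search request.
// The base URL and port are placeholder assumptions.
function buildSearchRequest(q: string): {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
} {
  return {
    url: "http://localhost:3000/api/search",
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ q }),
    },
  };
}

// Usage:
//   const { url, init } = buildSearchRequest("your question here");
//   const { answer, sources } = await (await fetch(url, init)).json();
```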
