Self-hosted AI text detection — analyze written work for AI-generated content using local or cloud LLMs.
LLaMa Audit is an open-source tool that lets anyone host and run their own LLM-based text auditing system. It analyzes written text (essays, articles, reports) to detect sections likely generated by AI, providing per-paragraph probability scores, rationale, and highlighted results.
Under the hood, an AI prompt with a structured output schema tasks one or more LLMs with examining the provided text and answering:
- Which sections are likely to have been AI-generated?
- What is the probability of each section being AI-generated?
- What linguistic markers indicate AI authorship?
Results from multiple models are aggregated into an overall score with highlighted areas, helping students, educators, and writers understand how their writing may be fingerprinted by AI detection systems.
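To make the output concrete, below is a minimal TypeScript sketch of what one model's structured result and a simple aggregation could look like. The field names and the straight averaging are illustrative assumptions; the actual schema lives in the backend services, not in this README.

```ts
// Hypothetical shape of one model's structured output.
// Field names are illustrative assumptions, not the project's real schema.
interface SectionResult {
  paragraphIndex: number;      // which paragraph of the input text
  aiProbability: number;       // 0..1 likelihood the section is AI-generated
  rationale: string;           // the model's reasoning for the score
  linguisticMarkers: string[]; // e.g. ["uniform sentence length", "formulaic transitions"]
}

interface ModelResult {
  model: string; // e.g. "qwen/qwen3-8b"
  sections: SectionResult[];
}

// One plausible aggregation: average each paragraph's probability across
// models, then average the per-paragraph scores into an overall score.
function aggregate(results: ModelResult[]): number {
  const byParagraph = new Map<number, number[]>();
  for (const result of results) {
    for (const section of result.sections) {
      const probs = byParagraph.get(section.paragraphIndex) ?? [];
      probs.push(section.aiProbability);
      byParagraph.set(section.paragraphIndex, probs);
    }
  }
  const mean = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;
  return mean([...byParagraph.values()].map(mean));
}
```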
- 📝 Rich text editor — paste text or import Word (.docx) and OpenDocument (.odt) files
- 🔍 AI detection analysis — per-paragraph scoring with rationale and linguistic markers
- 📊 Visual results — overall score gauge, highlighted text, and section-by-section breakdown
- 🤖 Multi-model support — run analysis across multiple models and aggregate results
- 🌐 OpenRouter integration — use any model available on OpenRouter
- 🦙 Ollama support — run analysis using local models
- ⚙️ Settings panel — configure API keys, endpoints, and model selection in the UI
- 🔌 Connection testing — verify provider connectivity before running analysis
- 📋 Analysis history — review past analyses stored in PostgreSQL
- 🐳 Docker-first — everything runs in containers with hot reload for development
| Layer | Technology |
|---|---|
| Frontend | Next.js 14, React 18, Tailwind CSS, TypeScript |
| Backend | Express, Sequelize, PostgreSQL, TypeScript |
| AI Providers | OpenRouter API, Ollama |
| Infrastructure | Docker Compose, Node 20 Alpine |
- Docker and Docker Compose
- An OpenRouter API key (or a local Ollama instance)
```bash
git clone https://github.com/your-org/llamaudit.git
cd llamaudit
cp example.env .env   # edit with your API keys
docker compose --profile dev up -d
```

Services will be available at:
- Frontend: http://localhost:52000
- Backend API: http://localhost:52001
Open http://localhost:52000/settings and:
- Select your AI provider (OpenRouter or Ollama)
- Enter your API key or endpoint
- Click "Test Connection" to verify
- Select your preferred models
- Save settings
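Everything the settings panel does is also exposed through the REST API (see the endpoint table below), so connection tests can be scripted. A minimal sketch, assuming `provider` and `endpoint` as the request fields (check the settings routes for the real contract):

```ts
// Verify provider connectivity before running an analysis.
// The "provider" and "endpoint" fields are assumed names.
const res = await fetch("http://localhost:52001/api/settings/test-connection", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    provider: "ollama",
    endpoint: "http://host.docker.internal:11434",
  }),
});
console.log(await res.json());
```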
Go to http://localhost:52000, paste or import your text, and click "Run Analysis".
```
llamaudit/
├── backend/               # Express + Sequelize + TypeScript API
│   ├── src/
│   │   ├── config/        # Environment & database configuration
│   │   ├── models/        # Sequelize models (Analysis, Settings)
│   │   ├── routes/        # REST API endpoints
│   │   └── services/      # AI provider integrations (OpenRouter, Ollama)
│   └── tests/             # Jest unit & integration tests
├── frontend/              # Next.js 14 + Tailwind CSS
│   ├── src/
│   │   ├── app/           # Next.js app router pages
│   │   ├── components/    # React components
│   │   └── types/         # TypeScript type definitions
│   └── tests/             # Jest component tests
├── docker-compose.yml     # Docker orchestration (dev/test/prod profiles)
├── run-tests.sh           # Dockerized test runner
└── example.env            # Example environment variables (copy to .env)
```
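Both providers sit behind `backend/src/services/`, so a shared interface is the natural design. The sketch below is an assumption about how that abstraction could look; the names are not taken from the codebase:

```ts
// Hypothetical provider abstraction: illustrative only.
interface AIProvider {
  listModels(): Promise<string[]>;    // backs GET /api/settings/models
  testConnection(): Promise<boolean>; // backs POST /api/settings/test-connection
  analyze(text: string, model: string): Promise<unknown>; // structured detection result
}
```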
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/health` | Health check |
| POST | `/api/analysis` | Run AI detection on text |
| GET | `/api/analysis` | List analysis history |
| GET | `/api/analysis/:id` | Get a single analysis |
| DELETE | `/api/analysis/:id` | Delete an analysis |
| GET | `/api/settings` | Get current settings |
| PUT | `/api/settings` | Update a setting |
| PUT | `/api/settings/batch` | Batch-update settings |
| POST | `/api/settings/test-connection` | Test provider connectivity |
| GET | `/api/settings/models` | List available models |
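For example, a detection run can be scripted against the API. A minimal sketch, assuming a `text` request field and a JSON response (see `backend/src/routes/` for the actual contract):

```ts
// Run AI detection on a piece of text (request/response fields are assumed).
const res = await fetch("http://localhost:52001/api/analysis", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ text: "Paste the document to analyze here." }),
});
const analysis = await res.json();
console.log(analysis); // per-paragraph scores, rationale, overall score
```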
```bash
# Run all tests (ephemeral Docker containers)
./run-tests.sh

# View last test output
./run-tests.sh --last

# Run only backend tests
./run-tests.sh --backend

# Run only frontend tests
./run-tests.sh --frontend
```

```bash
# Backend shell
docker compose --profile dev exec backend-dev sh

# Frontend shell
docker compose --profile dev exec frontend-dev sh

# Database CLI
docker compose --profile dev exec db psql -U llamaudit -d llamaudit

# View logs
docker compose --profile dev logs -f
```

| Variable | Description | Default |
|---|---|---|
| `OPENROUTER_API_KEY` | OpenRouter API key | — |
| `OPENROUTER_API_URL` | OpenRouter API base URL | `https://openrouter.ai/api/v1` |
| `OPENROUTER_CHAT_MODELS` | Comma-separated model list | `qwen/qwen3-8b` |
| `OLLAMA_ENDPOINT` | Ollama server URL | `http://host.docker.internal:11434` |
| `OLLAMA_MODELS` | Comma-separated Ollama model list | — |
| `DATABASE_*` | PostgreSQL connection settings | See `example.env` |
| `FRONTEND_URL` | Frontend URL for CORS | `http://localhost:52000` |
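As a starting point, a minimal `.env` for an OpenRouter setup might look like the following. The key is a placeholder; the other values simply restate the defaults from the table above:

```
OPENROUTER_API_KEY=<your-openrouter-key>
OPENROUTER_API_URL=https://openrouter.ai/api/v1
OPENROUTER_CHAT_MODELS=qwen/qwen3-8b
FRONTEND_URL=http://localhost:52000
```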
Implemented:
- Core text analysis with structured output
- OpenRouter integration with retry logic
- Ollama integration for local models
- Multi-model aggregation
- Settings panel with connection testing
- Document import (Word, ODT, Markdown)
- Analysis history with PostgreSQL storage
- Docker Compose dev/test infrastructure
Planned:
- User authentication
- Batch analysis (multiple documents)
- PDF import support
- Export reports (PDF, HTML)
- Model comparison dashboard
- Webhook notifications
- Rate limiting and usage quotas
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes and add tests
- Run `./run-tests.sh` to verify
- Submit a pull request
This project is licensed under the MIT License. See LICENSE for details.