Codexa is a microservices-based competitive programming platform built with Node.js, TypeScript, and Express. The platform provides code execution, problem management, user authentication, analytics, and AI-powered features.
This project follows a microservices architecture with the following services:
| Service | Port | Description |
|---|---|---|
| Auth Service | 3000 | User authentication, authorization, JWT token management, and user profile management |
| Utils Service | 3001 | Email notifications via Kafka consumer, file upload handling with Cloudinary integration |
| Problem Service | 3002 | CRUD operations for coding problems, test cases, and problem metadata |
| Code Service | - | Code execution engine with Judge0 integration, submission processing with BullMQ queues |
| AI Service | - | AI-powered features using Google Generative AI for hints and explanations |
| Analytics Service | - | User performance analytics, problem statistics, and rivalry tracking engine |
| DB Service | - | Shared database package with Prisma ORM for all services |
- Runtime: Node.js with TypeScript
- Framework: Express.js
- Database: PostgreSQL 15
- ORM: Prisma (v7.3.0)
- Message Queue: Apache Kafka (v3.7.0)
- Task Queue: BullMQ with Redis (for code execution)
- Authentication: JWT with refresh tokens, Argon2 for password hashing
- AI Integration: Google Generative AI
- File Storage: Cloudinary
- Email: Nodemailer with SMTP
- Containerization: Docker & Docker Compose
Key dependency versions:
- Express: v5.2.1
- Prisma: v7.3.0
- Kafka: kafkajs v2.2.4
- BullMQ: v5.67.2
- TypeScript: v5.9.3
- Zod: v4.3.5+ (validation)
The platform uses PostgreSQL with the following main entities:
- User: User accounts with roles (USER, STUDENT, TEACHER, ADMIN), profiles, and statistics
- Problem: Coding problems with difficulty levels, test cases, tags, and constraints
- Submission: Code submissions with execution results and performance metrics
- UserAnalytics: Comprehensive user statistics including streaks, topic strengths, and activity logs
- TopicAttempt: Per-topic performance tracking for radar charts
- ProblemAnalytics: Global problem statistics and language-specific performance data
Key enums:
- Role: USER, STUDENT, TEACHER, ADMIN
- Difficulty: EASY, MEDIUM, HARD
- SubmissionStatus: PENDING, PROCESSING, ACCEPTED, WRONG_ANSWER, ERROR, TIME_LIMIT_EXCEEDED, MEMORY_LIMIT_EXCEEDED, COMPILATION_ERROR
- AccountStatus: ACTIVE, INACTIVE, SUSPENDED, DELETED
Prerequisites for local development:
- Node.js (v18+ recommended)
- Docker & Docker Compose
- PostgreSQL 15
- Redis (for BullMQ)
- Apache Kafka
1. Clone the repository

   ```bash
   git clone https://github.com/Rahul5977/Codexa-Server.git
   cd Codexa-Server
   ```

2. Install dependencies

   ```bash
   npm install
   ```

3. Set up environment variables

   Each service has an `.env.example` file. Copy and configure them:

   ```bash
   # For each service
   cp auth-service/.env.example auth-service/.env
   cp utils-service/.env.example utils-service/.env
   cp problem-service/.env.example problem-service/.env
   # Configure the .env files with your values
   ```

4. Set up the database

   ```bash
   cd db-service
   npm run db:push
   npm run generate
   ```

5. Start infrastructure services

   ```bash
   # Start PostgreSQL and Kafka services defined in docker-compose.yml
   docker-compose up postgres kafka -d
   ```

6. Run development servers

   ```bash
   # Using npm workspaces
   npm run dev

   # Or run individual services
   npm start --workspace=auth-service
   npm start --workspace=problem-service
   npm start --workspace=utils-service
   ```
To run the entire stack with Docker:
```bash
docker-compose up -d
```

This will start:
- PostgreSQL database
- Kafka message broker
- Auth Service (port 3000)
- Utils Service (port 3001)
- Problem Service (port 3002)
- User registration and login
- JWT-based authentication (access + refresh tokens)
- Password hashing with Argon2
- User profile management with image uploads
- Email verification workflow
- Kafka integration for async operations
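For reference, here is a minimal sketch of how the password hashing and token flow described above might look, assuming the `argon2` and `jsonwebtoken` packages; the function names are illustrative and not the service's actual API:

```typescript
// Hypothetical sketch of the Auth Service token flow, not the actual code.
import argon2 from "argon2";
import jwt from "jsonwebtoken";

// Assumed env vars, matching the names listed under Environment Variables.
const ACCESS_SECRET = process.env.JWT_ACCESS_SECRET!;
const REFRESH_SECRET = process.env.JWT_REFRESH_SECRET!;

export async function hashPassword(plain: string): Promise<string> {
  // Argon2id by default; the hash embeds its own salt and parameters.
  return argon2.hash(plain);
}

export async function verifyPassword(hash: string, plain: string): Promise<boolean> {
  return argon2.verify(hash, plain);
}

// Issue a short-lived access token and a longer-lived refresh token.
export function issueTokens(userId: string, role: string) {
  const accessToken = jwt.sign({ sub: userId, role }, ACCESS_SECRET, { expiresIn: "15m" });
  const refreshToken = jwt.sign({ sub: userId }, REFRESH_SECRET, { expiresIn: "7d" });
  return { accessToken, refreshToken };
}
```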
- Email notification consumer (Kafka)
- SMTP email sending with Nodemailer
- File upload handling
- Cloudinary integration for image storage
- CORS configuration
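A rough sketch of the Kafka-to-email path, assuming `kafkajs` and `nodemailer`; the topic name, message shape, and SMTP variable names are placeholders:

```typescript
// Illustrative sketch only; topic name and message shape are assumptions.
import { Kafka } from "kafkajs";
import nodemailer from "nodemailer";

const kafka = new Kafka({
  clientId: "utils-service",
  brokers: (process.env.KAFKA_BROKERS ?? "localhost:9092").split(","),
});

const transporter = nodemailer.createTransport({
  host: process.env.SMTP_HOST,
  port: Number(process.env.SMTP_PORT ?? 587),
  auth: { user: process.env.SMTP_USER, pass: process.env.SMTP_PASS },
});

async function startEmailConsumer() {
  const consumer = kafka.consumer({ groupId: "email-consumer" });
  await consumer.connect();
  // "email-notifications" is a placeholder topic name.
  await consumer.subscribe({ topic: "email-notifications", fromBeginning: false });

  await consumer.run({
    eachMessage: async ({ message }) => {
      if (!message.value) return;
      const { to, subject, html } = JSON.parse(message.value.toString());
      await transporter.sendMail({ from: process.env.SMTP_USER, to, subject, html });
    },
  });
}

startEmailConsumer().catch(console.error);
```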
- Problem CRUD operations
- Test case management
- Problem filtering and search
- Tags and company associations
- Difficulty-based categorization
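A hedged sketch of what a problem-creation route with Zod validation could look like; the field names mirror the schema entities described earlier but are not the service's exact contract:

```typescript
// Hypothetical problem-creation route; field names are illustrative.
import express from "express";
import { z } from "zod";

const createProblemSchema = z.object({
  title: z.string().min(1),
  description: z.string().min(1),
  difficulty: z.enum(["EASY", "MEDIUM", "HARD"]),
  tags: z.array(z.string()).default([]),
  testCases: z.array(z.object({ input: z.string(), expectedOutput: z.string() })).min(1),
});

const router = express.Router();

router.post("/problems", async (req, res) => {
  const parsed = createProblemSchema.safeParse(req.body);
  if (!parsed.success) {
    return res.status(400).json({ errors: parsed.error.issues });
  }
  // Persisting via the shared Prisma client would happen here.
  return res.status(201).json({ problem: parsed.data });
});

export default router;
```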
- Code submission processing
- Judge0 integration for code execution
- BullMQ queue management
- Multi-language support
- Execution time and memory tracking
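The submission pipeline could be sketched roughly as below, assuming a BullMQ queue named `submissions` and a self-hosted Judge0 CE instance reachable via a `JUDGE0_URL` variable (both assumptions, not confirmed by the repo):

```typescript
// Sketch of a submission worker; queue name, env vars, and job shape are assumed.
import { Worker } from "bullmq";

const connection = {
  host: process.env.REDIS_HOST ?? "localhost",
  port: Number(process.env.REDIS_PORT ?? 6379),
};

interface SubmissionJob {
  sourceCode: string;
  languageId: number; // Judge0 language id, e.g. 63 for Node.js
  stdin: string;
  expectedOutput: string;
}

new Worker<SubmissionJob>(
  "submissions",
  async (job) => {
    // Judge0 CE: create a submission and wait synchronously for the result.
    const res = await fetch(
      `${process.env.JUDGE0_URL}/submissions?base64_encoded=false&wait=true`,
      {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          source_code: job.data.sourceCode,
          language_id: job.data.languageId,
          stdin: job.data.stdin,
          expected_output: job.data.expectedOutput,
        }),
      }
    );
    const result = await res.json();
    // Judge0 reports a status plus execution time (s) and memory (KB).
    return { status: result.status?.description, time: result.time, memory: result.memory };
  },
  { connection }
);
```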
- Google Generative AI integration
- AI-powered hints generation
- Code explanation and suggestions
- Problem-solving guidance
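A minimal sketch of hint generation with the `@google/generative-ai` SDK; the model name, prompt wording, and `GOOGLE_API_KEY` variable are placeholders:

```typescript
// Illustrative hint generation; not the AI Service's actual prompts or config.
import { GoogleGenerativeAI } from "@google/generative-ai";

const genAI = new GoogleGenerativeAI(process.env.GOOGLE_API_KEY!);

export async function generateHint(problemStatement: string, userCode: string): Promise<string> {
  const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });
  const prompt =
    `Give a short hint (no full solution) for this problem:\n${problemStatement}\n` +
    `The user's current attempt:\n${userCode}`;
  const result = await model.generateContent(prompt);
  return result.response.text();
}
```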
- User performance tracking
- Streak calculation
- Topic-wise strength analysis
- Activity heatmaps
- Rivalry engine
- Problem statistics aggregation
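As an illustration of the streak idea (not the service's actual algorithm), a current streak can be derived by walking backwards over the distinct days that have at least one submission:

```typescript
// Simplified streak calculation over a list of submission dates.
function currentStreak(submissionDates: Date[], today = new Date()): number {
  // Collect the distinct UTC days on which the user submitted.
  const days = new Set(submissionDates.map((d) => d.toISOString().slice(0, 10)));

  let streak = 0;
  const cursor = new Date(today);
  // Walk backwards from today while each day has at least one submission.
  while (days.has(cursor.toISOString().slice(0, 10))) {
    streak++;
    cursor.setUTCDate(cursor.getUTCDate() - 1);
  }
  return streak;
}

// Example: submissions on each of the last three days (including today) -> 3.
```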
- Centralized Prisma schema
- Database migrations
- Shared across all services
- Type-safe database access
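Type-safe access through the generated client might look like the sketch below; the model names, relation fields, and enum values follow the schema description above but are assumptions about the exact Prisma schema:

```typescript
// Sketch of type-safe access via the shared Prisma client; model and field
// names (submission, userId, problem.difficulty) are assumed from the docs.
import { PrismaClient, Difficulty, SubmissionStatus } from "@prisma/client";

const prisma = new PrismaClient();

// Fetch a user's accepted submissions on EASY problems.
async function acceptedEasySubmissions(userId: string) {
  return prisma.submission.findMany({
    where: {
      userId,
      status: SubmissionStatus.ACCEPTED,
      problem: { difficulty: Difficulty.EASY },
    },
    include: { problem: true },
  });
}
```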
This project uses npm workspaces for monorepo management:
```
Codexa-Server/
├── auth-service/       # Authentication & authorization service
├── utils-service/      # Email & file upload utilities
├── problem-service/    # Problem management service
├── code-service/       # Code execution service
├── ai-service/         # AI-powered features service
├── analytics-service/  # Analytics & tracking service
├── db-service/         # Shared database package
├── docker-compose.yml  # Docker orchestration
└── package.json        # Root workspace config
```
Root-level scripts:
- `npm run dev` - Start auth and code services concurrently
- `npm start --workspace=<service-name>` - Start a specific service
- `npm run build --workspace=<service-name>` - Build a specific service
```bash
cd db-service
npm run db:push   # Push schema changes
npm run db:studio # Open Prisma Studio
npm run generate  # Generate Prisma Client
```

Key environment variables needed:
- `DATABASE_URL` - PostgreSQL connection string
- `JWT_ACCESS_SECRET` - Secret for access tokens
- `JWT_REFRESH_SECRET` - Secret for refresh tokens
- `KAFKA_BROKERS` - Kafka broker addresses
- `SMTP_*` - Email configuration
- `CLOUDINARY_*` - Cloudinary credentials
- `CORS_ORIGIN` - Allowed CORS origins
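One optional pattern (not required by the repo) is to validate these variables at startup with Zod, which is already part of the stack; the sketch below covers only a subset of the variables listed above:

```typescript
// Optional env validation sketch using Zod; variable names follow the list above.
import { z } from "zod";

const envSchema = z.object({
  DATABASE_URL: z.string().min(1),
  JWT_ACCESS_SECRET: z.string().min(1),
  JWT_REFRESH_SECRET: z.string().min(1),
  KAFKA_BROKERS: z.string().min(1), // comma-separated broker list
  CORS_ORIGIN: z.string().min(1),
});

// Throws at startup if a required variable is missing.
export const env = envSchema.parse(process.env);
```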
This is an active development project with the following completed features:
- ✅ Microservices architecture setup
- ✅ User authentication and authorization
- ✅ Problem management system
- ✅ Database schema design
- ✅ Kafka-based event system
- ✅ Docker containerization
- ✅ Code execution pipeline (BullMQ)
- ✅ Analytics and tracking system
- ✅ AI integration for hints
- ✅ Email notification system
This is a learning project. Contributions are welcome!
ISC