16 changes: 11 additions & 5 deletions README.md
@@ -103,11 +103,17 @@ This project requires a Redis-compatible key-value store. You'll need to:
```

#### c. Database Setup
1. Apply the database migrations manually:
- Navigate to the `/migrations` folder in the project
- Execute each SQL migration file in sequential order against your Supabase database
- You can run these migrations using the Supabase SQL Editor or a PostgreSQL client
- Make sure to apply migrations in the correct order based on their timestamp prefixes
Apply the database migrations:
```bash
# Using Bun (recommended)
source .env.local && bun run scripts:apply-migrations

# Using npm
source .env.local && npm run scripts:apply-migrations
```
This runs all SQL files in `/migrations` in timestamp order. Already-applied migrations are automatically skipped.

Alternatively, you can apply migrations manually using the Supabase SQL Editor or a PostgreSQL client — just execute each file in `/migrations` in sequential order.
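For the PostgreSQL-client route, a loop like the following can serve as a sketch. It assumes `psql` is installed and `POSTGRES_URL_NON_POOLING` is exported (e.g. via `source .env.local`); the `apply_all` helper name is just for illustration:

```bash
# Sketch: apply every migration in timestamp (lexicographic) order.
# Assumes psql is available and POSTGRES_URL_NON_POOLING is set.
apply_all() {
  for f in migrations/*.sql; do
    echo "applying $f"
    psql "$POSTGRES_URL_NON_POOLING" --set ON_ERROR_STOP=1 -f "$f" || return 1
  done
}
```

Because the migration files carry timestamp prefixes, the shell's lexicographic glob order matches chronological order.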

#### d. Supabase Storage Setup
1. Go to Storage > Buckets
1 change: 1 addition & 0 deletions package.json
@@ -26,6 +26,7 @@
"scripts:check-e2e-env": "bun scripts/check-e2e-env.ts",
"scripts:check-all-env": "bun scripts:check-app-env && bun scripts:check-e2e-env",
"scripts:create-migration": "bun scripts/create-migration.ts",
"scripts:apply-migrations": "bun scripts/apply-migrations.ts",
"<<<<<<< Development": "",
"shad": "bunx shadcn@canary",
"test:dev:traffic": "vitest run src/__test__/development/traffic.test.ts",
48 changes: 48 additions & 0 deletions scripts/apply-migrations.ts
@@ -0,0 +1,48 @@
import postgres from 'postgres'
import { readdirSync, readFileSync } from 'fs'
import { join } from 'path'

const connectionString = process.env.POSTGRES_URL_NON_POOLING

if (!connectionString) {
  console.error('❌ POSTGRES_URL_NON_POOLING is not set')
  process.exit(1)
}

const sql = postgres(connectionString)

async function applyMigrations() {
  const migrationsDir = join(process.cwd(), 'migrations')
  const files = readdirSync(migrationsDir)
    .filter((f) => f.endsWith('.sql'))
    .sort()

  console.log(`Found ${files.length} migration(s) to apply:\n`)

  for (const file of files) {
    const filePath = join(migrationsDir, file)
    const content = readFileSync(filePath, 'utf-8')

    console.log(`⏳ Applying ${file}...`)
    try {
      await sql.unsafe(content)
      console.log(`✅ Applied ${file}`)
    } catch (err: any) {
      if (
        err.message.includes('already exists') ||
        err.message.includes('duplicate')
      ) {
Comment on lines +31 to +34

**P1**: Treating any “duplicate” error as an already-applied migration

The script skips a migration whenever the error message contains “already exists” or “duplicate”. That’s broader than “schema already applied”: data migrations can legitimately fail with `duplicate key value violates unique constraint` when inserting rows, and unique-index creation can fail due to real data conflicts. In those cases the migration has not been applied, but this code logs it as skipped and continues, leaving the database in a partially migrated state without surfacing the failure. Consider using a dedicated migrations table, or only skipping known idempotent DDL errors tied to the specific migration.

        console.log(`⏭️ Skipped ${file} (already applied)`)
      } else {
        console.error(`❌ Failed on ${file}: ${err.message}`)
        await sql.end()
        process.exit(1)
      }
    }
  }

  console.log('\n🎉 All migrations applied successfully!')
  await sql.end()
}

applyMigrations()
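The migrations-table approach raised in the review could be sketched as follows. The `schema_migrations` table name and the `pendingMigrations` helper are hypothetical, not part of this PR; the pure part of the logic — deciding which files still need to run — looks like this:

```typescript
// Hypothetical sketch of the reviewer's suggestion: record each applied
// migration in a table (e.g. CREATE TABLE schema_migrations
// (filename text PRIMARY KEY)) and run only the files not yet recorded,
// instead of matching error messages.
function pendingMigrations(files: string[], applied: Set<string>): string[] {
  return [...files]
    .filter((f) => f.endsWith('.sql'))
    .sort() // timestamp prefixes make lexicographic order chronological
    .filter((f) => !applied.has(f))
}
```

After each successful `sql.unsafe(content)`, the script would insert the filename into `schema_migrations`, so a rerun skips exactly the files that actually completed rather than anything that happens to raise a “duplicate” error.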