@ashshidiq23
Benchmark Comparison

| Metric | Before Optimization | After Optimization | Improvement |
|---|---|---|---|
| Execution Time | 1376.9 sec (22.95 min) | 58.8 sec (~1 min) | 23.4× faster (−95.7%) |
| Memory Usage | 626 MiB | 30 MiB | 20.9× lower (−95.2%) |

Before: [screenshot 2025-03-28: benchmark before optimization]

After: [screenshot 2025-04-04: benchmark after optimization]

Optimization Process

Identifying Bottlenecks

Initially, the dropout process fetched all enrollments into memory (`->get()`), then iterated over them one by one to check conditions and update records.
This approach:

  • Loaded too much data into memory at once.
  • Performed excessive queries, making the process inefficient.
  • Updated enrollments individually, increasing query overhead.
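For context, the original flow looked roughly like the following. This is a minimal sketch, not the actual code from the PR; the `Enrollment` and `Activity` models, the `shouldBeDroppedOut()` helper, and the column names are all assumptions for illustration:

```php
// Naive version: everything loaded into memory, row-by-row checks and writes.
$enrollments = Enrollment::where('status', 'active')->get(); // loads every row at once

foreach ($enrollments as $enrollment) {
    if ($enrollment->shouldBeDroppedOut()) {          // per-row check in PHP
        $enrollment->update(['status' => 'dropout']); // one UPDATE query per row
        Activity::create([                            // one INSERT query per row
            'enrollment_id' => $enrollment->id,
            'action'        => 'dropout',
        ]);
    }
}
```

With N matching enrollments this issues on the order of 2N queries and holds the full result set in memory, which is where both the 22-minute runtime and the 626 MiB footprint came from.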

Optimization Strategies

To improve performance, I implemented the following optimizations:

  1. Joining Tables and Filtering Data in SQL
    Moved the filtering logic into the database query, using a LEFT JOIN with NULL checks to avoid unnecessary PHP-level processing.

  2. Batch Processing with chunkById
    Instead of loading all data into memory, I used chunkById(), which reduced memory consumption and improved efficiency.

  3. Bulk Updates & Inserts
    Used whereIn('id', $ids)->update() to update enrollments in bulk rather than one by one in a foreach loop.

  4. Bulk-inserted activity logs with Activity::insert($activities), reducing query execution time.

  5. Hoisted the now() call into a $now variable to avoid repeated calls and keep the updated_at timestamp consistent within each batch.

  6. Garbage Collection (gc_collect_cycles())
    Forced garbage collection after each batch to free up memory.
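The six steps above can be sketched together as follows. This is an illustrative outline, not the PR's actual code: the table names, join condition, and columns are assumptions, and the batch size of 1000 is arbitrary:

```php
// Optimized version: SQL-side filtering, chunked reads, bulk writes.
$now = now(); // (5) single timestamp, consistent across all batches

Enrollment::query()
    // (1) filter in SQL with a LEFT JOIN + NULL check instead of in PHP
    ->leftJoin('attendances', 'attendances.enrollment_id', '=', 'enrollments.id')
    ->whereNull('attendances.id')
    ->select('enrollments.id')
    // (2) process in fixed-size batches so memory stays bounded
    ->chunkById(1000, function ($chunk) use ($now) {
        $ids = $chunk->pluck('id')->all();

        // (3) one UPDATE for the entire batch
        Enrollment::whereIn('id', $ids)->update([
            'status'     => 'dropout',
            'updated_at' => $now,
        ]);

        // (4) one INSERT for the entire batch
        $activities = array_map(fn ($id) => [
            'enrollment_id' => $id,
            'action'        => 'dropout',
            'created_at'    => $now,
        ], $ids);
        Activity::insert($activities);

        gc_collect_cycles(); // (6) free memory before the next batch
    }, 'enrollments.id', 'id');
```

Per batch this runs one SELECT, one UPDATE, and one INSERT, so total query count scales with the number of batches rather than the number of rows, which matches the ~23× speedup and ~21× memory reduction in the benchmark.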

