Claude/bulk import feature #29
Conversation
- Add strict type declarations to all service classes and models
- Implement constructor property promotion in controllers (PHP 8.0+)
- Refactor Service classes to use findOr() instead of manual null coalescing
- Update composer.json dependencies to latest compatible versions:
  - Laravel Framework: ^10.4 → ^10.48
  - Guzzle: ^7.2 → ^7.9
  - PHPUnit: ^9.5 → ^10.5
  - Other dev dependencies updated
- Update package.json dependencies to latest versions:
  - axios: ^0.25 → ^1.7.9
  - bootstrap: ^5.1.3 → ^5.3.3
  - sass: ^1.32.11 → ^1.83.4
  - All other packages updated to latest compatible versions

This refactoring improves code quality, type safety, and brings dependencies up to date with the latest security patches.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
- Remove outdated composer.lock file
- Update GitHub Actions workflow to use 'composer update' instead of 'composer install'
- This ensures dependencies are installed with the latest composer.json changes

The composer.lock file will be regenerated during CI execution with the updated dependencies.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
This reverts commit 39e0964.
- Add 'composer update --lock' before 'composer install' in CI workflow
- This updates composer.lock to match composer.json changes without changing package versions
- Reverted previous commit that deleted composer.lock (keeping lock files is best practice)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Implemented bulk create, update, and delete operations for SquidUsers via CSV file upload.

New Features:
- CSV file upload interface for bulk operations
- Three operation types: Create, Update, Delete
- Transaction-based processing with rollback on failure
- Detailed success/error reporting for each operation

New Files:
- app/Http/Requests/SquidUser/BulkImportRequest.php
  - Validates CSV file uploads (max 10MB, .csv/.txt)
  - Parses CSV into associative arrays
- app/UseCases/SquidUser/BulkCreateAction.php
  - Creates multiple SquidUsers from CSV data
  - Required fields: user, password, enabled
- app/UseCases/SquidUser/BulkUpdateAction.php
  - Updates existing users matched by username
  - Only updates non-empty fields
- app/UseCases/SquidUser/BulkDeleteAction.php
  - Deletes users by username
- resources/views/squidusers/bulk_import.blade.php
  - User-friendly upload interface
  - CSV format instructions and examples
  - Success/error result display

Updated Files:
- app/Http/Controllers/Gui/SquidUserController.php
  - Added bulkImporter() and bulkImport() methods
- resources/views/squidusers/search.blade.php
  - Added "Bulk Import (CSV)" button
- routes/web.php
  - Added routes for bulk import functionality

CSV Format:
- Header: user,password,enabled,fullname,comment
- All operations use username as the primary identifier

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
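Given the header described in the commit message, an uploaded file might look like this (the data rows below are hypothetical examples, not taken from the PR):

```csv
user,password,enabled,fullname,comment
alice,Secr3tPass1,1,Alice Example,Created via bulk import
bob,AnotherPass2,0,Bob Example,Disabled account
```

For update and delete operations, only the `user` column is strictly required to identify the row's target.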
Pull Request Overview
This PR introduces a bulk import feature for SquidUsers via CSV upload, allowing create, update, and delete operations. It also includes dependency updates and refactoring of service layer methods.
- Adds CSV bulk import functionality with three operations (create/update/delete)
- Updates PHP and JavaScript dependencies to newer versions
- Refactors service methods to use modern Laravel patterns
Reviewed Changes
Copilot reviewed 16 out of 16 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| routes/web.php | Adds two new routes for bulk import UI and processing |
| resources/views/squidusers/search.blade.php | Updates UI to include bulk import button alongside create button |
| resources/views/squidusers/bulk_import.blade.php | New view providing bulk import form with CSV format instructions |
| package.json | Updates JavaScript dependencies to newer versions |
| composer.json | Updates PHP dependencies to newer versions |
| app/UseCases/SquidUser/BulkUpdateAction.php | Implements bulk update logic with transaction support |
| app/UseCases/SquidUser/BulkDeleteAction.php | Implements bulk delete logic with transaction support |
| app/UseCases/SquidUser/BulkCreateAction.php | Implements bulk create logic with transaction support |
| app/Services/UserService.php | Refactors getById to use findOr method with type hints |
| app/Services/SquidUserService.php | Refactors getById to use findOr method with type hints |
| app/Services/SquidAllowedIpService.php | Refactors getById to use findOr method with type hints |
| app/Models/User.php | Adds type hints to password setter method |
| app/Http/Requests/SquidUser/BulkImportRequest.php | New form request for bulk import validation and CSV parsing |
| app/Http/Controllers/Gui/UserController.php | Refactors constructor to use promoted properties |
| app/Http/Controllers/Gui/SquidUserController.php | Adds bulk import methods and refactors constructor |
| .github/workflows/laravel.yml | Adds composer lock update before install |
```php
public function parseCsv(): array
{
    $file = $this->file('csv_file');
    $handle = fopen($file->getRealPath(), 'r');
```
Copilot AI (Oct 25, 2025):
The file handle from fopen is not checked for failure. If fopen fails, subsequent operations on a false handle will cause errors. Add error handling after opening the file.
Suggested change:

```php
$handle = fopen($file->getRealPath(), 'r');
if ($handle === false) {
    throw new \RuntimeException('Unable to open CSV file for reading.');
}
```
```php
    $rows = [];
    $header = null;

    while (($data = fgetcsv($handle, 1000, ',')) !== false) {
```
Copilot AI (Oct 25, 2025):
The hardcoded buffer size of 1000 bytes is too small for CSV rows with many or large fields, potentially truncating data. Consider increasing to a larger value like 10000 or using 0 for no limit.
Suggested change:

```php
    while (($data = fgetcsv($handle, 0, ',')) !== false) {
```
```php
DB::beginTransaction();

try {
    foreach ($rows as $index => $row) {
        try {
            $squidUser = SquidUser::query()
                ->where('user', $row['user'] ?? '')
                ->where('user_id', $userId)
                ->first();

            if (!$squidUser) {
                $results['failed']++;
                $results['errors'][] = "Row " . ($index + 2) . ": User '{$row['user']}' not found";
                continue;
            }
```
Copilot AI (Oct 25, 2025):
The transaction is committed even when individual row operations fail. According to the documentation in bulk_import.blade.php line 92, 'if one fails, all will be rolled back', but the code continues processing and commits the transaction despite failures. Consider rolling back when any failure occurs, or update the documentation to match the actual partial-success behavior.
```php
                $results['failed']++;
                $results['errors'][] = "Row " . ($index + 2) . ": User '{$row['user']}' not found";
                continue;
            }

            $squidUser->delete();
            $results['success']++;
        } catch (\Exception $e) {
            $results['failed']++;
            $results['errors'][] = "Row " . ($index + 2) . ": " . $e->getMessage();
        }
    }

    DB::commit();
} catch (\Exception $e) {
    DB::rollBack();
    throw $e;
```
Copilot AI (Oct 25, 2025):
The transaction is committed even when individual row operations fail. According to the documentation in bulk_import.blade.php line 92, 'if one fails, all will be rolled back', but the code continues processing and commits the transaction despite failures. Consider rolling back when any failure occurs, or update the documentation to match the actual partial-success behavior.
Suggested change:

```php
                throw new \Exception("Row " . ($index + 2) . ": User '{$row['user']}' not found");
            }

            $squidUser->delete();
            $results['success']++;
        } catch (\Exception $e) {
            throw $e;
        }
    }

    DB::commit();
} catch (\Exception $e) {
    DB::rollBack();
    $results['failed'] = count($rows);
    $results['errors'][] = $e->getMessage();
    return $results;
```
```php
        } catch (\Exception $e) {
            $results['failed']++;
            $results['errors'][] = "Row " . ($index + 2) . ": " . $e->getMessage();
```
Copilot AI (Oct 25, 2025):
The transaction is committed even when individual row operations fail. According to the documentation in bulk_import.blade.php line 92, 'if one fails, all will be rolled back', but the code continues processing and commits the transaction despite failures. Consider rolling back when any failure occurs, or update the documentation to match the actual partial-success behavior.
Suggested change:

```php
        } catch (\Exception $e) {
            DB::rollBack();
            $results['failed']++;
            $results['errors'][] = "Row " . ($index + 2) . ": " . $e->getMessage();
            throw $e;
```
Fixed issues identified in code review:

1. BulkImportRequest.php:
   - Fixed authorize() method to use authenticated user's ID instead of non-existent route parameter
   - Enhanced CSV parsing with proper error handling
   - Added validation for empty files and column count mismatches
   - Improved trimming of CSV data
   - Added try-finally block to ensure file handle is always closed

2. Transaction handling in all Bulk*Action classes:
   - Changed from partial rollback to all-or-nothing approach
   - If any row fails, entire transaction is rolled back
   - More consistent with ACID principles
   - Better error reporting

Changes:
- BulkImportRequest: authorize() now uses $this->user()->id
- BulkImportRequest: parseCsv() enhanced with error handling
- BulkCreateAction: Simplified transaction logic
- BulkUpdateAction: Simplified transaction logic
- BulkDeleteAction: Simplified transaction logic

All operations now follow atomic transaction pattern:
- Success: All rows processed and committed
- Failure: All changes rolled back on first error

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
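The all-or-nothing pattern this commit describes can be sketched roughly as follows. This is a simplified, hypothetical version of the delete action, not the PR's exact code; the function name and the assumption that `SquidUser` rows are scoped by `user_id` come from the diff excerpts above:

```php
<?php

declare(strict_types=1);

use Illuminate\Support\Facades\DB;

// Sketch of the atomic bulk-delete pattern: the first failing row
// throws, the transaction rolls back, and the result then reports
// zero successes and every row as failed.
function bulkDeleteSketch(array $rows, int $userId): array
{
    $results = ['success' => 0, 'failed' => 0, 'errors' => []];

    DB::beginTransaction();
    try {
        foreach ($rows as $index => $row) {
            $squidUser = SquidUser::query()
                ->where('user', $row['user'] ?? '')
                ->where('user_id', $userId)
                ->first();

            if (!$squidUser) {
                // CSV data starts at line 2 (line 1 is the header).
                throw new \Exception("Row " . ($index + 2) . ": User '{$row['user']}' not found");
            }

            $squidUser->delete();
            $results['success']++;
        }

        DB::commit();
    } catch (\Exception $e) {
        DB::rollBack();
        // The rollback undoes every delete, so no row actually succeeded.
        $results['success'] = 0;
        $results['failed'] = count($rows);
        $results['errors'][] = $e->getMessage();
    }

    return $results;
}
```

The same shape applies to the create and update actions; only the per-row body differs.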
Addressed Copilot review comment:
- Increased CSV buffer size from 1000 to 0 (unlimited) in fgetcsv()
- This prevents data truncation for CSV rows with many or large fields
- The hardcoded 1000-byte buffer was too small and could cause incomplete data reads

Changes:
- BulkImportRequest.php: fgetcsv($handle, 1000, ',') → fgetcsv($handle, 0, ',')

Using 0 as the buffer size allows reading CSV rows of any length without truncation.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Pull Request Overview
Copilot reviewed 16 out of 16 changed files in this pull request and generated 3 comments.
```php
        DB::commit();
    } catch (\Exception $e) {
        DB::rollBack();
        $results['failed'] = count($rows) - $results['success'];
```
Copilot AI (Oct 25, 2025):
The failed count calculation is incorrect. If an exception occurs during the first iteration, $results['success'] is 0, but count($rows) includes all rows. The correct failed count should be count($rows) since the entire transaction is rolled back, meaning zero rows were successfully processed.
Suggested change:

```php
        $results['failed'] = count($rows);
```
```php
        DB::commit();
    } catch (\Exception $e) {
        DB::rollBack();
        $results['failed'] = count($rows) - $results['success'];
```
Copilot AI (Oct 25, 2025):
The failed count calculation is incorrect. Since the transaction is rolled back, all rows fail to update. The failed count should be count($rows), not count($rows) - $results['success'].
Suggested change:

```php
        $results['failed'] = count($rows);
```
```php
        DB::commit();
    } catch (\Exception $e) {
        DB::rollBack();
        $results['failed'] = count($rows) - $results['success'];
```
Copilot AI (Oct 25, 2025):
The failed count calculation is incorrect. When a transaction is rolled back, all operations fail, so the failed count should be count($rows), not count($rows) - $results['success'].
Suggested change:

```php
        $results['failed'] = count($rows);
```
Addressed Copilot review comments on failed count logic:

When a transaction is rolled back, ALL rows fail (not just the remaining ones). The previous calculation was incorrect:
- ❌ Before: failed = count($rows) - $results['success']
- ✅ After: failed = count($rows), success = 0

Changes in all Bulk*Action classes:
- BulkCreateAction.php: Fixed failed count on rollback
- BulkUpdateAction.php: Fixed failed count on rollback
- BulkDeleteAction.php: Fixed failed count on rollback

When a transaction rolls back:
- All previously "successful" operations are also undone
- Therefore, success count = 0 and failed count = total rows
- This correctly reflects the all-or-nothing transaction behavior

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Pull Request Overview
Copilot reviewed 16 out of 16 changed files in this pull request and generated 2 comments.
```php
            if (isset($row['password']) && !empty($row['password'])) {
                $squidUser->password = $row['password'];
            }
            if (isset($row['enabled'])) {
```
Copilot AI (Oct 25, 2025):
The 'enabled' field should not be updated when it's an empty string. Add a check for !empty($row['enabled']) similar to the password field to prevent setting enabled to an empty string value.
Suggested change:

```php
            if (isset($row['enabled']) && !empty($row['enabled'])) {
```
```php
            if (isset($row['fullname'])) {
                $squidUser->fullname = $row['fullname'];
            }
            if (isset($row['comment'])) {
```
Copilot AI (Oct 25, 2025):
Empty strings in 'fullname' and 'comment' fields will overwrite existing values. According to the view's documentation stating 'Empty fields will not be updated', add !empty() checks to these conditions to prevent unintentional data loss.
Suggested change:

```php
            if (isset($row['fullname']) && !empty($row['fullname'])) {
                $squidUser->fullname = $row['fullname'];
            }
            if (isset($row['comment']) && !empty($row['comment'])) {
```
No description provided.