RBAC: Optimize ExpireRoles Command (Follow-up to PR #118) #119

@kevalyq

Description

Context

This is a follow-up issue to PR #118 (RBAC Phase 2), based on Copilot review comments left after the merge. The ExpireRoles command is functional and tested, but several optimizations would benefit production use.

Copilot Review Comments

1. Code Style Consistency (Nitpick)

File: app/Console/Commands/ExpireRoles.php:10

The declare(strict_types=1); directive is inconsistent with other console commands in the codebase. Other commands like GenerateTenantCommand, RotateDekCommand, and RotateKekCommand don't include this directive.

Action: Remove for consistency (or add to all commands project-wide)

2. Performance: Single Transaction vs. Per-Item Transactions

File: app/Console/Commands/ExpireRoles.php:53-76

Opening a separate database transaction for each expired role adds per-item commit overhead and can be inefficient when many roles expire at once. Consider wrapping the entire foreach loop in a single transaction, or batch-processing the deletions.

Suggested approach:

DB::transaction(function () use ($expired, &$count) {
    foreach ($expired as $assignment) {
        // Log and delete operations here
        $count++;
    }
});

3. Memory Efficiency: Chunk/Cursor Processing

File: app/Console/Commands/ExpireRoles.php:41-75

Loading all expired roles into memory at once with get() can exhaust memory when there are many expired roles to process.

Suggested approach:

TemporalRoleUser::expired()->chunk(100, function ($expired) use (&$count) {
    foreach ($expired as $assignment) {
        DB::transaction(function () use ($assignment, &$count) {
            // ... existing logic
        });
    }
});

4. Race Condition: Concurrent Scheduler Execution

File: app/Console/Commands/ExpireRoles.php:55-74

Potential race condition if multiple scheduler instances run simultaneously: if two processes fetch the same expired roles before either deletes them, both will create audit log entries, resulting in duplicate audit logs.

Suggested approach:

DB::transaction(function () use ($assignment, &$count) {
    $deleted = DB::table('model_has_roles')
        ->where('model_type', $assignment->model_type)
        ->where('model_id', $assignment->model_id)
        ->where('role_id', $assignment->role_id)
        ->where('tenant_id', $assignment->tenant_id)
        ->delete();
    
    // Only log if we actually deleted something
    if ($deleted > 0) {
        RoleAssignmentLog::create([...]);
        $count++;
    }
});

Or use lockForUpdate() on the query.
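A sketch of the lockForUpdate() alternative (assuming the TemporalRoleUser model and expired() scope from PR #118, and that the model has a usable primary key): re-fetch the row under a pessimistic lock inside the transaction, so a concurrent scheduler instance blocks until the first commits and then sees nothing to process.

```php
DB::transaction(function () use ($assignment, &$count) {
    // Re-fetch the row under a pessimistic lock; a concurrent process
    // blocks here until this transaction commits or rolls back.
    $locked = TemporalRoleUser::expired()
        ->whereKey($assignment->getKey())
        ->lockForUpdate()
        ->first();

    // Another scheduler instance already deleted it: skip logging.
    if ($locked === null) {
        return;
    }

    $locked->delete();
    RoleAssignmentLog::create([...]); // log only after a confirmed delete
    $count++;
});
```

The delete-first approach in the snippet above is simpler and avoids holding row locks; lockForUpdate() is mainly useful if the audit log must be written from the still-loaded model attributes before deletion.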

Proposed Solution

Phase 1: Address Critical Issues (Race Condition + Memory)

  1. Reverse audit log order: Delete first, then log only if deleted > 0
  2. Use chunk() processing: Process in batches of 100 to avoid memory issues
  3. Single transaction per chunk: Wrap each chunk in one transaction
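Taken together, the three Phase 1 steps could look roughly like this (a sketch only; it assumes the TemporalRoleUser model, expired() scope, and RoleAssignmentLog from PR #118):

```php
$count = 0;

TemporalRoleUser::expired()->chunkById(100, function ($expired) use (&$count) {
    // One transaction per chunk keeps commits cheap without holding
    // a single giant transaction open for the whole run.
    DB::transaction(function () use ($expired, &$count) {
        foreach ($expired as $assignment) {
            // Delete first; the affected-row count tells us whether
            // this process actually removed the assignment.
            $deleted = DB::table('model_has_roles')
                ->where('model_type', $assignment->model_type)
                ->where('model_id', $assignment->model_id)
                ->where('role_id', $assignment->role_id)
                ->where('tenant_id', $assignment->tenant_id)
                ->delete();

            // Log only on a confirmed delete, so concurrent runs cannot
            // both write an audit entry for the same row.
            if ($deleted > 0) {
                RoleAssignmentLog::create([...]);
                $count++;
            }
        }
    });
});
```

Note the use of chunkById() rather than chunk(): when rows are deleted while chunking, plain chunk() can skip records as the result pages shift, whereas chunkById() pages by primary key and is safe to combine with deletion.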

Phase 2: Style & Performance

  1. Remove declare(strict_types=1): Align with other commands
  2. Add comprehensive tests: Test chunk processing, race conditions, and performance with large datasets
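One of the new tests could cover the duplicate-audit-log guard directly. A sketch (all names here are hypothetical: the rbac:expire-roles signature, the role_assignment_logs table, and the expired() factory state are assumptions, not confirmed by PR #118):

```php
public function test_no_audit_log_when_row_already_deleted(): void
{
    $assignment = TemporalRoleUser::factory()->expired()->create();

    // Simulate a concurrent scheduler instance winning the race:
    // the pivot row is gone before our command processes it.
    DB::table('model_has_roles')
        ->where('model_id', $assignment->model_id)
        ->where('role_id', $assignment->role_id)
        ->delete();

    $this->artisan('rbac:expire-roles')->assertSuccessful();

    // The command must not log an expiration it did not perform.
    $this->assertDatabaseCount('role_assignment_logs', 0);
}
```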

Acceptance Criteria

  • No memory issues when processing 1000+ expired roles
  • No duplicate audit logs on concurrent execution
  • Code style consistent with other commands
  • All existing tests pass (10 tests, 31 assertions)
  • New tests for chunk processing and race conditions
  • PHPStan Level 9 compliant
  • Pint compliant

Priority

Medium - The current implementation is functional and tested, but these optimizations matter for production scalability and data integrity.
