
Conversation

@mahlau-flex mahlau-flex commented Nov 26, 2025

Fixed the gradient computation in smoothed_projection for beta=inf. The problem was that the existing tanh_projection did not handle beta=inf properly, producing NaN/inf gradients.
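For reference, a minimal sketch of the failure mode, assuming the standard tanh projection formula (`tanh_formula` and `step_formula` are hypothetical names for illustration, not the tidy3d implementation):

```python
import autograd.numpy as np
from autograd import grad

def tanh_formula(x, beta=np.inf, eta=0.5):
    # Standard tanh projection; at beta=inf the chain rule multiplies a
    # vanishing sech^2 term by an infinite beta, i.e. 0 * inf = nan.
    num = np.tanh(beta * eta) + np.tanh(beta * (x - eta))
    den = np.tanh(beta * eta) + np.tanh(beta * (1 - eta))
    return num / den

def step_formula(x, eta=0.5):
    # Step function used by the fix: the branches are constants, so
    # autograd returns a zero gradient instead of nan.
    return np.where(x > eta, 1.0, 0.0)

print(grad(tanh_formula)(0.3))  # nan
print(grad(step_formula)(0.3))  # 0.0 (autograd may warn that the output is constant)
```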

Greptile Overview

Greptile Summary

Fixed the gradient computation in tanh_projection when beta=inf by adding an explicit step-function case using np.where, preventing the NaN/inf gradients produced by the previous tanh formula. The fix enables proper gradient-based optimization in topology design with fully binarized projections.

Key changes:

  • Added a beta == np.inf special case in tanh_projection (projections.py:70-71) that returns the step function np.where(array > eta, 1.0, 0.0); a sketch follows this list
  • Refactored tests to use a circle_2d_factory pytest fixture, eliminating code duplication
  • Added a comprehensive parametrized test, test_projection_gradient_correctness, using autograd's check_grads to verify forward- and reverse-mode gradients up to 2nd order
  • Test coverage now includes the beta=np.inf case across multiple array sizes, radii, and smoothing parameters
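
A sketch of what the fixed function could look like, assuming an (array, beta, eta) signature; the beta == 0 and finite-beta branches follow the behavior shown in the sequence diagram below:

```python
import autograd.numpy as np

def tanh_projection(array, beta=1.0, eta=0.5):
    if beta == 0:
        return array  # no projection
    if beta == np.inf:
        # New special case: a hard step function. np.where is safe under
        # autograd, whereas the tanh formula yields nan/inf gradients here.
        # Note that values exactly equal to eta map to 0.0.
        return np.where(array > eta, 1.0, 0.0)
    num = np.tanh(beta * eta) + np.tanh(beta * (array - eta))
    den = np.tanh(beta * eta) + np.tanh(beta * (1 - eta))
    return num / den
```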

Confidence Score: 4/5

  • This PR is safe to merge, with a minor consideration about edge-case documentation.
  • The fix correctly addresses the gradient computation issue for beta=inf by using a step function instead of the tanh formula, which is numerically unstable at infinite beta. Comprehensive test coverage validates gradients across multiple scenarios, including the beta=inf case. The only minor concern is documentation clarity around the array == eta edge case (the step function maps values exactly equal to eta to 0.0).
  • No files require special attention; the implementation is sound and well tested.

Important Files Changed

File Analysis

| Filename | Score | Overview |
| --- | --- | --- |
| tidy3d/plugins/autograd/invdes/projections.py | 4/5 | Added special-case handling for beta=np.inf in tanh_projection, preventing NaN/inf gradients by using a step function |
| tests/test_plugins/autograd/invdes/test_projections.py | 5/5 | Refactored tests to use a fixture for circle generation; added comprehensive gradient-correctness tests with parametrized beta=np.inf cases |
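
The gradient-correctness test could look roughly like the sketch below (assuming tanh_projection's (array, beta, eta) signature; the real test also sweeps array sizes and circle radii via the circle_2d_factory fixture):

```python
import autograd.numpy as np
import pytest
from autograd.test_util import check_grads

from tidy3d.plugins.autograd.invdes.projections import tanh_projection

@pytest.mark.parametrize("beta", [1.0, 10.0, np.inf])
@pytest.mark.parametrize("eta", [0.4, 0.5])
def test_projection_gradient_correctness(beta, eta):
    array = np.random.RandomState(0).uniform(0.2, 0.8, size=(8, 8))
    # Check forward- and reverse-mode gradients up to 2nd order.
    fn = lambda x: tanh_projection(x, beta=beta, eta=eta)
    check_grads(fn, modes=["fwd", "rev"], order=2)(array)
```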

Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant smoothed_projection
    participant tanh_projection
    participant autograd

    User->>smoothed_projection: call with array, beta=inf, eta
    smoothed_projection->>tanh_projection: compute original_projected

    alt beta == 0
        tanh_projection-->>smoothed_projection: return array (unchanged)
    else beta == inf (NEW FIX)
        tanh_projection->>tanh_projection: np.where(array > eta, 1.0, 0.0)
        tanh_projection-->>smoothed_projection: return step function
    else finite beta
        tanh_projection->>tanh_projection: compute tanh formula
        tanh_projection-->>smoothed_projection: return smooth projection
    end

    smoothed_projection->>smoothed_projection: compute gradients & smoothing
    smoothed_projection->>tanh_projection: compute rho_minus_eff_projected
    smoothed_projection->>tanh_projection: compute rho_plus_eff_projected
    smoothed_projection->>smoothed_projection: blend with polynomial weights
    smoothed_projection-->>User: return smoothed result

    User->>autograd: compute gradient
    autograd->>smoothed_projection: backpropagate
    smoothed_projection->>tanh_projection: gradient through step function (when beta=inf)
    Note over tanh_projection,autograd: np.where has zero gradient<br/>(doesn't cause NaN/inf)
    autograd-->>User: return finite gradients
```
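An end-to-end check mirroring the diagram's flow might look like this (hypothetical usage: the real smoothed_projection may take additional arguments such as a grid spacing, and its exact signature should be checked against projections.py):

```python
import autograd.numpy as np
from autograd import grad

from tidy3d.plugins.autograd.invdes.projections import smoothed_projection

density = np.linspace(0.05, 0.95, 64).reshape(8, 8)
# With the fix, a fully binarized projection still yields finite gradients.
g = grad(lambda x: np.sum(smoothed_projection(x, beta=np.inf, eta=0.5)))(density)
assert np.all(np.isfinite(g))
```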

@mahlau-flex mahlau-flex requested review from yaugenst-flex and removed request for yaugenst-flex November 26, 2025 11:28

@greptile-apps greptile-apps bot left a comment


2 files reviewed, 1 comment



@yaugenst-flex yaugenst-flex left a comment


LGTM!

@mahlau-flex mahlau-flex force-pushed the FXC-4339-Fix-gradient-for-smoothed-projection branch from b9f06c4 to 1f8a666 November 26, 2025 11:45
@mahlau-flex mahlau-flex force-pushed the FXC-4339-Fix-gradient-for-smoothed-projection branch from 1f8a666 to c7dedc0 November 26, 2025 11:49
@mahlau-flex mahlau-flex disabled auto-merge November 26, 2025 11:58
@mahlau-flex mahlau-flex force-pushed the FXC-4339-Fix-gradient-for-smoothed-projection branch from c7dedc0 to 3478ad3 November 26, 2025 12:05
@github-actions

Diff Coverage

Diff: origin/develop...HEAD, staged and unstaged changes

  • tidy3d/plugins/autograd/invdes/projections.py (100%)

Summary

  • Total: 2 lines
  • Missing: 0 lines
  • Coverage: 100%

@mahlau-flex mahlau-flex added this pull request to the merge queue Nov 26, 2025
Merged via the queue into develop with commit 0f46e40 Nov 26, 2025
19 checks passed
@mahlau-flex mahlau-flex deleted the FXC-4339-Fix-gradient-for-smoothed-projection branch November 26, 2025 13:01
