chore(invdes): fixed smoothed projection gradient with beta=inf #3036
Fixed the gradient computation in `smoothed_projection` for `beta=inf`. The problem was that the existing `tanh_projection` did not handle `beta=inf` properly.
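For context, a minimal sketch of the described fix, assuming the standard tanh projection formula used in topology optimization (the `beta == np.inf` branch and `np.where` step function come from the summary below; the exact signature and formula in `projections.py` are assumptions):

```python
import autograd.numpy as np

def tanh_projection(array, beta, eta=0.5):
    """Project a density array toward {0, 1}; sketch of the beta=inf fix."""
    if beta == 0:
        # no projection: return the density unchanged
        return array
    if beta == np.inf:
        # NEW: explicit step function. Evaluating the tanh formula at
        # beta=inf yields nan/inf, which poisons the gradient; np.where
        # gives a zero (but finite) gradient instead.
        return np.where(array > eta, 1.0, 0.0)
    # standard smooth projection for finite beta
    num = np.tanh(beta * eta) + np.tanh(beta * (array - eta))
    denom = np.tanh(beta * eta) + np.tanh(beta * (1.0 - eta))
    return num / denom
```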
Greptile Overview
Greptile Summary
Fixed gradient computation in `tanh_projection` when `beta=inf` by adding an explicit step function case using `np.where`, preventing NaN/inf gradients that occurred with the previous tanh formula. The fix enables proper gradient-based optimization in topology design with completely binarized projections.

Key changes:

- Added a `beta == np.inf` special case in `tanh_projection` (projections.py:70-71) returning the step function `np.where(array > eta, 1.0, 0.0)`
- Added a `circle_2d_factory` pytest fixture, eliminating code duplication
- Added `test_projection_gradient_correctness` using autograd's `check_grads` to verify forward- and reverse-mode gradients up to 2nd order (a sketch follows this list)
- Covers the `beta=np.inf` case across multiple array sizes, radii, and smoothing parameters
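A hedged sketch of what such a `check_grads`-based test could look like; the fixture and parameter sweep are elided, and `step_projection` stands in for `tanh_projection` at `beta=np.inf`:

```python
import autograd.numpy as np
from autograd.test_util import check_grads

def step_projection(array, eta=0.5):
    # the beta=np.inf limit of tanh_projection: a plain step function
    return np.where(array > eta, 1.0, 0.0)

def test_projection_gradient_correctness():
    # illustrative input; the real test sweeps array sizes, radii, and
    # smoothing parameters via the circle_2d_factory fixture
    rho = np.linspace(0.0, 1.0, 16)

    def objective(x):
        return np.sum(step_projection(x))

    # verify forward- and reverse-mode gradients up to 2nd order; the
    # gradient is zero almost everywhere but must be finite, whereas
    # the old tanh formula produced NaN/inf at beta=inf
    check_grads(objective, modes=["fwd", "rev"], order=2)(rho)
```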
Confidence Score: 4/5

The PR correctly fixes the gradient computation for `beta=inf` by using a step function instead of the tanh formula that causes numerical instability. Comprehensive test coverage validates gradients across multiple scenarios, including the `beta=inf` case. The only minor concern is documentation clarity around the `array == eta` edge case behavior.

Important Files Changed
File Analysis
- projections.py: added a special case for `beta=np.inf` in `tanh_projection` to prevent NaN/inf gradients by using a step function
- test suite: added coverage for `beta=np.inf` cases

Sequence Diagram
```mermaid
sequenceDiagram
    participant User
    participant smoothed_projection
    participant tanh_projection
    participant autograd
    User->>smoothed_projection: call with array, beta=inf, eta
    smoothed_projection->>tanh_projection: compute original_projected
    alt beta == 0
        tanh_projection-->>smoothed_projection: return array (unchanged)
    else beta == inf (NEW FIX)
        tanh_projection->>tanh_projection: np.where(array > eta, 1.0, 0.0)
        tanh_projection-->>smoothed_projection: return step function
    else finite beta
        tanh_projection->>tanh_projection: compute tanh formula
        tanh_projection-->>smoothed_projection: return smooth projection
    end
    smoothed_projection->>smoothed_projection: compute gradients & smoothing
    smoothed_projection->>tanh_projection: compute rho_minus_eff_projected
    smoothed_projection->>tanh_projection: compute rho_plus_eff_projected
    smoothed_projection->>smoothed_projection: blend with polynom weights
    smoothed_projection-->>User: return smoothed result
    User->>autograd: compute gradient
    autograd->>smoothed_projection: backpropagate
    smoothed_projection->>tanh_projection: gradient through step function (when beta=inf)
    Note over tanh_projection,autograd: np.where has zero gradient<br/>(doesn't cause NaN/inf)
    autograd-->>User: return finite gradients
```
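To mirror the tail of the diagram, a small illustrative check (assumed names, not the library API) that autograd returns finite gradients through the `beta=inf` step projection; the `projected * rho` blend is a stand-in for the polynomial blending in `smoothed_projection`, so the output depends differentiably on the input:

```python
import autograd.numpy as np
from autograd import grad

def projected_area(rho, eta=0.5):
    # step projection at beta=inf, blended with the raw density so the
    # output actually depends on rho (as smoothed_projection does)
    projected = np.where(rho > eta, 1.0, 0.0)
    return np.sum(projected * rho)

rho = np.linspace(0.0, 1.0, 8)
g = grad(projected_area)(rho)
print(np.all(np.isfinite(g)))  # True: gradients are finite, not NaN/inf
```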