
feat: add configurable gradient threshold to ABF and ABFPlusPlus (A1)#40

Merged
csparker247 merged 1 commit into develop from feat/a1-configurable-gradient-threshold on Mar 16, 2026

Conversation

@csparker247 (Member)

Summary

  • Adds setGradientThreshold(T) method to both ABF and ABFPlusPlus alongside the existing setMaxIterations()
  • Adds gradThreshold parameter (defaulting to T(0.001)) to the full static Compute() overload on both classes
  • Replaces the hardcoded 0.001 literal in the solver's while loop with the configurable parameter
  • Fully backward-compatible: default value preserves existing behavior

Test plan

  • All existing tests pass unchanged (10/10 parameterization tests pass)
  • ABFPlusPlus_TightThreshold: tightening threshold to 1e-6 produces a lower final gradient than the default
  • ABFPlusPlus_LooseThreshold: threshold of 1e6 (above initial gradient) causes zero solver iterations

Closes #15

🤖 Generated with Claude Code

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@csparker247 csparker247 merged commit 598035e into develop Mar 16, 2026
7 checks passed
@csparker247 csparker247 deleted the feat/a1-configurable-gradient-threshold branch March 16, 2026 15:02


Development

Successfully merging this pull request may close these issues.

[A1] Make convergence tolerance configurable in ABF and ABFPlusPlus
