
Restore #460 seed-removal semantics after #339 (#477)

Merged: CompRhys merged 2 commits into main from restore-460-seed-removal on Mar 1, 2026
Conversation

@janosh (Collaborator) commented Mar 1, 2026

  • restore the intentional API behavior from #460 (Introduce PRNG to SimState and add reproducibility docs): the NVE/NVT/NPT init functions do not take a seed kwarg and instead use state.rng
  • switch integrator init paths back to initialize_momenta(..., generator=state.rng) and remove the reintroduced calculate_momenta helper
  • update affected examples/tutorials and test helper to follow state-bound RNG seeding (state.rng = ...)
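The state-bound seeding pattern described above can be sketched as follows. Note that `SimState` and `initialize_momenta` here are illustrative stand-ins mimicking the torch_sim names mentioned in the PR; the real signatures may differ.

```python
import torch

# Minimal sketch, assuming a SimState-like object that carries its own PRNG
# (per #460): seeding goes through `state.rng`, not a `seed` kwarg.
class SimState:
    def __init__(self) -> None:
        self.rng = torch.Generator()  # PRNG bound to the state

def initialize_momenta(n_atoms: int, generator: torch.Generator) -> torch.Tensor:
    # Draw momenta from the state's generator instead of an ad-hoc seed kwarg.
    return torch.randn(n_atoms, 3, generator=generator)

state = SimState()
state.rng = torch.Generator().manual_seed(42)  # state-bound seeding
p1 = initialize_momenta(8, generator=state.rng)

state.rng = torch.Generator().manual_seed(42)  # reseeding reproduces the draws
p2 = initialize_momenta(8, generator=state.rng)
assert torch.equal(p1, p2)
```

Keeping the generator on the state means every integrator init path draws from one reproducible stream, rather than each init function managing its own seed.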

Context

These are surgical restorations of intentional changes from #460 that were unintentionally lost during the large ty-fixing merge in #339.

@CompRhys sorry about this oversight on my side while polishing the #339 LLM type fixes. I think this surgical restore might be easier to review than the broader revert path in #476.

- remove reintroduced seed kwargs from NVE/NVT/NPT init APIs and route seeding through state.rng again
- restore initialize_momenta usage in integrator init paths and drop calculate_momenta helper reintroduced by #339 edits
- update affected examples/tutorial and testing helper to use state-bound RNG behavior
@janosh janosh requested a review from CompRhys March 1, 2026 10:49
@thomasloux (Collaborator) commented:
Overall this looks good to me, but I haven't verified that it covers every place a seed is used.

Revert DOF aggregation in constraint DOF helper back to per-system tensor behavior and align tests with explicit per-system assertions.
@thomasloux (Collaborator) commented:
@janosh I actually added count_degrees_of_freedom when I added constraints. To be honest, this function is not used inside torch_sim; a user should instead set state.constraints = my_constraints and then call state.get_degrees_of_freedom. I'm preparing a PR to remove this function.
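The recommended flow above can be sketched like this. `SimState`, `FixedAtoms`, and `get_degrees_of_freedom` are hypothetical stand-ins echoing the names in the comment, not torch_sim's actual classes.

```python
import torch

class FixedAtoms:
    """Toy constraint: freezes `n_fixed` atoms, removing 3 DOF each."""
    def __init__(self, n_fixed: int) -> None:
        self.n_fixed = n_fixed

class SimState:
    def __init__(self, n_atoms_per_system: torch.Tensor) -> None:
        self.n_atoms_per_system = n_atoms_per_system
        self.constraints: list[FixedAtoms] = []

    def get_degrees_of_freedom(self) -> torch.Tensor:
        # Per-system DOF as a tensor: 3N minus DOF removed by each constraint.
        dof = 3 * self.n_atoms_per_system
        for constraint in self.constraints:
            dof = dof - 3 * constraint.n_fixed
        return dof

# Attach constraints to the state, then query DOF from the state itself:
state = SimState(torch.tensor([10, 20]))
state.constraints = [FixedAtoms(n_fixed=2)]
print(state.get_degrees_of_freedom())  # tensor([24, 54])
```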

@thomasloux (Collaborator) commented:
If it turns out that some people do use this function, the fix is to change the type annotation: it should return a torch.Tensor, not an int. That annotation was the error on my side, along with the non-negativity check, which needs to operate elementwise on the tensor.
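A hedged sketch of that fix, assuming a simple per-system helper (the name matches the comment, but the signature here is illustrative, not torch_sim's actual API):

```python
import torch

def count_degrees_of_freedom(
    n_atoms_per_system: torch.Tensor, n_constrained_dof: torch.Tensor
) -> torch.Tensor:
    # Annotation fix: return a per-system torch.Tensor, not a scalar int.
    dof = 3 * n_atoms_per_system - n_constrained_dof
    # Non-negativity check adapted for tensors: a plain `dof < 0` would yield
    # a boolean tensor, so reduce with .any() before raising.
    if (dof < 0).any():
        raise ValueError("negative degrees of freedom in at least one system")
    return dof
```

This keeps the per-system tensor behavior the review above asks to restore, while making the guard against negative DOF tensor-aware.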

@janosh mentioned this pull request Mar 1, 2026
@CompRhys (Member) commented Mar 1, 2026:

@CompRhys sorry about this oversight on my side while polishing the #339 LLM type fixes.

No harm :) as long as we end up correct by the time we cut a release, that's all that matters.

@CompRhys (Member) commented Mar 1, 2026:

Will merge this and let #479 resolve itself separately

@CompRhys (Member) left a review:

lgtm

@CompRhys CompRhys merged commit 070bcd7 into main Mar 1, 2026
68 checks passed
@CompRhys CompRhys deleted the restore-460-seed-removal branch March 1, 2026 12:25