Numpy Import Convention #3307

Merged 4 commits into pyro-ppl:dev on Dec 21, 2023

Conversation

@RatishT (Contributor) commented Dec 17, 2023

Dear Pyro contributors,

I've taken the initiative to address several linting warnings flagged by dslinter, a pylint plugin. The changes it suggested were:

  1. torch.log() Variable Clamping:
    • Wrapped the argument of each torch.log() call in torch.clamp() to enforce a minimum value. This guards against NaN results when the input variable is extremely small.
    • Before:
      result = torch.log(input_var)
    • After:
      result = torch.log(torch.clamp(input_var, min=1e-10))
  2. Numpy Import Convention:
    • Changed a bare numpy import to "import numpy as np", matching the standard convention and keeping the codebase consistent (an illustrative before/after follows this list).
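
For illustration only, a hypothetical before/after of the import-convention change (the variable names below are placeholders, not the actual diff):
    • Before:
      import numpy
      values = numpy.arange(10)
    • After:
      import numpy as np
      values = np.arange(10)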

I'd welcome your feedback on these modifications to make sure they align with the project's coding standards and requirements.

Added torch.clamp inside torch.log to avoid very small inputs, which would result in NaN
Added "as np" to the numpy import, matching the standard convention and keeping the code consistent
@martinjankowiak (Collaborator) commented:

thanks but i don't think the log change is necessary. in any case if we were to do something like that we would use torch.finfo and not some ad hoc number like 1e-10.

the np change looks fine to me
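
As a minimal sketch of the torch.finfo approach being suggested here (the tensor and variable names are placeholders, not code from the PR):

    import torch

    x = torch.rand(3)
    # torch.finfo reports floating-point limits for a dtype; .tiny is the
    # smallest positive normal value (roughly 1.18e-38 for float32)
    eps = torch.finfo(x.dtype).tiny
    result = torch.log(x.clamp(min=eps))  # dtype-aware floor instead of an ad hoc 1e-10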

@RatishT (Contributor, Author) commented Dec 19, 2023

> thanks but i don't think the log change is necessary. in any case if we were to do something like that we would use torch.finfo and not some ad hoc number like 1e-10.
>
> the np change looks fine to me

Thank you for your reply! I have reverted the torch.clamp in the torch.log calls.
For future reference, would it be better to use variable.clamp(min=torch.finfo(variable.dtype).tiny)?
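
For reference, a quick check of what torch.finfo(...).tiny evaluates to for common dtypes (approximate values in the comments):

    import torch

    # smallest positive normal value for each floating-point dtype
    print(torch.finfo(torch.float32).tiny)  # ~1.18e-38
    print(torch.finfo(torch.float64).tiny)  # ~2.23e-308
    print(torch.finfo(torch.float16).tiny)  # ~6.10e-05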

@fritzo changed the title from "Fix torch.log() NaN Issue and Numpy Import Convention" to "Numpy Import Convention" on Dec 19, 2023
@ordabayevy merged commit 579163a into pyro-ppl:dev on Dec 21, 2023
9 checks passed