
Normalizer class added #148

Merged 1 commit into master on Mar 5, 2020

Conversation

@ndiamant (Contributor) commented Mar 2, 2020

No description provided.

@paolodi (Contributor) left a comment

Nate, this looks great and is good to merge from my side! I've added just a couple of comments on whether we should allow a normalize without an un_normalize, or vice versa. Love the extension! Please feel free to address the comments as you prefer, and you can definitely merge without going through me again.

def un_normalize(self, tensor: np.ndarray) -> np.ndarray:
"""The inverse of normalize if possible. Otherwise identity."""
return tensor

@paolodi (Contributor) commented Mar 5, 2020

I love this! My only comment here is that this may encourage us to be a bit lazy and provide Normalizers with just one of the two methods. Are there cases where one might actually want just normalize without un_normalize, or the opposite?

@ndiamant (Contributor, Author) replied

I would have made un_normalize an abstract method as well, but zero_mean_std1 is not invertible, so I had to allow that use case. un_normalize is used almost exclusively for plotting anyway.
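For context, here is a minimal sketch of how the pieces quoted in this thread could fit together: an abstract normalize with a default identity un_normalize, plus the non-invertible zero-mean/unit-std normalizer. The EPS value and the ZeroMeanStd1 class name are assumptions, not the PR's actual code.

```python
# Hypothetical sketch reconstructed from the snippets in this review thread;
# EPS and the ZeroMeanStd1 name are assumptions.
from abc import ABC, abstractmethod

import numpy as np

EPS = 1e-7  # assumed small constant guarding against division by zero


class Normalizer(ABC):
    @abstractmethod
    def normalize(self, tensor: np.ndarray) -> np.ndarray:
        """Shift and scale a tensor before it is fed to a model."""

    def un_normalize(self, tensor: np.ndarray) -> np.ndarray:
        """The inverse of normalize if possible. Otherwise identity."""
        return tensor


class ZeroMeanStd1(Normalizer):
    """Per-tensor standardization; not invertible, so un_normalize stays identity."""

    def normalize(self, tensor: np.ndarray) -> np.ndarray:
        tensor -= np.mean(tensor)
        tensor /= np.std(tensor) + EPS
        return tensor
```

Because the per-tensor mean and std are discarded after normalize runs, the inherited identity un_normalize is the only sensible default here.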

def normalize(self, tensor: np.ndarray) -> np.ndarray:
tensor -= np.mean(tensor)
tensor /= np.std(tensor) + EPS
return tensor

@paolodi (Contributor) commented Mar 5, 2020

I wonder whether we could store the mean and std in self here (like is done above for Standardize) and hence allow an un_normalize.

@ndiamant (Contributor, Author) replied

I don't think a Normalizer could/should keep track of the mean and std for every tensor it's called on. Then it would have to store a dictionary and get called with a sample id.
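The Standardize class mentioned above can afford an exact un_normalize because its statistics are fixed up front rather than computed per tensor. A hedged sketch of that pattern, with the constructor signature and implementation assumed (only the class name comes from the thread):

```python
# Hypothetical sketch of a Standardize normalizer with fixed statistics;
# the constructor and method bodies are assumptions, not the PR's code.
import numpy as np


class Standardize:
    """Normalizes with a mean/std known up front, so it is exactly invertible."""

    def __init__(self, mean: float, std: float):
        self.mean = mean
        self.std = std

    def normalize(self, tensor: np.ndarray) -> np.ndarray:
        return (tensor - self.mean) / self.std

    def un_normalize(self, tensor: np.ndarray) -> np.ndarray:
        return tensor * self.std + self.mean
```

Since the statistics live in self rather than being recomputed per call, un_normalize(normalize(x)) round-trips exactly, which is what zero_mean_std1 cannot offer.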

@paolodi (Contributor) replied

Agreed, that would be overkill, especially since we don't un_normalize that often. If we really need the un-normalized tensor and no method is available, we can always let the code re-evaluate tensor_from_file without normalization.

@ndiamant ndiamant merged commit 975a3a7 into master Mar 5, 2020
@ndiamant ndiamant deleted the nd_normalizers branch March 5, 2020 15:29
lucidtronix pushed a commit that referenced this pull request Jan 13, 2023
Labels
enhancement New feature or request
2 participants