
Remove default std normalization value in base class #2630

Open
wants to merge 1 commit into main
Conversation

nilsleh
Collaborator

@nilsleh nilsleh commented Mar 5, 2025

While not technically a bug, I think this is quite inconvenient when building custom datamodules that inherit from torchgeo base classes. For my use case, I had trouble understanding why my data was suddenly squashed to a tiny range when I had properly normalized it beforehand.

I think the default value here should be 1, i.e., nothing happens, and users have to explicitly override it otherwise. At the moment the base class is set up for RGB data, which is oddly specific, and I think it would be better to keep it more general, especially because it might not be clear to all users that the normalization "secretly" happens in the on_after_batch_transfer hook.
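To make the surprise concrete, here is a minimal, self-contained sketch of the behavior described above. The class and attribute names are illustrative only, not torchgeo's actual implementation: data that was already normalized to [0, 1] upstream gets silently divided by 255 again in the hook.

```python
import torch

class BaseDataModule:
    # Hypothetical defaults mirroring the behavior described in this PR:
    # sensible for raw uint8 RGB imagery, wrong for pre-normalized data.
    mean = torch.tensor(0.0)
    std = torch.tensor(255.0)

    def on_after_batch_transfer(self, batch, dataloader_idx):
        # Normalization happens here, invisibly to subclass authors.
        batch["image"] = (batch["image"] - self.mean) / self.std
        return batch

class MyDataModule(BaseDataModule):
    pass  # user forgets (or doesn't know) to override std

dm = MyDataModule()
batch = {"image": torch.ones(1, 3, 4, 4)}  # already scaled to [0, 1]
out = dm.on_after_batch_transfer(batch, 0)
# values end up near 1/255 instead of staying at 1.0
```

With the proposed default of std = 1, the hook would be a no-op unless a subclass opts in.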

Closes #1841

@nilsleh nilsleh added this to the 0.6.3 milestone Mar 5, 2025
@github-actions github-actions bot added the datamodules PyTorch Lightning datamodules label Mar 5, 2025
@adamjstewart adamjstewart added the backwards-incompatible Changes that are not backwards compatible label Mar 5, 2025
@adamjstewart adamjstewart modified the milestones: 0.6.3, 0.7.0 Mar 5, 2025
Collaborator

@adamjstewart adamjstewart left a comment


I agree with this change (principle of least surprise), but we should set std for all the datamodules that don't currently specify it. I'm fine with just using 255 in all of them to maintain backwards compatibility, but it would be good to at least check data.py and see what dtype the files use.
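The suggestion above can be sketched as follows. This is an illustrative sketch, not torchgeo code: with the base default changed to std = 1 (identity), each datamodule whose underlying files are uint8 explicitly opts back in to the old /255 scaling, while pre-normalized data passes through untouched.

```python
import torch

class BaseDataModule:
    mean = torch.tensor(0.0)
    std = torch.tensor(1.0)  # identity: nothing happens unless overridden

    def on_after_batch_transfer(self, batch, dataloader_idx):
        batch["image"] = (batch["image"] - self.mean) / self.std
        return batch

class Uint8ImageryDataModule(BaseDataModule):
    # Files store uint8, so keep the old /255 behavior explicitly.
    std = torch.tensor(255.0)

class PreNormalizedDataModule(BaseDataModule):
    pass  # inherits std=1; pre-normalized data is left unchanged

# uint8-backed data is scaled to [0, 1] as before
u8 = Uint8ImageryDataModule()
raw = u8.on_after_batch_transfer({"image": torch.full((1, 3, 2, 2), 255.0)}, 0)

# pre-normalized data passes through the hook unchanged
pre = PreNormalizedDataModule()
norm = pre.on_after_batch_transfer({"image": torch.ones(2, 3, 8, 8)}, 0)
```

Which datamodules need the explicit `std = 255.0` would follow from checking the dtype each dataset's files actually use, as suggested above.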

Labels
backwards-incompatible Changes that are not backwards compatible datamodules PyTorch Lightning datamodules
Successfully merging this pull request may close these issues.

Datamodule augmentation defaults