Fix Python 3.6 f-string compatibility and condense documentation for FNO classes #209
Conversation
All parameters besides n_modes.* args are repeated verbatim in documentation. This has already caused drift in maintaining the docstrings for the class. Remove repeated parameters and let the subclasses refer back up to the superclass' documentation.
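A minimal sketch of the deduplication pattern described above, using hypothetical class and parameter names rather than the actual neuraloperator signatures: the subclass documents only its own `n_modes_*` arguments and points readers back to the base class for everything shared.

```python
class FNO:
    """N-dimensional Fourier Neural Operator (illustrative stub).

    Parameters
    ----------
    n_modes : tuple of int
        Number of Fourier modes to keep along each dimension.
    hidden_channels : int
        Width of the operator (number of channels).
    """


class FNO2d(FNO):
    """2D Fourier Neural Operator (illustrative stub).

    Parameters
    ----------
    n_modes_height : int
        Number of Fourier modes kept along the height.
    n_modes_width : int
        Number of Fourier modes kept along the width.

    For all other parameters, see the documentation of :class:`FNO`.
    """

    def __init__(self, n_modes_height, n_modes_width, hidden_channels):
        # Collapse the per-dimension args into the base class's tuple form
        self.n_modes = (n_modes_height, n_modes_width)
        self.hidden_channels = hidden_channels
```

With Sphinx, the `:class:` cross-reference renders as a link, so the shared parameters are documented exactly once.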
I don't think we need to abandon new features like f-strings that make the code easier to read just to support Python 3.6: it was released in 2016 and has reached end of life.

Other changes are great!
```diff
-if isinstance(padding_fraction, float) or isinstance(padding_fraction, int):
+if isinstance(padding_fraction, (float, int)):
```
Thanks, that's much better
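For context on the change above: `isinstance` accepts a tuple of types and returns `True` if the object is an instance of any of them, so the two `or`-joined checks collapse into one call. A small self-contained illustration:

```python
padding_fraction = 0.25

# Before: two separate checks joined with `or`
if isinstance(padding_fraction, float) or isinstance(padding_fraction, int):
    kind = "numeric"

# After: a single check against a tuple of accepted types
if isinstance(padding_fraction, (float, int)):
    kind = "numeric"
```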
```diff
-B = size[0]//(self.n_patches[0]*self.n_patches[1])
-W = size[3]*self.n_patches[1]
+B = size[0] // (self.n_patches[0] * self.n_patches[1])
+W = size[3] * self.n_patches[1]
```
I really don't like the way `black` formats these operations: it doesn't actually follow PEP 8 here, which suggests omitting spaces around higher-priority operators.
```python
self.padding_height = padding[0]
self.padding_width = padding[1]

patched = make_patches(x, n=2**self.levels, p=padding)

s1_patched = patched.size(-2) - 2*padding[0]
s2_patched = patched.size(-1) - 2*padding[1]
```
Same here. Is there a way to force `black` to follow PEP 8 for those lines, or to make it ignore them?
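To the question above: `black` doesn't expose a knob for operator spacing, but it does honor `# fmt: off` / `# fmt: on` comments to leave a region untouched (recent versions also support `# fmt: skip` on a single statement). A sketch, with the numeric values invented for illustration:

```python
levels = 3
padding = (3, 3)

# fmt: off
# black will not reformat this region, so the tighter spacing survives
n = 2**levels
s1 = 64 - 2*padding[0]
# fmt: on
```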
```diff
 if s1_pad > x_sub.size(-2):
-    x_sub = torch.nn.functional.pad(x_sub, pad=[0, 0, x_sub.size(-2), x_sub.size(-2)], mode='circular')
+    diff = s1_pad - x_sub.size(-2)
+    x_sub = torch.nn.functional.pad(x_sub, pad=[0, 0, diff, diff], mode='circular')
```
That's another case where I find the formatted version actually marginally less readable
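For readers unfamiliar with the padding mode in that hunk: `mode='circular'` wraps values from the opposite edge of the tensor rather than filling with zeros. A pure-Python 1D sketch of the idea (not the `torch.nn.functional.pad` implementation itself):

```python
def circular_pad_1d(seq, pad):
    """Pad a 1D sequence by wrapping `pad` elements from the opposite end."""
    return seq[-pad:] + seq + seq[:pad]

# Wrapping 2 elements on each side of [1, 2, 3, 4]
circular_pad_1d([1, 2, 3, 4], 2)  # -> [3, 4, 1, 2, 3, 4, 1, 2]
```

The bug fix in the diff matters because the pad amount should be the missing difference (`s1_pad - x_sub.size(-2)`), not the input size itself.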
Done. I've dropped that commit.

Great, thanks @m4e7, merging!
There are several instances of `f"{x=}"` in the repo, which cause issues when trying to import the library into Python 3.6. Drop the `=` in these f-strings to fix this problem.

Additionally, the repeated docstrings between base class `FNO` and subclasses `FNOnd` make maintaining documentation more work than it needs to be. The parameters are used identically in the base and subclasses, and in some cases the subclass documentation is already out of date. Delete the repeated parameter comments in favor of referring users back up to the full documentation of `FNO` (tested in a local `make` of `sphinx-gallery`).

Lastly, run `black` on touched files.

Reviewer(s): @zongyi-li @JeanKossaifi
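For reference, the `=` specifier in f-strings is a Python 3.8 feature; on 3.6 and 3.7 it is a `SyntaxError` at import time. The compatible spelling simply writes the variable name out:

```python
x = 42

# Python 3.8+ only: f"{x=}" expands to "x=42"
# Works on Python 3.6+: spell out the name explicitly
message = f"x={x}"  # "x=42"
```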