
Fix Python 3.6 f-string compatibility and condense documentation for FNO classes #209

Merged (6 commits) Sep 5, 2023

Conversation

@m4e7 (Contributor) commented Aug 30, 2023

There are several instances of f"{x=}" in the repo; this = debugging specifier was only added in Python 3.8, so these strings raise a syntax error when importing the library into Python 3.6. Drop the = in these f-strings to fix the problem.
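For illustration, a minimal sketch of the two spellings (the variable name is hypothetical):

```python
x = 42

# Python 3.8+ only: a trailing "=" echoes the expression before its value.
debug_new = f"{x=}"

# Portable spelling that also parses on Python 3.6/3.7:
debug_old = f"x={x}"
```

Both produce the string "x=42"; only the second parses on older interpreters.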

Additionally, the docstrings repeated between the base class FNO and the FNOnd subclasses make maintaining documentation more work than it needs to be. The parameters are used identically in the base and subclasses, and in some cases the subclass documentation is already out of date. Delete the repeated parameter descriptions in favor of referring users back up to the full documentation of FNO (tested in a local build of sphinx-gallery).
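A minimal sketch of the resulting pattern (class bodies and parameter names are abbreviated stand-ins, not the library's actual signatures):

```python
class FNO:
    """N-dimensional Fourier Neural Operator (base class).

    Parameters
    ----------
    n_modes : tuple of int
        Number of Fourier modes to keep along each spatial dimension.
    hidden_channels : int
        Width of the lifted representation.
    """

class FNO2d(FNO):
    """2D Fourier Neural Operator.

    Parameters are identical to :class:`FNO`; see its documentation
    for the full list.
    """
```

Sphinx resolves the :class:`FNO` cross-reference, so readers land on the single maintained parameter list instead of a stale copy.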

Lastly, run black on touched files.

Reviewer(s): @zongyi-li @JeanKossaifi

All parameters besides n_modes.* args are repeated verbatim in documentation.
This has already caused drift in maintaining the docstrings for the class.

Remove repeated parameters and let the subclasses refer back up to the
superclass' documentation.
@JeanKossaifi (Member)

I don't think we need to abandon new features like f-strings that make the code easier to read just to support Python 3.6: it was released in 2016 and has reached end of life.

@JeanKossaifi (Member)

Other changes are great!


-if isinstance(padding_fraction, float) or isinstance(padding_fraction, int):
+if isinstance(padding_fraction, (float, int)):
Member

Thanks, that's much better
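The two forms are equivalent because isinstance accepts a tuple of types. A quick sketch (function names are illustrative, not from the repo):

```python
# isinstance accepts a tuple of types, so the two-branch check can be
# collapsed into a single call; behavior is identical for every input.
def is_number_verbose(value):
    return isinstance(value, float) or isinstance(value, int)

def is_number_compact(value):
    return isinstance(value, (float, int))

for sample in (0.25, 7, "7", None, [1.0]):
    assert is_number_verbose(sample) == is_number_compact(sample)
```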

-B = size[0]//(self.n_patches[0]*self.n_patches[1])
-W = size[3]*self.n_patches[1]
+B = size[0] // (self.n_patches[0] * self.n_patches[1])
+W = size[3] * self.n_patches[1]
Member

I really don't like the way black formats operations: it actually doesn't follow PEP 8, which suggests adding whitespace only around the operators with the lowest priority.

self.padding_height = padding[0]
self.padding_width = padding[1]

patched = make_patches(x, n=2**self.levels, p=padding)

s1_patched = patched.size(-2) - 2*padding[0]
s2_patched = patched.size(-1) - 2*padding[1]
Member

Same here: is there a way to force black to follow PEP 8 for those, or to ignore them?
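One option is to fence the affected lines with black's formatter directives (`# fmt: off` / `# fmt: on`; black also supports a per-line `# fmt: skip`). A sketch, where `size` and `n_patches` are stand-ins for the tensor shape and patch grid in the snippet above:

```python
# "# fmt: off" / "# fmt: on" tell black to leave the enclosed lines
# alone, so spacing that mirrors operator precedence (as PEP 8
# suggests) survives reformatting. The shapes here are hypothetical.
size = (8, 1, 16, 16)   # (batch * patches, channels, height, width)
n_patches = (2, 2)

# fmt: off
B = size[0]//(n_patches[0]*n_patches[1])
W = size[3]*n_patches[1]
# fmt: on
```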

 if s1_pad > x_sub.size(-2):
     diff = s1_pad - x_sub.size(-2)
-    x_sub = torch.nn.functional.pad(x_sub, pad=[0, 0, x_sub.size(-2), x_sub.size(-2)], mode='circular')
+    x_sub = torch.nn.functional.pad(x_sub, pad=[0, 0, diff, diff], mode='circular')
Member

That's another case where I find the formatted version actually marginally less readable
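The substance of the change is that the code now pads by the shortfall rather than by the full current size. A pure-Python stand-in for the circular padding (no PyTorch; names are hypothetical) makes the corrected arithmetic concrete:

```python
def circular_pad_1d(seq, amount):
    # Wrap `amount` elements from each end around the sequence
    # (a list analogue of mode='circular' padding on one axis).
    return seq[-amount:] + seq + seq[:amount]

x_sub = [1, 2, 3, 4]   # current height: 4
s1_pad = 6             # target padded height

if s1_pad > len(x_sub):
    diff = s1_pad - len(x_sub)            # pad by the shortfall (the fix),
    x_sub = circular_pad_1d(x_sub, diff)  # not by the full current size
```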

@m4e7 (Contributor, Author) commented Sep 1, 2023

> I don't think we need to abandon new features like f-strings that make the code easier to read just to support Python 3.6: it was released in 2016 and has reached end of life.

Done. I've dropped that commit.

@JeanKossaifi (Member)

Great, thanks @m4e7, merging!

@JeanKossaifi JeanKossaifi merged commit 800087c into neuraloperator:main Sep 5, 2023
1 check passed