
Conversation

@patrickvonplaten (Contributor) commented Aug 24, 2022

This PR finishes the easy refactoring (just deleting old code).

All slow tests pass + stable diffusion pipeline was tested for subjective image quality and all looks good 👍

Now the big single-letter variable renaming can begin.

@patrickvonplaten changed the title from "[Resnet] Clean Resnet from all unused functionality" to "Clean unnecessary files" on Aug 24, 2022
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

@@ -390,7 +390,7 @@ def from_pretrained(cls, pretrained_model_name_or_path: Optional[Union[str, os.P
                 )
             except EntryNotFoundError:
                 raise EnvironmentError(
-                    f"{pretrained_model_name_or_path} does not appear to have a file named {model_file}."
+                    f"{pretrained_model_name_or_path} does not appear to have a file named {WEIGHTS_NAME}."
@patrickvonplaten (Contributor, Author)

`model_file` is actually not yet defined when this error is hit.
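
To make the failure mode concrete: a minimal sketch, assuming the hub lookup goes through `hf_hub_download`; the function and constant names here are illustrative, not the exact diffusers code.

```python
from huggingface_hub import hf_hub_download
from huggingface_hub.utils import EntryNotFoundError

WEIGHTS_NAME = "diffusion_pytorch_model.bin"  # assumed value, for illustration only


def load_weights_file(pretrained_model_name_or_path: str) -> str:
    try:
        # `model_file` is only bound if the download succeeds ...
        model_file = hf_hub_download(pretrained_model_name_or_path, filename=WEIGHTS_NAME)
    except EntryNotFoundError:
        # ... so referencing `model_file` in this branch would raise an
        # UnboundLocalError and mask the intended EnvironmentError.
        # Formatting the message with the known constant is always safe.
        raise EnvironmentError(
            f"{pretrained_model_name_or_path} does not appear to have a file named {WEIGHTS_NAME}."
        )
    return model_file
```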


 import torch
 import torch.nn.functional as F
 from torch import nn


-class AttentionBlockNew(nn.Module):
+class AttentionBlock(nn.Module):
@patrickvonplaten (Contributor, Author)

Let's call it AttentionBlock and not AttentionBlockNew.

@patrickvonplaten (Contributor, Author)

Solves #199

@@ -234,7 +178,7 @@ def forward(self, x, context=None, mask=None):
         h = self.heads

         q = self.to_q(x)
-        context = default(context, x)
@patrickvonplaten (Contributor, Author)

death to one-line functions
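
For context, `default` is presumably the usual one-line fallback helper found in attention implementations; its definition isn't shown in this diff, so the shape below is an assumption. A minimal before/after sketch:

```python
# Assumed shape of the one-line helper being removed (not shown in the diff):
def default(val, d):
    return val if val is not None else d

# Before: the fallback is hidden behind the helper.
# context = default(context, x)

# After (as in this PR): the intent is spelled out inline.
# context = context if context is not None else x
```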

@@ -280,155 +224,3 @@ def __init__(self, dim_in, dim_out):
-    def forward(self, x):
-        x, gate = self.proj(x).chunk(2, dim=-1)
-        return x * F.gelu(gate)

@patrickvonplaten (Contributor, Author)

None of this code is used anymore.
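
The fragment above is the tail of the gated-GELU feed-forward block being deleted; roughly, the full removed module would look like the sketch below. The class name GEGLU and the `__init__` body are assumptions reconstructed from the visible diff lines.

```python
import torch.nn.functional as F
from torch import nn


# Sketch of the removed gated-GELU block; class name and __init__ body assumed.
class GEGLU(nn.Module):
    def __init__(self, dim_in, dim_out):
        super().__init__()
        # a single projection produces both the value and the gate
        self.proj = nn.Linear(dim_in, dim_out * 2)

    def forward(self, x):
        x, gate = self.proj(x).chunk(2, dim=-1)
        return x * F.gelu(gate)
```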

-from .attention import AttentionBlockNew, SpatialTransformer
-from .resnet import Downsample2D, FirDownsample2D, FirUpsample2D, ResnetBlock, Upsample2D
+from .attention import AttentionBlock, SpatialTransformer
+from .resnet import Downsample2D, FirDownsample2D, FirUpsample2D, ResnetBlock2D, Upsample2D
@patrickvonplaten (Contributor, Author)

The 2D suffix is more in line with all the other imports here.

@patrickvonplaten changed the title from "Clean unnecessary files" to "[Clean up] Clean unused code" on Aug 24, 2022
@patil-suraj (Contributor) left a comment

Looks good! Thanks a lot for cleaning this.

@@ -234,7 +178,7 @@ def forward(self, x, context=None, mask=None):
         h = self.heads

         q = self.to_q(x)
-        context = default(context, x)
+        context = context if context is not None else x
@patil-suraj (Contributor)

Much better :D

@patil-suraj merged commit c1efda7 into main on Aug 25, 2022
@patil-suraj deleted the clean_resnet_file branch on August 25, 2022 at 09:56
natolambert pushed a commit that referenced this pull request Sep 7, 2022
* CleanResNet

* refactor more

* correct
yoonseokjin pushed a commit to yoonseokjin/diffusers that referenced this pull request Dec 25, 2023
* CleanResNet

* refactor more

* correct