
[proposal] [discussion] Refactor pruning/weight_norm using new Reparametrization functionality + actually deprecate old impl of SpectralNorm #7313

@vadimkantorov

Description


Currently, weight_norm and spectral_norm patch the passed module and implement special functions for adding these transformations to, and removing them from, a module.

Some ideas for refactoring to make it less tricky:

  • provide a stable signature for getting the weight, so the reparametrized weight can be used cleanly with functions such as torch.matmul and F.conv2d
  • if module patching (adding new buffers and parameters, and registering hooks) is needed and is a reasonable pattern, provide a stable, user-facing abstraction for it (especially for adding and removing parameters). Currently there is a chain of decorator-like hooks, each of which may carry extra buffers, and all of them are patched directly into the passed module object.
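To make the second point concrete, here is a minimal, dependency-free sketch of what such a user-facing abstraction could look like. This is not PyTorch's actual API; the class names (`WeightNormParam`, `register_parametrization`, `remove_parametrization`) are hypothetical, and plain Python lists stand in for tensors. The idea is that reparametrizations live in one explicit registry on the module, instead of being scattered across patched attributes and hooks, and removal "bakes in" the current value.

```python
import math


class WeightNormParam:
    """Hypothetical reparametrization: stores direction `v` and scale `g`,
    and recomputes the effective weight w = g * v / ||v|| on each access.
    A toy stand-in for weight_norm, using lists instead of tensors."""

    def __init__(self, v, g):
        self.v = list(v)
        self.g = g

    @property
    def value(self):
        norm = math.sqrt(sum(x * x for x in self.v))
        return [self.g * x / norm for x in self.v]


class Module:
    """Toy module keeping all reparametrizations in a single registry,
    rather than ad-hoc patching of the module object."""

    def __init__(self):
        self._parametrizations = {}

    def register_parametrization(self, name, p):
        self._parametrizations[name] = p

    def remove_parametrization(self, name):
        # "Bake in" the current computed value, then drop the
        # reparametrization, leaving a plain attribute behind.
        value = getattr(self, name)
        del self._parametrizations[name]
        setattr(self, name, value)

    def __getattr__(self, name):
        # Only called when normal attribute lookup fails, so plain
        # attributes (e.g. after removal) take precedence.
        ps = self.__dict__.get("_parametrizations", {})
        if name in ps:
            return ps[name].value
        raise AttributeError(name)


m = Module()
m.register_parametrization("weight", WeightNormParam([3.0, 4.0], 2.0))
w = m.weight          # recomputed: ||v|| = 5, so w = [1.2, 1.6]
m.remove_parametrization("weight")
w_plain = m.weight    # now a plain stored attribute with the same value
```

With this shape, `m.weight` has a stable access path whether or not a reparametrization is installed, which is exactly what would let downstream code call `torch.matmul(m.weight, x)` without knowing how the weight is produced.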

cc @jerryzh168 @jianyuh @dzhulgakov

Labels: module: nn (Related to torch.nn), triaged (looked at by a team member and prioritized into an appropriate module)
