Addition of FAIR's MADGRAD Optimizer #2452

@DarshanDeshpande

Description

Describe the feature and the current behavior/state.
This issue requests an implementation of FAIR's MADGRAD optimizer for TensorFlow Addons. TensorFlow currently has no implementation of this optimizer, so it would be a new addition to `tfa.optimizers`.

Relevant information

  • Are you willing to contribute it (yes/no): yes
    If you wish to contribute, then read the requirements for new contributions in CONTRIBUTING.md
  • Are you willing to maintain it going forward? (yes/no): yes
  • Is there a relevant academic paper? (if so, where): yes. Aaron Defazio and Samy Jelassi, "Adaptivity without Compromise: A Momentumized, Adaptive, Dual Averaged Gradient Method for Stochastic Optimization", 2021 (arXiv:2101.11075)
  • Does the relevant academic paper exceed 50 citations? (yes/no): no
  • Is there already an implementation in another framework? (if so, where): PyTorch, JAX
  • Was it part of tf.contrib? (if so, where): no

Which API type would this fall under (layer, metric, optimizer, etc.)
Optimizer

Who will benefit from this feature?
The MADGRAD algorithm, introduced by Facebook AI Research (FAIR), consistently matches or outperforms Adam and SGD across a range of tasks. Official implementations exist for PyTorch and JAX; adding this optimizer to TF-Addons would make it available to TensorFlow users as well.
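For readers unfamiliar with the update rule, here is a minimal NumPy sketch of one MADGRAD step following Algorithm 1 of Defazio and Jelassi (2021). This is an illustration only, not the proposed TF-Addons implementation; the function and variable names are hypothetical, and weight decay is omitted for brevity:

```python
import numpy as np

def madgrad_step(x, x0, s, v, grad, k, lr=1e-2, momentum=0.9, eps=1e-6):
    """One MADGRAD update (sketch of Algorithm 1, Defazio & Jelassi, 2021).

    x : current parameters       x0 : initial parameters
    s : running sum of lr-weighted gradients (dual average)
    v : running sum of lr-weighted squared gradients
    k : 0-based step counter
    """
    lamb = lr * np.sqrt(k + 1)        # step-size sequence: lambda_k = lr * sqrt(k + 1)
    s = s + lamb * grad               # accumulate weighted gradients
    v = v + lamb * grad * grad        # accumulate weighted squared gradients
    z = x0 - s / (np.cbrt(v) + eps)   # dual-averaged point with cube-root scaling
    c = 1.0 - momentum                # averaging coefficient derived from momentum
    x = (1.0 - c) * x + c * z         # momentum as iterate averaging, not a gradient buffer
    return x, s, v
```

Two details worth noting for reviewers: the denominator uses a cube root rather than Adam's square root, and momentum is applied by averaging the current iterate with the dual-averaged point rather than by smoothing the gradient.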
