[Relax][Training] Refactor Optimizer and Gradient #121
Merged
Conversation
- Ubospica changed the title from "[Relax][Training] Update optimizer" to "[Relax][Training] Refactorize Optimizer and Gradient" on Feb 7, 2023
- Ubospica changed the title from "[Relax][Training] Refactorize Optimizer and Gradient" to "[Relax][Training] Refactor Optimizer and Gradient" on Feb 7, 2023
- Ubospica force-pushed the `mlc-dev/optimizer-update` branch from `d4dee85` to `d3dc124` on February 7, 2023 18:45
- Ubospica force-pushed the `mlc-dev/optimizer-update` branch from `d3dc124` to `25307a2` on February 7, 2023 18:47
- MasterJH5574 reviewed on Feb 7, 2023
- Ubospica force-pushed the `mlc-dev/optimizer-update` branch from `0364091` to `93808a7` on February 7, 2023 19:00
- MasterJH5574 approved these changes on Feb 7, 2023
- MasterJH5574 pushed a commit that referenced this pull request on Feb 8, 2023
- MasterJH5574 pushed a commit that referenced this pull request on Feb 8, 2023
Update optimizer APIs.

- Remove `@property state` and `@state.setter`
- Add an `init()` interface
- Remove `Optimizer.__call__()`
- Remove leading underscores from attributes, and remove unnecessary attributes

Current interfaces:

```python
class Optimizer:
    dtype: str
    name: str
    param_list: List[Var]
    state: tvm.runtime.container.ADT

    def __init__(self, name: str) -> None:
        self.name = name
        self.param_list = None
        self.state = None
        self.dtype = None

    def init(self, params: Union[Var, List[Var]]) -> "Optimizer":
        """Set the parameters, determine the dtype, and build the
        initial state for the optimizer."""
        pass

    def get_function(self) -> Function:
        """Use blockbuilder to build an optimizer function that executes
        updates of the parameters and the optimizer state."""
        pass
```

Usage examples: see <https://github.com/ACMClass-TVM-20/AD-Example/blob/dc255150dc6a4a6de2fffc2c093a8b2bacc1b030/optimizer_api_example.py>

This PR also updates the Gradient API:

- Before: `def Gradient(global_var: GlobalVar, require_grads: Optional[Union[Var, List[Var]]]) -> tvm.ir.transform.Pass`
- After: `def Gradient(func_name: str, require_grads: Optional[Union[Var, List[Var]]]) -> tvm.ir.transform.Pass`

Unit tests are updated accordingly.
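To illustrate the construct-then-`init()`-then-`get_function()` flow the refactor introduces, here is a minimal, framework-free sketch. It is a hypothetical stand-in, not the PR's implementation: the TVM-specific types (`Var`, `tvm.runtime.container.ADT`, relax `Function`) are replaced with plain Python values, the dtype inference and blockbuilder logic are stubbed out, and the SGD-style update inside `get_function` is purely illustrative.

```python
# Hypothetical sketch of the refactored Optimizer interface described above.
# Method names (`init`, `get_function`) and attributes (`name`, `param_list`,
# `state`, `dtype`) mirror the PR; everything TVM-specific is stubbed out.
from typing import Dict, List, Tuple, Union


class Optimizer:
    def __init__(self, name: str) -> None:
        self.name = name
        self.param_list = None
        self.state = None
        self.dtype = None

    def init(self, params: Union[str, List[str]]) -> "Optimizer":
        """Set the parameters, determine the dtype, and build the initial
        optimizer state. Returns self so calls can be chained."""
        self.param_list = [params] if isinstance(params, str) else list(params)
        self.dtype = "float32"          # stand-in for real dtype inference
        self.state = {"num_steps": 0}   # stand-in for the ADT state tuple
        return self

    def get_function(self):
        """Return a callable standing in for the built update function:
        it takes (params, grads, state) and returns (new_params, new_state)."""
        def update(
            params: List[float], grads: List[float], state: Dict[str, int]
        ) -> Tuple[List[float], Dict[str, int]]:
            lr = 0.01  # illustrative fixed learning rate
            new_params = [p - lr * g for p, g in zip(params, grads)]
            new_state = {"num_steps": state["num_steps"] + 1}
            return new_params, new_state
        return update


# Usage mirroring the new API shape: construct, init(), then get_function().
opt = Optimizer("sgd").init(["w", "b"])
f = opt.get_function()
params, state = f([1.0, 2.0], [0.5, 0.5], opt.state)
```

Note how `init()` replaces both the removed `@state.setter` (state is built internally, not assigned from outside) and the removed `Optimizer.__call__()` (the update is obtained explicitly via `get_function()`).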
- spectrometerHBH pushed a commit to spectrometerHBH/relax that referenced this pull request on Feb 9, 2023
- MasterJH5574 pushed a commit that referenced this pull request on Feb 12, 2023