
[Unity][Op] Gradient functions for high-level Relax operators#14527

Merged
tqchen merged 5 commits into apache:unity from SiriusNEO:unity-dev/2023-04-07-gradient-functions
Apr 8, 2023

Conversation

@SiriusNEO
Copy link
Contributor

Intro

This PR registers gradient functions for many high-level Relax operators. Similar to Relay, each gradient function is registered as an attribute FPrimalGradient (OpAttr) of the corresponding Relax operator, but the function signature differs from Relay's:

using FPrimalGradient = runtime::TypedPackedFunc<tvm::Array<Expr>(
    const Var& orig_var, const Call& orig_call, const Var& output_grad, const BlockBuilder& ctx)>;
  • orig_call is the original call expression to be differentiated.

  • output_grad is the incoming gradient with respect to the result of orig_call (the right-hand side of the binding).

  • orig_var is the variable y bound to the result of orig_call. It is passed in so that gradient functions can reuse the forward result and avoid recomputation.

  • ctx is the block builder context. It is currently unused, but we expect it to be useful for dynamic-shape cases and when a gradient function needs to emit bindings or perform normalization.
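To make the contract concrete, here is an illustrative pure-Python sketch (not TVM code) of what a gradient function conceptually computes. The function names and tuple representation are hypothetical stand-ins: plain floats play the role of Relax expressions, and the returned tuple mirrors the Array<Expr> that an FPrimalGradient implementation produces, one adjoint per input of orig_call.

```python
# Hypothetical sketch: plain Python floats stand in for Relax exprs.
# A real FPrimalGradient returns an Array<Expr> of gradient expressions,
# one per input of orig_call; here we mirror that shape with a tuple.

def multiply_grad(orig_inputs, output_grad):
    """Gradient rule for z = x * y.

    Given the incoming gradient dL/dz (output_grad), return the adjoint
    for each input in input order: dL/dx = output_grad * y and
    dL/dy = output_grad * x.
    """
    x, y = orig_inputs
    return (output_grad * y, output_grad * x)

# Example: z = x * y at x=3, y=4 with upstream gradient 1.0
grad_x, grad_y = multiply_grad((3.0, 4.0), 1.0)
```

The same shape generalizes to any operator: the gradient function receives the original call and the output adjoint, and emits one gradient expression per input.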

For some complicated gradient functions, we introduce high-level backward operators and put them under the namespace op.grad.xxx. All gradient functions are tested numerically. For more details, please check Part 2 of this document.
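The numerical testing mentioned above can be sketched as a central-difference check. This is a generic, self-contained illustration of the technique, not the PR's actual test harness; the function names are hypothetical.

```python
def numerical_grad(f, x, eps=1e-6):
    # Central-difference approximation of df/dx at a scalar point x.
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)

def check_gradient(f, grad_f, x, tol=1e-4):
    # Compare an analytic gradient rule against the numerical estimate.
    return abs(grad_f(x) - numerical_grad(f, x)) < tol

# Example: verify the gradient rule for f(x) = x**2 (df/dx = 2x) at x = 3.
ok = check_gradient(lambda x: x * x, lambda x: 2.0 * x, 3.0)
```

The real tests compare registered Relax gradients against such finite-difference estimates over tensor inputs, but the principle is the same: an analytic gradient is accepted when it agrees with the numerical one within tolerance.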

Others

This PR also fixes two small operator issues:

  • CumsumAttrs is not declared on the Python side.
  • A small issue in the legalization of the variance operator.

Co-authored-by: Yixin Dong ubospica@gmail.com

@tvm-bot
Copy link
Collaborator

tvm-bot commented Apr 7, 2023

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

Generated by tvm-bot

@tqchen
Copy link
Member

tqchen commented Apr 7, 2023

@SiriusNEO one minor note, make sure you append the co-author commit message to the end

SiriusNEO and others added 3 commits April 8, 2023 07:34
Co-authored-by: Yixin Dong <ubospica@gmail.com>
@SiriusNEO force-pushed the unity-dev/2023-04-07-gradient-functions branch from 2bfe776 to 7ecef07 on April 7, 2023, 23:36
@tqchen tqchen merged commit d518238 into apache:unity Apr 8, 2023

4 participants