In this fork, @Ubospica and I collaborated on developing a training workflow on Relax. We finished M0, and the efforts included:

- An automatic differentiation pass over Relax high-level operators.
- Some Relax operators needed for training.
- Gradients registered for some Relax operators (a minimal sketch of this pattern follows this list).
- A framework that supports training with different optimizers.
- A trainer wrapper.
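As a rough illustration of how gradient registration and optimizer swapping compose, here is a minimal NumPy sketch. The `GRADIENTS` registry, `register_gradient`, and `sgd_step` are hypothetical stand-ins for exposition, not the fork's actual API:

```python
import numpy as np

# Hypothetical gradient registry: maps an operator name to a function that
# takes (grad_of_output, *inputs) and returns the gradients of the inputs.
GRADIENTS = {}

def register_gradient(op_name):
    def wrap(fn):
        GRADIENTS[op_name] = fn
        return fn
    return wrap

@register_gradient("multiply")
def multiply_grad(grad_out, x, y):
    # d(x*y)/dx = y, d(x*y)/dy = x
    return grad_out * y, grad_out * x

def sgd_step(params, grads, lr=0.1):
    # One optimizer variant; swapping this function swaps the optimizer.
    return [p - lr * g for p, g in zip(params, grads)]

# Reverse-mode step for a single op: look up the rule, then update.
x, y = np.array(2.0), np.array(3.0)
gx, gy = GRADIENTS["multiply"](np.array(1.0), x, y)
x, y = sgd_step([x, y], [gx, gy])
```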
Now these results are being migrated to the new struct info infrastructure. During this migration, we are also doing our best to polish our previous work. Related PRs and progress:
- `collapse_sum` patch: [TOPI] Expose the interface of `topi.collapse_sum` #102
- `collapse_sum_to`, `collapse_sum_like`: [Op][Manip] collapse_sum_like, collapse_sum_to #87
- `cross_entropy`, `log_softmax`, `nll_loss`: [Op][NN] cross_entropy, log_softmax, nll_loss #94
- `exp`: [OP] Unary tensor operator R.exp #100
- `abs` (for L1Loss): [Op] Completing the unary operators and refactor test #113
- Gradients for `log_softmax` and `nll_loss`, gradients for `split` and `concat`, and `collapse_sum` erasure: [Op] Migration: Gradients for some operators #98
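For context, `collapse_sum` is the adjoint of broadcasting: it sums a gradient over the axes that broadcasting expanded, so the result matches the original operand's shape. A minimal NumPy sketch of that semantics (the function name mirrors the Relax op, but this is an illustration, not the fork's implementation):

```python
import numpy as np

def collapse_sum_to(data: np.ndarray, shape: tuple) -> np.ndarray:
    """Sum `data` over the axes that broadcasting expanded, so the
    result has the target `shape` (the reverse of broadcast_to)."""
    # Axes prepended by the rank mismatch are summed away first.
    ndim_diff = data.ndim - len(shape)
    result = data.sum(axis=tuple(range(ndim_diff)))
    # Axes that were size-1 in the target are summed with keepdims.
    axes = tuple(i for i, s in enumerate(shape)
                 if s == 1 and result.shape[i] != 1)
    if axes:
        result = result.sum(axis=axes, keepdims=True)
    return result

# Example: gradient flowing back through a (2, 3) + (3,) broadcast add.
grad_out = np.ones((2, 3))
grad_b = collapse_sum_to(grad_out, (3,))  # shape (3,), values [2., 2., 2.]
```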