fix custom op for backward compatibility #8721
Conversation
Thank you for fixing this and adding the tests! Yes, the output stypes (which are the in_grads) depend on the number of arguments, not the number of outputs. Can you please let me know which LR example you are referring to?
@@ -3652,6 +3652,42 @@ def create_operator(self, ctx, shapes, dtypes):
assert (x.grad.stype == 'csr')
assert (y.stype == 'csr')
assert (aux.stype == 'csr')

# test for backward compatibility, i.e. the correctness of default implementation of
Please remove whitespaces here and above.
@anirudh2290 Thanks for your comments. LR example is the
@mx.operator.register("mult")
class MultProp(mx.operator.CustomOpProp):
    def __init__(self):
        super(MultProp, self).__init__(need_top_grad=True)
Found an issue with custom_op when used with need_top_grad=False. Fixing here: #8725
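For context, `need_top_grad` controls whether the backward call receives the output gradients at all. A minimal pure-Python sketch (a hypothetical helper, not MXNet's actual code) of how that flag changes the backward argument list, under the assumption that backward inputs are assembled as out_grads + inputs + outputs:

```python
def backward_inputs(need_top_grad, out_grads, inputs, outputs):
    """Assemble the tensors passed to a custom op's backward call.

    Hypothetical sketch: when need_top_grad is False, the engine does not
    forward the output gradients (out_grads) to backward, so the argument
    list is shorter -- the mode in which the reported issue appears.
    """
    if need_top_grad:
        return out_grads + inputs + outputs
    return inputs + outputs
```

Any length-based inference on the backward arguments therefore has to account for both shapes of this list.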
Closing this because we are reverting sparse support for custom op.
Description
The default implementation of infer_storage_type_backward in custom op has a problem. Before this PR, the LR example in the sparse examples was broken due to the custom op, and the added test throws an error message.
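The length mismatch behind the bug can be illustrated with a small pure-Python sketch (hypothetical names, not MXNet's actual implementation): a correct default for infer_storage_type_backward must emit one input-gradient storage type per forward *input*, not per forward *output*:

```python
def default_infer_storage_type_backward(ograd_stype, in_stype, out_stype, aux_stype):
    """Sketch of a default backward storage-type inference.

    The in_grads carry one gradient per forward input, so their storage
    types must have length len(in_stype); sizing them by len(out_stype)
    is exactly the kind of mismatch the added test would catch.
    """
    igrad_stype = ['default'] * len(in_stype)    # one entry per forward input
    aux_grad_stype = ['default'] * len(aux_stype)
    return igrad_stype, aux_grad_stype
```

With two forward inputs and one output, the sketch returns two input-gradient stypes, matching the inputs rather than the outputs.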
cc @eric-haibin-lin @anirudh2290
Checklist
Essentials
Passed code style checking (make lint)
Changes
Comments