
Dev norm op #5178

Merged
merged 26 commits into from
Jun 17, 2021
Conversation

puchapu (Contributor) commented Jun 11, 2021

Background

While developing nn.tripletmarginloss, a norm computation was needed; the previous approach computed part of the norm directly inside the loss.

This PR adds a Norm op, aligned with torch.linalg.norm in PyTorch and numpy.linalg.norm in NumPy.

PyTorch: https://pytorch.org/docs/stable/linalg.html#torch.linalg.norm
NumPy: https://numpy.org/doc/stable/reference/generated/numpy.linalg.norm.html


There is also an earlier, similar PR that was never merged: #4108

Required ops

  • flow.pow
  • flow.sqrt
  • flow.sum
  • flow.abs
  • flow.square
  • flow.max
  • flow.min
  • Some linear-algebra ops are still missing
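As a rough illustration (not the PR's actual implementation), the elementwise ops listed above are enough to compose the common vector norms. The sketch below uses NumPy as a stand-in for the oneflow.experimental API; the comments note which flow.* op each step corresponds to.

```python
import numpy as np

def vector_norm(x, ord=2):
    """Compose a vector norm from elementwise ops, mirroring how a Norm op
    could be built from flow.abs / flow.pow / flow.sum / flow.sqrt /
    flow.square / flow.max / flow.min (NumPy stands in for oneflow here)."""
    a = np.abs(x)                # flow.abs
    if ord == float("inf"):
        return a.max()           # flow.max
    if ord == float("-inf"):
        return a.min()           # flow.min
    if ord == 1:
        return a.sum()           # flow.sum
    if ord == 2:
        # flow.square + flow.sum + flow.sqrt
        return np.sqrt(np.square(a).sum())
    # general p-norm: (sum |x|^p)^(1/p), via flow.pow and flow.sum
    return np.power(np.power(a, ord).sum(), 1.0 / ord)
```

Norms that need matrix factorizations (e.g. ord=2 for matrices, which requires the largest singular value) cannot be composed this way, which is why the linear-algebra ops above are flagged as missing.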

Open questions

  1. Can ops composed purely in Python meet the performance requirements? (Some norms involve linear-algebra routines such as singular value decomposition.)
  2. Some of PyTorch's norms do not strictly match the mathematical definition of a norm; should we align with NumPy instead?
  3. On output shape: with dim = None, PyTorch returns a 0-d tensor of shape () (tensor(xxx)), while OneFlow's default output is a tensor of shape (1,) (tensor([xxx])). This setting also affects what the correct output should be when keepdim = True (already discussed with @Flowingsun007 in the WeChat group).
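The shape discrepancy in question 3 can be demonstrated with NumPy, whose behavior matches PyTorch's here (a scalar for a full reduction, and size-1 reduced axes under keepdims):

```python
import numpy as np

x = np.arange(6, dtype=np.float64).reshape(2, 3)

# With axis=None, numpy.linalg.norm (like torch.linalg.norm) collapses to a
# scalar of shape () -- not shape (1,) as OneFlow's default output had.
full = np.linalg.norm(x)
print(np.shape(full))          # ()

# keepdims=True retains each reduced axis as a size-1 dim, which is why the
# shape-() vs shape-(1,) default also affects the keepdim=True contract.
kept = np.linalg.norm(x, axis=1, keepdims=True)
print(kept.shape)              # (2, 1)
```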

Progress

  • All norms except those requiring singular value decomposition are aligned
  • 1D and 2D forward and backward are aligned (see the test scripts)
  • The keepdim attribute is not fully aligned yet (currently supported when dim is an int)
  • Higher-dimensional matrices (ND, N >= 3) support the "fro" norm
  • tensor.norm is aligned with the linalg.norm calling convention
  • docstring (screenshots: docstring1, docstring2)
  • doctest (screenshot: doctest)
  • test results (screenshot: test)
  • tensor_test result (screenshot: test-tensor)
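On the "fro" norm for ND (N >= 3) tensors mentioned above: a small NumPy sketch (standing in for the OneFlow API, whose exact signature may differ) of the Frobenius norm taken over the trailing matrix dims, and its equivalence to the square/sum/sqrt composition from the op list:

```python
import numpy as np

# A batch of four 2x3 matrices; "fro" is taken per matrix over the
# trailing two dims, batched over the leading dim.
x = np.ones((4, 2, 3))

batched = np.linalg.norm(x, ord="fro", axis=(-2, -1))
print(batched.shape)           # (4,)

# Equivalent composition from square/sum/sqrt (flow.square, flow.sum,
# flow.sqrt in the op list above):
manual = np.sqrt(np.square(x).sum(axis=(-2, -1)))
assert np.allclose(batched, manual)
```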

MARD1NO (Contributor) commented Jun 13, 2021

I wrote up some analysis of norm in #4108 (that PR was never merged); it may be useful as a reference.

Examples::

>>> import oneflow.experimental as flow
>>> from oneflow.experimental import linalg as LA
Contributor

Have you tested this code?

Contributor Author

Yes, it's tested. There are screenshots of the results at the start of the PR.

@Ldpe2G Ldpe2G self-requested a review June 16, 2021 10:12


@oneflow_export("linalg.norm")
@register_tensor_op("linalg.norm")
Contributor

The unit tests don't cover tensor.linalg.norm

@puchapu puchapu removed the fmt-only label Jun 16, 2021
@oneflow-ci-bot oneflow-ci-bot merged commit 13cedfd into master Jun 17, 2021
@oneflow-ci-bot oneflow-ci-bot deleted the dev_tripletmarginloss branch June 17, 2021 17:05