split vector-matrix norm #5478

Merged
merged 17 commits into from
Jul 16, 2021
Conversation

puchapu
Contributor

@puchapu puchapu commented Jul 13, 2021

  • Split matrix norm and vector norm, to satisfy the clip_by_norm alignment requirement.
  • torch.norm will be deprecated later, so this aligns with the torch.linalg.norm interface instead.
  • vector_norm supports float-typed order
  • doctest
  • matrix_norm
  • vector_norm
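The split described above can be sketched roughly as follows. This uses NumPy as a stand-in for illustration only; the actual PR implements this against OneFlow's `flow.experimental` ops, and the function names here mirror `torch.linalg.vector_norm` / `torch.linalg.matrix_norm`:

```python
import numpy as np

def vector_norm(x, ord=2.0, dim=None, keepdim=False):
    # With dim=None the input is flattened, as in torch.linalg.vector_norm.
    if dim is None:
        x, dim = x.reshape(-1), 0
    if ord == float("inf"):
        return np.abs(x).max(axis=dim, keepdims=keepdim)
    if ord == float("-inf"):
        return np.abs(x).min(axis=dim, keepdims=keepdim)
    ord = float(ord)  # int and float orders share one code path
    return (np.abs(x) ** ord).sum(axis=dim, keepdims=keepdim) ** (1.0 / ord)

def matrix_norm(x, ord="fro", dim=(-2, -1), keepdim=False):
    # Frobenius norm: sqrt of the sum of squares over the last two dims.
    if ord == "fro":
        return np.sqrt(np.square(x).sum(axis=dim, keepdims=keepdim))
    raise NotImplementedError
```

Splitting the two entry points like this is what lets clip_by_norm call the vector path directly without going through the combined `norm` dispatch.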

def __init__(self, ord=None, dim=None, keepdim=False) -> None:
    super().__init__()

    self.ord = ord
    self.ord = 2 if ord == None else ord
Contributor

I think we still need an if/else here to convert ord to a float before it reaches the later computation:

if not isinstance(ord, str):
    ord = float(ord)

),
1.0 / ord,
)

elif isinstance(ord, int):
Contributor

This branch isn't necessary; do a float conversion earlier instead, because the int branch and the float branch actually share the same computation logic.
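To illustrate the reviewer's point that the int and float branches compute the same thing once ord is cast (a NumPy sketch, not the PR's code):

```python
import numpy as np

def p_norm(x, ord):
    ord = float(ord)  # one cast makes a separate int branch unnecessary
    return (np.abs(x) ** ord).sum() ** (1.0 / ord)
```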

@@ -51,6 +58,19 @@ def _vector_norm(self, x, ord, dim):
    else:
        raise ValueError("Invalid norm order: {}".format(ord))

def forward(self, x):
    return self._vector_norm(x.reshape((1, -1))[0], ord = self.ord, dim=self.dim)
Contributor

Reshaping to two dimensions is incorrect; torch supports multi-dimensional input here.
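For context, `torch.linalg.vector_norm` reduces over the requested dim of a tensor of any rank, and only flattens when dim is None. A NumPy sketch of that behavior:

```python
import numpy as np

def vector_norm(x, ord=2.0, dim=None, keepdim=False):
    # Flatten only when no dim is given; otherwise reduce over dim directly,
    # which works for tensors of any rank.
    if dim is None:
        x, dim = x.reshape(-1), 0
    return (np.abs(x) ** ord).sum(axis=dim, keepdims=keepdim) ** (1.0 / ord)
```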

def __init__(self, ord=None, dim=None, keepdim=False) -> None:
    super().__init__()

    self.ord = ord
    self.ord = 2 if ord == None else ord
    self.dim = dim
    self.keepdim = keepdim

def _vector_norm(self, x, ord, dim):
    if isinstance(ord, str) and ord in ["fro", "nuc"]:
Contributor

You also need to check whether dim exceeds len(x.shape).
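A possible shape for that check, based on the `check_dim(num_dims, self.dim)` helper that appears later in the diff (the exact implementation in the PR may differ):

```python
def check_dim(num_dims, dim):
    # Hypothetical validator: normalize negative axes, reject out-of-range ones.
    if dim is None:
        return None
    if not -num_dims <= dim < num_dims:
        raise IndexError(
            "Dimension out of range (expected to be in [{}, {}], but got {})".format(
                -num_dims, num_dims - 1, dim
            )
        )
    return dim % num_dims
```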

self.ord = ord
if ord == None:
    self.ord = 2.0
elif isinstance(ord, (int, float)):
Contributor

The float case can be removed.

Contributor Author

If the float case is removed, wouldn't passing a float then raise an error?

Contributor

Oh right, then there's no need to remove it here.



class Vector_Norm(Module):
    def __init__(self, ord, dim, keepdim) -> None:
Contributor

num_dims = len(x.shape)
dim = check_dim(num_dims, self.dim)
if dim == None:
    return self._vector_norm(x.reshape((1, -1))[0], ord = self.ord, dim=self.dim, keepdim= self.keepdim)
Contributor

reshape((1, -1)) here could be changed to flatten, right?
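For reference, the two spellings produce the same 1-D result:

```python
import numpy as np

x = np.arange(6).reshape(2, 3)
# Both spellings collapse the input to one dimension.
assert np.array_equal(x.reshape((1, -1))[0], x.flatten())
```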

oneflow/python/nn/modules/norm.py (outdated review thread, resolved)
def _matrix_norm(self, x, ord, dim):

class Matrix_Norm(Module):
    def __init__(self, ord, dim, keepdim) -> None:
Contributor

raise TypeError("linalg_matrix_norm(): argument 'ord' must be Number, not {}".format(type(ord)))
if isinstance(dim, tuple) and len(dim) == 2 and dim[0] != dim[1]:
    self.dim = dim
elif dim == None:
Contributor

With the defaults set up front as above, this part of the branching can be dropped; there's no need to test dim == None.

Contributor Author

I don't think this can be removed: linalg.norm's default is dim == None, so when it calls Matrix_Norm from there, None gets passed in (line 157). Alternatively, I could do the check inside linalg.norm.
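The dispatch being described, as I understand `torch.linalg.norm`'s semantics: with `dim=None` and `ord=None` it takes the 2-norm of the flattened input, while `dim=None` with an `ord` set picks the matrix norm for 2-D inputs. A sketch (`norm_dispatch` is a hypothetical name, not in the PR):

```python
def norm_dispatch(ndim, ord=None, dim=None):
    # Hypothetical mirror of torch.linalg.norm's choice between sub-norms.
    if dim is None:
        if ord is None:
            return "vector"  # 2-norm of the flattened input
        return "matrix" if ndim == 2 else "vector"
    if isinstance(dim, tuple) and len(dim) == 2:
        return "matrix"
    return "vector"
```

This is why `Matrix_Norm` can receive `dim=None` from the combined entry point, as the author notes.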

    raise NotImplementedError
elif ord == "fro":
    return flow.experimental.sqrt(
        flow.experimental.sum(flow.experimental.square(x), dim=dim, keepdim= keepdim)
Contributor

There's a stray space after keepdim=.
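Style aside, the Frobenius branch above can be sanity-checked against NumPy, whose default matrix norm is Frobenius:

```python
import numpy as np

x = np.array([[1.0, 2.0], [3.0, 4.0]])
fro = np.sqrt(np.square(x).sum())  # sqrt of the sum of squares
assert np.isclose(fro, np.linalg.norm(x))  # NumPy's default matrix norm is Frobenius
```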


def _norm_min_max(input, ord,dim,keepdim):
    if ord > 0:
        return flow.experimental.max(input, dim= dim,keepdim = keepdim)
Contributor

Remove all these spaces after =.
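For context, this helper serves the ±infinity orders, where the norm degenerates to the largest (or smallest) absolute value. The PR's `_norm_min_max` takes an `input` that presumably already holds absolute values; this self-contained sketch (hypothetical name) applies `abs` itself:

```python
import numpy as np

def norm_min_max(x, ord):
    # ord = +inf: largest absolute value; ord = -inf: smallest absolute value.
    a = np.abs(x)
    return a.max() if ord > 0 else a.min()
```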


@puchapu puchapu requested a review from oneflow-ci-bot July 16, 2021 03:06
@puchapu puchapu added the python label Jul 16, 2021
@oneflow-ci-bot oneflow-ci-bot merged commit ec0d02c into master Jul 16, 2021
@oneflow-ci-bot oneflow-ci-bot deleted the split-norm branch July 16, 2021 21:40