@@ -25,65 +25,35 @@ PyTorch supports more parameters than Paddle, as follows:
| reduction | reduction | Specifies the reduction applied to the output. |

### Conversion examples


-#### size_average
-size_average set to True
+#### reduction set to sum
Collaborator

There are 34 such usages in total, plus another 17 under torch.nn.functional.*.

Contributor Author

The losses under this path don't seem to support the size_average and reduce parameters in Paddle. Should we add support for them?

Collaborator

@zhwesky2010 zhwesky2010 Nov 18, 2025

> The losses under this path don't seem to support the size_average and reduce parameters in Paddle. Should we add support for them?

Yes, they need to be supported. The earlier legacy_reduction_decorator also needs to be applied to all the functional APIs in loss.py.

Contributor Author

Everything in loss.py is a function (the earlier 17 losses were classes), so we could either add a check to the existing legacy_reduction_decorator (its positional-argument counting skips self, which plain functions don't need, so this adds some overhead), or write a second decorator, which could reuse most of the code. Which would you prefer?

Contributor Author

> Everything in loss.py is a function (the earlier 17 losses were classes), so we could either add a check to the existing legacy_reduction_decorator (its positional-argument counting skips self, which plain functions don't need, so this adds some overhead), or write a second decorator, which could reuse most of the code. Which would you prefer?

@zhwesky2010 please take a look at this when you have time.
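For reference, a functional-style decorator along the lines discussed could look roughly like the sketch below. This is only an illustration under stated assumptions: the names legacy_reduction and l1_loss are hypothetical, not Paddle's actual legacy_reduction_decorator; only the size_average/reduce to reduction mapping follows PyTorch's documented legacy rule.

```python
import functools

def legacy_reduction(fn):
    """Hypothetical sketch: remap deprecated size_average/reduce kwargs
    onto the modern reduction kwarg before calling fn."""
    @functools.wraps(fn)
    def wrapper(*args, size_average=None, reduce=None, **kwargs):
        # Only remap when a legacy flag was actually passed.
        if size_average is not None or reduce is not None:
            size_average = True if size_average is None else size_average
            reduce = True if reduce is None else reduce
            if not reduce:
                kwargs['reduction'] = 'none'
            elif size_average:
                kwargs['reduction'] = 'mean'
            else:
                kwargs['reduction'] = 'sum'
        return fn(*args, **kwargs)
    return wrapper

@legacy_reduction
def l1_loss(input, target, reduction='mean'):
    # Toy stand-in for a functional loss, just to exercise the decorator.
    diff = [abs(a - b) for a, b in zip(input, target)]
    if reduction == 'none':
        return diff
    total = sum(diff)
    return total / len(diff) if reduction == 'mean' else total
```

Because the legacy flags are handled as keyword-only arguments after *args, plain functions need no self-skipping positional count in this sketch.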

```python
# PyTorch usage
-torch.nn.BCELoss(size_average=True)
+torch.nn.BCELoss(weight=w, size_average=False, reduce=True)
+torch.nn.BCELoss(weight=w, size_average=False)
Collaborator

There is one more usage:
torch.nn.BCELoss(weight=w, reduction='sum')

Contributor Author

I thought we previously only gave conversion examples for mapping size_average plus reduce to reduction; reduction to reduction is a one-to-one correspondence, so does it still need to be written out?

Collaborator

That one can be left out; it is a one-to-one mapping.


# Paddle usage
-paddle.nn.BCELoss(reduction='mean')
+paddle.nn.BCELoss(weight=w, reduction='sum')
Collaborator

Add a comment here:

## All of the above usages map uniformly to the usage below

```

-size_average set to False
+#### reduction set to mean
```python
# PyTorch usage
-torch.nn.BCELoss(size_average=False)
+torch.nn.BCELoss(weight=w, size_average=True, reduce=True)
+torch.nn.BCELoss(weight=w, reduce=True)
+torch.nn.BCELoss(weight=w, size_average=True)
+torch.nn.BCELoss(weight=w)

# Paddle usage
-paddle.nn.BCELoss(reduction='sum')
+paddle.nn.BCELoss(weight=w, reduction='mean')
```
-#### reduce
-reduce set to True
-```python
-# PyTorch usage
-torch.nn.BCELoss(reduce=True)
-
-# Paddle usage
-paddle.nn.BCELoss(reduction='sum')
-```
-reduce set to False
-```python
-# PyTorch usage
-torch.nn.BCELoss(reduce=False)
-
-# Paddle usage
-paddle.nn.BCELoss(reduction='none')
-```
-#### reduction
-reduction set to 'none'
-```python
-# PyTorch usage
-torch.nn.BCELoss(reduction='none')
-
-# Paddle usage
-paddle.nn.BCELoss(reduction='none')
-```
-reduction set to 'mean'
-```python
-# PyTorch usage
-torch.nn.BCELoss(reduction='mean')
-
-# Paddle usage
-paddle.nn.BCELoss(reduction='mean')
-```
-reduction set to 'sum'
+#### reduction set to none
```python
# PyTorch usage
-torch.nn.BCELoss(reduction='sum')
+torch.nn.BCELoss(weight=w, size_average=True, reduce=False)
+torch.nn.BCELoss(weight=w, size_average=False, reduce=False)
+torch.nn.BCELoss(weight=w, reduce=False)

# Paddle usage
-paddle.nn.BCELoss(reduction='sum')
+paddle.nn.BCELoss(weight=w, reduction='none')
```
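The regrouped examples above all follow PyTorch's legacy-argument rule: reduce=False always yields 'none'; otherwise size_average (default True) selects 'mean' or 'sum'. A minimal sketch of that rule (the helper name to_reduction is mine, not an API in either library):

```python
def to_reduction(size_average=None, reduce=None):
    """Map the deprecated size_average/reduce flags to a reduction string."""
    size_average = True if size_average is None else size_average
    reduce = True if reduce is None else reduce
    if not reduce:
        return 'none'  # reduce=False wins regardless of size_average
    return 'mean' if size_average else 'sum'
```

For example, to_reduction(size_average=False, reduce=True) gives 'sum', matching the first grouped example.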
@@ -28,65 +28,35 @@ PyTorch supports more parameters than Paddle, as follows:
| pos_weight | pos_weight | Weight of the positive class. |

### Conversion examples


-#### size_average
-size_average set to True
+#### reduction set to sum
```python
# PyTorch usage
-torch.nn.BCEWithLogitsLoss(size_average=True)
+torch.nn.BCEWithLogitsLoss(weight=w, size_average=False, reduce=True)
+torch.nn.BCEWithLogitsLoss(weight=w, size_average=False)

# Paddle usage
-paddle.nn.BCEWithLogitsLoss(reduction='mean')
+paddle.nn.BCEWithLogitsLoss(weight=w, reduction='sum')
```

-size_average set to False
+#### reduction set to mean
```python
# PyTorch usage
-torch.nn.BCEWithLogitsLoss(size_average=False)
+torch.nn.BCEWithLogitsLoss(weight=w, size_average=True, reduce=True)
+torch.nn.BCEWithLogitsLoss(weight=w, reduce=True)
+torch.nn.BCEWithLogitsLoss(weight=w, size_average=True)
+torch.nn.BCEWithLogitsLoss(weight=w)

# Paddle usage
-paddle.nn.BCEWithLogitsLoss(reduction='sum')
+paddle.nn.BCEWithLogitsLoss(weight=w, reduction='mean')
```
-#### reduce
-reduce set to True
-```python
-# PyTorch usage
-torch.nn.BCEWithLogitsLoss(reduce=True)
-
-# Paddle usage
-paddle.nn.BCEWithLogitsLoss(reduction='sum')
-```
-reduce set to False
-```python
-# PyTorch usage
-torch.nn.BCEWithLogitsLoss(reduce=False)
-
-# Paddle usage
-paddle.nn.BCEWithLogitsLoss(reduction='none')
-```
-#### reduction
-reduction set to 'none'
-```python
-# PyTorch usage
-torch.nn.BCEWithLogitsLoss(reduction='none')
-
-# Paddle usage
-paddle.nn.BCEWithLogitsLoss(reduction='none')
-```
-reduction set to 'mean'
-```python
-# PyTorch usage
-torch.nn.BCEWithLogitsLoss(reduction='mean')
-
-# Paddle usage
-paddle.nn.BCEWithLogitsLoss(reduction='mean')
-```
-reduction set to 'sum'
+#### reduction set to none
```python
# PyTorch usage
-torch.nn.BCEWithLogitsLoss(reduction='sum')
+torch.nn.BCEWithLogitsLoss(weight=w, size_average=True, reduce=False)
+torch.nn.BCEWithLogitsLoss(weight=w, size_average=False, reduce=False)
+torch.nn.BCEWithLogitsLoss(weight=w, reduce=False)

# Paddle usage
-paddle.nn.BCEWithLogitsLoss(reduction='sum')
+paddle.nn.BCEWithLogitsLoss(weight=w, reduction='none')
```
@@ -21,65 +21,35 @@ PyTorch supports more parameters than Paddle, as follows:
| reduction | reduction | Specifies the reduction applied to the output. |

### Conversion examples


-#### size_average
-size_average set to True
+#### reduction set to sum
```python
# PyTorch usage
-torch.nn.CosineEmbeddingLoss(size_average=True)
+torch.nn.CosineEmbeddingLoss(margin=m, size_average=False, reduce=True)
+torch.nn.CosineEmbeddingLoss(margin=m, size_average=False)

# Paddle usage
-paddle.nn.CosineEmbeddingLoss(reduction='mean')
+paddle.nn.CosineEmbeddingLoss(margin=m, reduction='sum')
```

-size_average set to False
+#### reduction set to mean
```python
# PyTorch usage
-torch.nn.CosineEmbeddingLoss(size_average=False)
+torch.nn.CosineEmbeddingLoss(margin=m, size_average=True, reduce=True)
+torch.nn.CosineEmbeddingLoss(margin=m, reduce=True)
+torch.nn.CosineEmbeddingLoss(margin=m, size_average=True)
+torch.nn.CosineEmbeddingLoss(margin=m)

# Paddle usage
-paddle.nn.CosineEmbeddingLoss(reduction='sum')
+paddle.nn.CosineEmbeddingLoss(margin=m, reduction='mean')
```
-#### reduce
-reduce set to True
-```python
-# PyTorch usage
-torch.nn.CosineEmbeddingLoss(reduce=True)
-
-# Paddle usage
-paddle.nn.CosineEmbeddingLoss(reduction='sum')
-```
-reduce set to False
-```python
-# PyTorch usage
-torch.nn.CosineEmbeddingLoss(reduce=False)
-
-# Paddle usage
-paddle.nn.CosineEmbeddingLoss(reduction='none')
-```
-#### reduction
-reduction set to 'none'
-```python
-# PyTorch usage
-torch.nn.CosineEmbeddingLoss(reduction='none')
-
-# Paddle usage
-paddle.nn.CosineEmbeddingLoss(reduction='none')
-```
-reduction set to 'mean'
-```python
-# PyTorch usage
-torch.nn.CosineEmbeddingLoss(reduction='mean')
-
-# Paddle usage
-paddle.nn.CosineEmbeddingLoss(reduction='mean')
-```
-reduction set to 'sum'
+#### reduction set to none
```python
# PyTorch usage
-torch.nn.CosineEmbeddingLoss(reduction='sum')
+torch.nn.CosineEmbeddingLoss(margin=m, size_average=True, reduce=False)
+torch.nn.CosineEmbeddingLoss(margin=m, size_average=False, reduce=False)
+torch.nn.CosineEmbeddingLoss(margin=m, reduce=False)

# Paddle usage
-paddle.nn.CosineEmbeddingLoss(reduction='sum')
+paddle.nn.CosineEmbeddingLoss(margin=m, reduction='none')
```
@@ -36,65 +36,35 @@ PyTorch supports more parameters than Paddle, as follows:
| - | axis | Index of the dimension along which softmax is computed. PyTorch has no such parameter; keep Paddle's default. |

### Conversion examples


-#### size_average
-size_average set to True
+#### reduction set to sum
```python
# PyTorch usage
-torch.nn.CrossEntropyLoss(size_average=True)
+torch.nn.CrossEntropyLoss(weight=w, size_average=False, reduce=True)
+torch.nn.CrossEntropyLoss(weight=w, size_average=False)

# Paddle usage
-paddle.nn.CrossEntropyLoss(reduction='mean')
+paddle.nn.CrossEntropyLoss(weight=w, reduction='sum')
```

-size_average set to False
+#### reduction set to mean
```python
# PyTorch usage
-torch.nn.CrossEntropyLoss(size_average=False)
+torch.nn.CrossEntropyLoss(weight=w, size_average=True, reduce=True)
+torch.nn.CrossEntropyLoss(weight=w, reduce=True)
+torch.nn.CrossEntropyLoss(weight=w, size_average=True)
+torch.nn.CrossEntropyLoss(weight=w)

# Paddle usage
-paddle.nn.CrossEntropyLoss(reduction='sum')
+paddle.nn.CrossEntropyLoss(weight=w, reduction='mean')
```
-#### reduce
-reduce set to True
-```python
-# PyTorch usage
-torch.nn.CrossEntropyLoss(reduce=True)
-
-# Paddle usage
-paddle.nn.CrossEntropyLoss(reduction='sum')
-```
-reduce set to False
-```python
-# PyTorch usage
-torch.nn.CrossEntropyLoss(reduce=False)
-
-# Paddle usage
-paddle.nn.CrossEntropyLoss(reduction='none')
-```
-#### reduction
-reduction set to 'none'
-```python
-# PyTorch usage
-torch.nn.CrossEntropyLoss(reduction='none')
-
-# Paddle usage
-paddle.nn.CrossEntropyLoss(reduction='none')
-```
-reduction set to 'mean'
-```python
-# PyTorch usage
-torch.nn.CrossEntropyLoss(reduction='mean')
-
-# Paddle usage
-paddle.nn.CrossEntropyLoss(reduction='mean')
-```
-reduction set to 'sum'
+#### reduction set to none
```python
# PyTorch usage
-torch.nn.CrossEntropyLoss(reduction='sum')
+torch.nn.CrossEntropyLoss(weight=w, size_average=True, reduce=False)
+torch.nn.CrossEntropyLoss(weight=w, size_average=False, reduce=False)
+torch.nn.CrossEntropyLoss(weight=w, reduce=False)

# Paddle usage
-paddle.nn.CrossEntropyLoss(reduction='sum')
+paddle.nn.CrossEntropyLoss(weight=w, reduction='none')
```
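To make the three reduction modes concrete, here is a plain-Python toy (not the PyTorch or Paddle API; the function name and list-based shapes are assumptions) computing a cross-entropy-style loss under each mode:

```python
import math

def cross_entropy(logits, labels, reduction='mean'):
    """Per-sample -log(softmax(logits)[label]), then the requested reduction."""
    losses = []
    for row, y in zip(logits, labels):
        m = max(row)  # subtract the max for numerical stability
        log_z = m + math.log(sum(math.exp(v - m) for v in row))
        losses.append(log_z - row[y])
    if reduction == 'none':
        return losses  # one loss per sample
    total = sum(losses)
    return total / len(losses) if reduction == 'mean' else total
```

With uniform logits over two classes, each per-sample loss is log 2; 'mean' returns log 2, 'sum' returns the batch total, and 'none' returns the unreduced list.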