add optim #927

Merged: 1 commit, Jan 11, 2023
7 changes: 7 additions & 0 deletions docs/pytorch_project_convertor/API_docs/optimizer/README.md
@@ -0,0 +1,7 @@
## Optimizer and Learning Rate API Mapping List

This document lists the mapping between the optimizer- and learning-rate-related PyTorch and PaddlePaddle APIs.

| No. | PyTorch API | PaddlePaddle API | Notes |
| ---- | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------- |
| 1 | [torch.optim.lr_scheduler.LRScheduler](https://pytorch.org/docs/master/_modules/torch/optim/lr_scheduler.html#LRScheduler) | [paddle.optimizer.lr.LRScheduler](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/lr/LRScheduler_cn.html) | [Difference comparison](https://github.com/PaddlePaddle/X2Paddle/tree/develop/docs/pytorch_project_convertor/API_docs/optimizer/torch.optim.lr_scheduler.LRScheduler.md) |
@@ -0,0 +1,57 @@
## torch.optim.lr_scheduler.LRScheduler

### [torch.optim.lr_scheduler.LRScheduler](https://pytorch.org/docs/master/_modules/torch/optim/lr_scheduler.html#LRScheduler)

```python
torch.optim.lr_scheduler.LRScheduler(optimizer, last_epoch=-1, verbose=False)
```

### [paddle.optimizer.lr.LRScheduler](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/lr/LRScheduler_cn.html)

```python
paddle.optimizer.lr.LRScheduler(learning_rate=0.1, last_epoch=-1, verbose=False)
```

### Parameter Differences

| PyTorch | PaddlePaddle | Notes |
| ------------- | ------------- | ----------------------------------------------------- |
| optimizer | - | The optimizer whose learning rate is scheduled; PaddlePaddle has no such parameter. |
| - | learning_rate | The base learning rate; PyTorch has no such parameter. |
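
The key point of the table above is the direction of the coupling: PyTorch's scheduler receives the optimizer and mutates its learning rate in place, while Paddle's optimizer receives the scheduler and queries it for the current rate. A toy, framework-free sketch of the two wirings (all class names here are illustrative, not real library APIs):

```python
class TorchStyleScheduler:
    """PyTorch-style wiring: the scheduler wraps the optimizer and
    mutates the optimizer's lr attribute when a decay boundary is hit."""
    def __init__(self, optimizer, step_size, gamma):
        self.optimizer, self.step_size, self.gamma = optimizer, step_size, gamma
        self.epoch = 0

    def step(self):
        self.epoch += 1
        if self.epoch % self.step_size == 0:
            self.optimizer.lr *= self.gamma


class PaddleStyleScheduler:
    """Paddle-style wiring: the scheduler owns the learning rate;
    the optimizer only holds a reference to the scheduler."""
    def __init__(self, learning_rate, step_size, gamma):
        self.lr_value, self.step_size, self.gamma = learning_rate, step_size, gamma
        self.epoch = 0

    def step(self):
        self.epoch += 1
        if self.epoch % self.step_size == 0:
            self.lr_value *= self.gamma

    def current_lr(self):
        return self.lr_value


class PaddleStyleOptimizer:
    """Paddle-style optimizer: asks its scheduler for the rate to use."""
    def __init__(self, scheduler):
        self.scheduler = scheduler

    @property
    def lr(self):
        return self.scheduler.current_lr()
```

Either way the training loop calls `scheduler.step()` once per epoch; only the object that owns the rate differs, which is why the converted Paddle code constructs the scheduler first and the PyTorch code constructs the optimizer first.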

### Usage Differences

#### PyTorch

```python
import torch
import torch.nn as nn
import torch.optim as optim
from torch.optim import lr_scheduler

model = nn.Linear(10, 10)
optimizer = optim.SGD(model.parameters(), lr=0.1)
# In PyTorch, the scheduler wraps the optimizer and updates its lr in place.
scheduler = lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)
for epoch in range(100):
    train(...)     # placeholder for one training epoch
    validate(...)  # placeholder for one validation pass
    scheduler.step()
```

#### PaddlePaddle

```python
import paddle
import paddle.nn as nn
import paddle.optimizer as optim
from paddle.optimizer.lr import StepDecay

model = nn.Linear(10, 10)
# In Paddle, the scheduler is created first and passed to the optimizer
# as its learning_rate.
scheduler = StepDecay(learning_rate=0.1, step_size=30, gamma=0.1)
optimizer = optim.SGD(learning_rate=scheduler, parameters=model.parameters())

for epoch in range(100):
    train(...)     # placeholder for one training epoch
    validate(...)  # placeholder for one validation pass
    scheduler.step()
```
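
Both loops above produce the same schedule of learning-rate values. A minimal, framework-free sketch of the step-decay rule that `StepLR` and `StepDecay` both implement (the closed-form expression below is the standard step-decay formula, not code from either library):

```python
def step_decay(base_lr: float, step_size: int, gamma: float, epoch: int) -> float:
    """Return the learning rate in effect after `epoch` completed epochs:
    the base rate multiplied by gamma once per elapsed step_size block."""
    return base_lr * gamma ** (epoch // step_size)

# With the values from the examples above (base_lr=0.1, step_size=30, gamma=0.1),
# the rate is 0.1 for epochs 0-29, 0.01 for 30-59, 0.001 for 60-89, and so on.
```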