Commit

Backward propagation -> Backpropagation
sublee committed Jun 20, 2019
1 parent 15476df commit 7de9a54
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion README.ko.md
@@ -42,7 +42,7 @@ GPipe makes this possible with two techniques, Pipeline Parallelism and Checkpointing
<dt>Checkpointing</dt>
<dd>A checkpoint is made at each partition to maximize available memory. During
forward propagation, only the inputs and outputs at partition boundaries are
remembered; the internal hidden layers are volatilized. The volatilized
-hidden layers are recomputed during backward propagation.</dd>
+hidden layers are recomputed during backpropagation.</dd>
</dl>

2 changes: 1 addition & 1 deletion README.md
@@ -47,7 +47,7 @@ terminology in PyTorch community.
<dd>Checkpointing is applied to each partition to minimize the overall memory
consumption by a model. During forward propagation, only the tensors at the
boundaries between partitions are remembered. All other intermediate
-tensors are volatilized, and recomputed during backward propagation when
+tensors are volatilized, and recomputed during backpropagation when
necessary.</dd>
</dl>
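The checkpointing behavior described in this hunk can be sketched in plain Python. This is a toy model, not torchgpipe's implementation: the `Scale` partition class and its `grad` method are hypothetical stand-ins for a real layer and its backward pass.

```python
class Scale:
    """A toy 'partition': y = a * x, whose local gradient is a."""
    def __init__(self, a):
        self.a = a

    def __call__(self, x):
        return self.a * x

    def grad(self, x, upstream):
        return self.a * upstream


def forward_with_checkpoints(partitions, x):
    """Run x through the partitions, remembering only each partition's
    input (the boundary tensor), not its internal activations."""
    boundaries = []
    for part in partitions:
        boundaries.append(x)   # remember only the boundary input
        x = part(x)            # intermediate results are discarded
    return x, boundaries


def backward_with_recompute(partitions, boundaries, grad_out):
    """Propagate a gradient backwards, recomputing each partition's
    forward pass from its stored boundary input on demand."""
    grad = grad_out
    for part, x in zip(reversed(partitions), reversed(boundaries)):
        part(x)                    # recompute the volatilized activations
        grad = part.grad(x, grad)  # then apply the local gradient
    return grad


parts = [Scale(2.0), Scale(3.0)]            # composite function: y = 6x
y, boundaries = forward_with_checkpoints(parts, 5.0)   # y == 30.0
dx = backward_with_recompute(parts, boundaries, 1.0)   # dy/dx == 6.0
```

The trade-off is exactly the one the diff text describes: memory drops to the boundary tensors, at the cost of one extra forward computation per partition during backpropagation.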

