Modify tutorial documentation
fangwei123456 committed Sep 16, 2020
1 parent 8e79b9e commit 169d90b
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion docs/source/clock_driven/5_ann2snn.rst
@@ -138,7 +138,7 @@ Compared with an ANN, the spikes generated by an SNN are discrete, which facilitates efficient communication.

Assume the BatchNorm parameters are :math:`\gamma` (``BatchNorm.weight``), :math:`\beta` (``BatchNorm.bias``), :math:`\mu` (``BatchNorm.running_mean``),
:math:`\sigma` (``BatchNorm.running_var``, :math:`\sigma = \sqrt{\mathrm{running\_var}}`). For the detailed parameter definitions, see
- `torch.nn.batchnorm1d <https://pytorch.org/docs/stable/generated/torch.nn.BatchNorm2d.html#torch.nn.BatchNorm1d>`_ .
+ `torch.nn.BatchNorm1d <https://pytorch.org/docs/stable/generated/torch.nn.BatchNorm2d.html#torch.nn.BatchNorm1d>`_ .
A parametric module (e.g. Linear) has parameters :math:`W` and :math:`b`. BatchNorm parameter absorption transfers the BatchNorm parameters into the module's :math:`W` and :math:`b` by computation, so that the new module, given the same input, produces the same output as the original module followed by BatchNorm.
Accordingly, the new module's :math:`\bar{W}` and :math:`\bar{b}` are given by:

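The concrete formulas for :math:`\bar{W}` and :math:`\bar{b}` follow in the full tutorial and are truncated in this diff excerpt. Below is a minimal sketch of the absorption for a ``Linear`` layer followed by ``BatchNorm1d``, assuming the standard fusion; the helper name ``absorb_bn_into_linear`` is illustrative rather than part of SpikingJelly's API, and ``bn.eps`` is kept under the square root so the fused layer matches PyTorch's BatchNorm exactly, whereas the text above defines :math:`\sigma = \sqrt{\mathrm{running\_var}}`.

.. code-block:: python

    import torch
    import torch.nn as nn

    def absorb_bn_into_linear(fc: nn.Linear, bn: nn.BatchNorm1d) -> nn.Linear:
        # gamma, beta, mu, sigma follow the notation in the tutorial text above.
        gamma, beta = bn.weight.data, bn.bias.data
        mu = bn.running_mean
        sigma = torch.sqrt(bn.running_var + bn.eps)
        scale = gamma / sigma                                     # per-output-channel scale

        fused = nn.Linear(fc.in_features, fc.out_features, bias=True)
        fused.weight.data = fc.weight.data * scale.unsqueeze(1)   # W_bar = (gamma / sigma) * W
        b = fc.bias.data if fc.bias is not None else torch.zeros(fc.out_features)
        fused.bias.data = scale * (b - mu) + beta                 # b_bar = (gamma / sigma) * (b - mu) + beta
        return fused

    # Quick check: Linear -> BatchNorm1d (eval mode, running statistics) equals the fused Linear.
    fc, bn = nn.Linear(4, 8), nn.BatchNorm1d(8)
    with torch.no_grad():
        bn.running_mean.normal_()
        bn.running_var.uniform_(0.5, 1.5)
    bn.eval()
    x = torch.randn(2, 4)
    print(torch.allclose(bn(fc(x)), absorb_bn_into_linear(fc, bn)(x), atol=1e-5))  # True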
