Improve comments and documentation
Ngoguey42 committed Oct 26, 2020
1 parent ffc643c commit 7dfad96
Showing 1 changed file with 13 additions and 15 deletions.
28 changes: 13 additions & 15 deletions buzzard/_footprint.py
@@ -2301,8 +2301,8 @@ def __hash__(self):
  def forward_conv2d(self, kernel_size, stride=1, padding=0, dilation=1):
      """Shift, scale and dilate the Footprint as if it went through a 2d convolution kernel.
-     The arithmetic followed is the one from `pytorch`, but the arithmetics in other
-     deep-learning libraries are mostly the same.
+     The arithmetic followed is the one from `pytorch`, but other deep-learning libraries mostly
+     follow the same arithmetic.
      This function is a `many to one` mapping: two footprints with different `rsizes` can produce
      the same Footprint when `stride > 1`.
@@ -2341,16 +2341,16 @@ def forward_conv2d(kernel_size, stride=1, padding=0, dilation=1):
      # *********************************************************************** **
      # rf_rad: Receptive field radius (2,)
      # pxlrvec: Pixel Left-Right Vector (2,)
-     # pxtbvev: Pixel Top-Bottprint Vector (2,)
+     # pxtbvev: Pixel Top-Bottom Vector (2,)
      rf_rad = (kernel_size - 1) / 2
      tl1 = (
          fp0.tl

-         # A padding shift toward top-left
+         # Padding shifts toward top-left
          - fp0.pxlrvec * padding[0]
          - fp0.pxtbvec * padding[1]

-         # A convolution kernel shift toward bottom-left
+         # Kernel shifts toward bottom-right
          + fp0.pxlrvec * rf_rad[0]
          + fp0.pxtbvec * rf_rad[1]
      )
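As an editorial side note (not part of the commit, and not buzzard code), the pytorch size arithmetic that this docstring refers to can be sketched in a few lines. The helper name `conv2d_out_size` is hypothetical; the sketch also illustrates the `many to one` behaviour mentioned above when `stride > 1`:

def conv2d_out_size(in_size, kernel_size, stride=1, padding=0, dilation=1):
    # pytorch's Conv2d output-size formula, with floor division
    return (in_size + 2 * padding - dilation * (kernel_size - 1) - 1) // stride + 1

# With stride > 1 the floor makes the mapping many-to-one:
# two different input sizes collapse onto the same output size.
assert conv2d_out_size(9, kernel_size=3, stride=2) == 4
assert conv2d_out_size(10, kernel_size=3, stride=2) == 4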
@@ -2389,8 +2389,8 @@ def backward_conv2d(self, kernel_size, stride=1, padding=0, dilation=1):
      """Shift, scale and dilate the Footprint as if it went backward through a 2d convolution
      kernel.
-     The arithmetic followed is the one from `pytorch`, but the arithmetics in other
-     deep-learning libraries are mostly the same.
+     The arithmetic followed is the one from `pytorch`, but other deep-learning libraries mostly
+     follow the same arithmetic.
      This function is a `one to one` mapping: two different input footprints will produce two
      different output Footprints. It means that the `backward_conv2d` of a `forward_conv2d` may
@@ -2433,10 +2433,8 @@ def backward_conv2d(self, kernel_size, stride=1, padding=0, dilation=1):
      rf_rad = (kernel_size - 1) / 2
      tl0 = (
          fp1.tl
-
          + fp1.pxlrvec / stride[0] * padding[0]
          + fp1.pxtbvec / stride[1] * padding[1]
-
          - fp1.pxlrvec / stride[0] * rf_rad[0]
          - fp1.pxtbvec / stride[1] * rf_rad[1]
      )
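For context (again an illustration, not part of the commit), one consistent inverse of the forward size formula under the same pytorch arithmetic is sketched below; the helper name `conv2d_in_size` is hypothetical. Because the forward mapping is many-to-one when `stride > 1`, any inverse has to pick one of the candidate input sizes:

def conv2d_in_size(out_size, kernel_size, stride=1, padding=0, dilation=1):
    # smallest input size that the forward formula maps onto `out_size`
    return (out_size - 1) * stride + dilation * (kernel_size - 1) + 1 - 2 * padding

# 9 and 10 both map forward onto 4 (see the previous sketch); this inverse picks 9,
# which is why a forward/backward round trip may not restore the original size.
assert conv2d_in_size(4, kernel_size=3, stride=2) == 9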
@@ -2463,13 +2461,13 @@ def forward_convtranspose2d(self, kernel_size, stride=1, padding=0, dilation=1,
      """Shift, scale and dilate the Footprint as if it went through a 2d transposed convolution
      kernel.
-     The arithmetic followed is the one from `pytorch`, but the arithmetics in other
-     deep-learning libraries are mostly the same.
+     The arithmetic followed is the one from `pytorch`, but other deep-learning libraries mostly
+     follow the same arithmetic.
      A 2d transposed convolution has 4 internal steps:
-     1. Apply stride (interleave the input pixels with zeroes)
+     1. Apply stride (i.e. interleave the input pixels with zeroes)
      2. Add padding
-     3. Apply a 2d convolution stride:1, pad:0
+     3. Apply a 2d convolution with stride=1 and pad=0
      4. Add output-padding
      This function is a `one to one` mapping: two different input footprints will produce two
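To make the 4 steps above concrete (an illustration only, not buzzard code; the helper name `convtranspose2d_out_size` is hypothetical), pytorch's transposed-convolution output size works out to:

def convtranspose2d_out_size(in_size, kernel_size, stride=1, padding=0, dilation=1, output_padding=0):
    # zero-interleaving yields (in - 1) * stride + 1 pixels, the kernel adds
    # dilation * (kernel_size - 1), padding removes 2 * padding, output_padding adds its value
    return (in_size - 1) * stride - 2 * padding + dilation * (kernel_size - 1) + output_padding + 1

# e.g. 4 input pixels, kernel 3, stride 2 -> 9 output pixels
assert convtranspose2d_out_size(4, kernel_size=3, stride=2) == 9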
@@ -2557,8 +2555,8 @@ def backward_convtranspose2d(self, kernel_size, stride=1, padding=0, dilation=1,
      """Shift, scale and dilate the Footprint as if it went backward through a 2d transposed
      convolution kernel.
-     The arithmetic followed is the one from `pytorch`, but the arithmetics in other
-     deep-learning libraries are mostly the same.
+     The arithmetic followed is the one from `pytorch`, but other deep-learning libraries mostly
+     follow the same arithmetic.
      A 2d transposed convolution has 4 internal steps:
      1. Apply stride (interleave the input pixels with zeroes)
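Likewise (illustration only, with a hypothetical helper name), going backward through a transposed convolution exactly inverts the formula sketched above; unlike the plain convolution case there is no floor involved, so the mapping is one to one:

def convtranspose2d_in_size(out_size, kernel_size, stride=1, padding=0, dilation=1, output_padding=0):
    # exact inverse of convtranspose2d_out_size; valid when the division is exact
    return (out_size - 1 - output_padding - dilation * (kernel_size - 1) + 2 * padding) // stride + 1

assert convtranspose2d_in_size(9, kernel_size=3, stride=2) == 4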
