
fix typo for interp_v2,test=develop #26843

Merged
merged 14 commits into PaddlePaddle:develop
Sep 8, 2020

Conversation

tink2123
Contributor

@tink2123 tink2123 commented Aug 31, 2020

PR types

Bug fixes

PR changes

APIs

Describe

  • support tuple type for scale_factors
  • fix typo
  • add notes for v2_op
  • Align with torch's calculation
  • rename UpSample to Upsample
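The "Align with torch's calculation" item concerns the coordinate transform used when resizing. As a minimal sketch (a hypothetical helper, not Paddle's actual implementation), the standard mapping from an output index to a fractional source coordinate looks like this, with `align_corners=False` using the half-pixel convention that torch's `interpolate` defaults to:

```python
def src_coord(dst_index, in_size, out_size, align_corners=False):
    """Map an output index to a (fractional) source coordinate along one axis.

    align_corners=True pins the corner pixels of input and output together;
    align_corners=False treats pixels as unit cells with half-pixel centers.
    """
    if align_corners:
        if out_size == 1:
            return 0.0
        return dst_index * (in_size - 1) / (out_size - 1)
    return (dst_index + 0.5) * in_size / out_size - 0.5

# Upsampling a length-4 axis to length 8:
print(src_coord(0, 4, 8))                      # -0.25 (half-pixel centers)
print(src_coord(7, 4, 8, align_corners=True))  # 3.0 (corners map to corners)
```

The negative coordinate in the first case is expected; implementations clamp it to the valid range before sampling.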

@paddle-bot-old

Thanks for your contribution!
Please wait for the CI result first. See the Paddle CI Manual for details.

LDOUBLEV
LDOUBLEV previously approved these changes Sep 3, 2020
Contributor

@LDOUBLEV LDOUBLEV left a comment


LGTM

@@ -114,6 +115,8 @@ def interpolate(x,
smoother than corresponding surfaces obtained by bilinear interpolation or
nearest-neighbor interpolation.

Area interpolation is same as adaptive avg pooling.
Contributor

elaborate?

Contributor Author

I couldn't find an authoritative definition of area interpolation to cite.

Contributor

this should be either removed or revised, as interpolation is definitely not the same as pooling.

Contributor Author

OK, it has been removed.

Contributor

@jzhang533 jzhang533 Sep 7, 2020

Please also delete line 91, as well as the implementation code at line 287 and lines 316 ~ 322. Please run a grep on the file to catch any remaining references.

Contributor Author

polish the notes, please review~
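For context on the disputed note: for downsampling, area mode does amount to averaging each input bin that maps onto an output element, with adaptive-average-pooling-style bin boundaries. A 1-D pure-Python sketch (a hypothetical helper written for illustration, not code from this PR):

```python
import math

def area_downsample_1d(xs, out_size):
    """Area-style downsampling of a 1-D sequence: each output element is
    the mean of the input bin that maps onto it. Bin boundaries follow the
    adaptive-average-pooling convention: [floor(i*n/out), ceil((i+1)*n/out))."""
    n = len(xs)
    out = []
    for i in range(out_size):
        start = (i * n) // out_size
        end = math.ceil((i + 1) * n / out_size)
        bin_vals = xs[start:end]
        out.append(sum(bin_vals) / len(bin_vals))
    return out

print(area_downsample_1d([1, 2, 3, 4, 5, 6], 3))  # [1.5, 3.5, 5.5]
```

When the sizes do not divide evenly, adjacent bins overlap by one element, which is exactly what adaptive average pooling does; the reviewer's objection was that this equivalence does not extend to interpolation in general.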

willthefrog
willthefrog previously approved these changes Sep 4, 2020
@jzhang533
Contributor

jzhang533 commented Sep 7, 2020

in_d, in_h, in_w -> depth, height, width
or
be consistent with NCW, NWC, ...

And :attr:`out_shape` has a higher priority than :attr:`scale_factor`.Has to match input size if it is a list.
scale_factor (float|Tensor|list|tuple|None): The multiplier for the input height or width. At
least one of :attr:`size` or :attr:`scale_factor` must be set.
And :attr:`size` has a higher priority than :attr:`scale_factor`.Has to match input size if it is a list.
Contributor

:attr:scale_factor.Has to match input size if it is a list.

It also has to match when it is a Tensor, right?

Contributor Author

Done, thx

And :attr:`out_shape` has a higher priority than :attr:`scale_factor`.Has to match input size if it is a list.
scale_factor (float|Tensor|list|tuple|None): The multiplier for the input height or width. At
least one of :attr:`size` or :attr:`scale_factor` must be set.
And :attr:`size` has a higher priority than :attr:`scale_factor`.Has to match input size if it is a list.
Contributor

see above

And :attr:`out_shape` has a higher priority than :attr:`scale_factor`.
scale_factor (float|int|list|tuple|Tensor|None): The multiplier for the input height or width. At
least one of :attr:`size` or :attr:`scale_factor` must be set.
And :attr:`size` has a higher priority than :attr:`scale_factor`.
Default: None. Has to match input size if it is a list.
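The precedence rule being documented (`size` wins over `scale_factor`, and a list/tuple `scale_factor` must match the number of spatial dimensions) can be sketched as follows. This is a hypothetical helper for illustration only, not Paddle's shape-inference code:

```python
def infer_spatial_shape(in_shape, size=None, scale_factor=None):
    """Infer output spatial shape per the documented precedence:
    `size` has higher priority than `scale_factor`; at least one must be set;
    a list/tuple scale_factor must match the input's spatial rank."""
    if size is not None:
        return tuple(size)
    if scale_factor is None:
        raise ValueError("one of size or scale_factor must be set")
    if isinstance(scale_factor, (list, tuple)):
        if len(scale_factor) != len(in_shape):
            raise ValueError("scale_factor must match input spatial rank")
        return tuple(int(d * s) for d, s in zip(in_shape, scale_factor))
    return tuple(int(d * scale_factor) for d in in_shape)

print(infer_spatial_shape((4, 6), size=(8, 12)))         # (8, 12)
print(infer_spatial_shape((4, 6), scale_factor=(2, 3)))  # (8, 18)
```

If both are passed, `size` silently takes effect, which is why the docstring spells the priority out explicitly.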
Contributor

see above

Contributor

@jzhang533 jzhang533 left a comment

lgtm

Contributor

@XiaoguangHu01 XiaoguangHu01 left a comment

LGTM

@ceci3 ceci3 merged commit 58f3ef9 into PaddlePaddle:develop Sep 8, 2020
tink2123 added a commit to tink2123/Paddle that referenced this pull request Sep 8, 2020
* fix typo for interp_v2,test=develop

* align with torch, test=develop

* add area mode, test=develop

* fix bug, test=develop

* format notes, test=develop

* update for converage, test=develop

* fix bilinear, test=develop

* fix bicubic, test=develop

* fix typo, test=develop

* fix coverage, test=develop

* fix helper.input_dtype, test=develop

* polish notes, test=develop

* polish notes, test=develop

* polish notes, test=develop
ceci3 pushed a commit that referenced this pull request Sep 8, 2020