Move WrapDimMinimal to c10 #14793
Closed
Conversation
Differential Revision updates (the stack was re-exported several times; distinct revisions): D13283492, D13283493, D13283494, D13283495, D13283496, D13283497, D13285370, D13288655, D13318594, D13318596, D13318644, D13318645, D13336841, D13336843.
This was referenced Dec 5, 2018
dzhulgakov reviewed on Dec 10, 2018:
namespace at {

static inline int64_t maybe_wrap_dim(int64_t dim, int64_t dim_post_expr, bool wrap_scalar=true) {
Why can't it just be a `using` declaration?
That wouldn't work with adding the other overloads below.
ezyang approved these changes on Dec 10, 2018.
zdevito pushed a commit to zdevito/ATen that referenced this pull request on Dec 10, 2018:

Summary: Pull Request resolved: pytorch/pytorch#14793
Reviewed By: ezyang
Differential Revision: D13336841
fbshipit-source-id: 4365a799e1856cc68dd94a273e97663fee5f51db
Stack:
:black_circle: #14793 Move WrapDimMinimal to c10 💚
:white_circle: #14794 Fix include path for WrapDimMinimal.h 💚
:white_circle: #14795 Move TensorImpl to c10 (yay!) 💚
:white_circle: #14816 Fix include paths for TensorImpl.h 💛
:white_circle: #14817 Move UndefinedTensorImpl to c10 (meh) 💛
:white_circle: #14818 Fix include paths for UndefinedTensorImpl.h 💛
:white_circle: #14819 Implement c10::Tensor 💛
:white_circle: #14820 Convert caffe2/aten Tensors to/from c10 💛
Differential Revision: D13336841