
Support Half type in randperm. #22102

Closed · wants to merge 15 commits

Conversation

@xuhdev (Collaborator) commented Jun 22, 2019

Stack from ghstack:

Previously, randperm supported the Half type only on CUDA. This commit
adds Half support to the CPU version. A precision check for floating-point
types is also added to ensure that the generated integers can be exactly
represented.

Differential Revision: D16153586
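
The precision check exists because a binary floating-point type can represent
every integer only up to 2**p, where p is its significand length (p = 11 for
IEEE half, so 2048; p = 24 for float, so 16777216). A minimal sketch of the
idea in Python: `max_exact_int` is an illustrative helper, not part of the
PyTorch API, and the exact limit and error message of PyTorch's internal
check may differ.

```python
import math
import torch

def max_exact_int(dtype):
    # Hypothetical helper: the largest n such that every integer in
    # [0, n] is exactly representable in `dtype`. For a binary float
    # with p significand digits, eps == 2**(1 - p), and all integers
    # up to 2**p are exactly representable.
    p = 1 - int(math.log2(torch.finfo(dtype).eps))
    return 2 ** p

print(max_exact_int(torch.half))   # 2048      (p = 11)
print(max_exact_int(torch.float))  # 16777216  (p = 24)

# With this PR, Half works on CPU while n stays in the exact range:
perm = torch.randperm(100, dtype=torch.half)
assert sorted(perm.tolist()) == list(range(100))

# Beyond that range, the new check should reject the request rather
# than silently return a "permutation" with collisions:
try:
    torch.randperm(1 << 20, dtype=torch.half)
except RuntimeError as e:
    print("rejected:", e)
```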

@xuhdev (Collaborator, Author) commented Jun 22, 2019

@pytorchbot retest this please

Support Half type in randperm.

gh-metadata: pytorch pytorch 22102 gh/xuhdev/1/head
@xuhdev (Collaborator, Author) commented Jun 22, 2019

@pytorchbot retest this please

@soumith requested a review from @gchanan June 25, 2019 03:40
@soumith added the triaged and module: random labels Jun 25, 2019
Review threads (resolved):
- aten/src/ATen/native/TensorFactories.h (outdated)
- test/test_torch.py (outdated)
- aten/src/ATen/native/TensorFactories.h
@xuhdev requested a review from @gchanan July 1, 2019 22:50
@gchanan (Contributor) commented Jul 2, 2019

CI failures look unrelated.

@zou3519 deleted the gh/xuhdev/1/head branch July 10, 2019 19:26
zdevito pushed a commit to zdevito/ATen that referenced this pull request Jul 10, 2019
Summary: Pull Request resolved: pytorch/pytorch#22102

Test Plan: Imported from OSS

Differential Revision: D16153586

Pulled By: li-roy

fbshipit-source-id: d58e3dbc5da893005f4eaf521a28b0d752274eff
@facebook-github-bot (Contributor) commented:

@li-roy merged this pull request in 0f7c371.

Labels: Merged, module: cuda, module: random, open source, triaged