
Bug fix for pytorch linear non float datatype #35

Closed
wants to merge 1 commit into from

Conversation

nrsatish
Contributor

Summary: Fixes a bug where a dtype object (such as torch.float16) was passed into train(), which expects a string such as "float16".
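The mismatch described above can be sketched as a small normalization helper. This is a hypothetical illustration of the fix, not the actual patch: `dtype_to_str` is an assumed name, and it relies on the fact that a `torch.dtype` stringifies with a `torch.` prefix (e.g. `str(torch.float16) == "torch.float16"`).

```python
def dtype_to_str(dt):
    """Normalize a dtype argument to the short string name train() expects.

    Hypothetical helper: train() expects "float16", but callers may pass
    a torch.dtype such as torch.float16, which stringifies as
    "torch.float16". Strip the module prefix; plain strings pass through.
    """
    if isinstance(dt, str):
        return dt
    # str(torch.float16) == "torch.float16" -> keep only "float16"
    return str(dt).split(".")[-1]
```

With a helper like this, both `train(dt="float16")` and `train(dt=torch.float16)` would resolve to the same string key.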

Reviewed By: jianyuh

Differential Revision: D32437286


fbshipit-source-id: 4e4697188950b09bea90b70b13a482d601763f6c
@facebook-github-bot added the CLA Signed and fb-exported labels on Nov 15, 2021
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D32437286

@facebook-github-bot
Contributor

This pull request has been merged in 595b81c.

2 participants