
Conversation

@yaochengji (Collaborator) commented Dec 17, 2024

Fixes #8496.

The test is placed in a new class because the AMP implementation of batch_norm differs from the other ops (XLA requires all input dtypes of batch_norm to be the same).
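As a rough illustration of the constraint (this is a hypothetical sketch, not the PR's actual code; the helper name and dtype choices are assumptions), XLA's requirement can be satisfied by casting every batch_norm operand to a single dtype before the call:

```python
import torch
import torch.nn.functional as F

def batch_norm_same_dtype(x, weight, bias, running_mean, running_var,
                          dtype=torch.bfloat16):
    """Hypothetical helper: cast every batch_norm operand to one dtype,
    mirroring XLA's requirement that all inputs share a dtype."""
    x, w, b, rm, rv = (t.to(dtype) for t in
                       (x, weight, bias, running_mean, running_var))
    # Inference-mode batch norm; all operands now share `dtype`.
    return F.batch_norm(x, rm, rv, w, b, training=False, eps=1e-5)
```

Under autocast, other ops can mix fp32 and bf16 operands freely, which is why batch_norm's cast policy ends up different from the rest and gets its own test class.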

@qihqi qihqi self-requested a review January 6, 2025 23:39
@yaochengji yaochengji merged commit 1cdb1ef into pytorch:master Jan 8, 2025
12 checks passed
@yaochengji (Collaborator, Author) commented:

@qihqi The tpu-ci label was not added to this PR, and the current CI keeps failing because of this.

Should we revert this PR and fix the tpu-ci issue later?

tengyifei added a commit that referenced this pull request Jan 9, 2025
qihqi pushed a commit that referenced this pull request Jan 16, 2025
Co-authored-by: Chengji Yao <chengji.yao@bytedance.com>


Linked issue: AMP BF16 issue with batch norm layer