Conversation

@guillaumekln
Contributor

Fixes #781.

@guillaumekln guillaumekln changed the title Update expected attention mean following precision changes in sigmoid [WIP] Update expected attention mean following precision changes in sigmoid Dec 20, 2019
@guillaumekln
Contributor Author

guillaumekln commented Dec 20, 2019

I was focused on the GPU tests, but the expected mean on CPU did not change. I'm looking into how to test this effectively on both CPU and GPU.

@qlzh727
Copy link
Member

qlzh727 commented Jan 6, 2020

Thanks for taking care of this issue while I was away.

If the current issue is that GPU and CPU have different precision, you can probably split the test with:

if test_utils.is_gpu_available():
    ...
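The split suggested above could be sketched as follows. This is a minimal, hedged illustration of the pattern, not the actual test from the repository: `is_gpu_available` is stubbed here so the sketch is self-contained (a real test would call the framework's check, e.g. `tf.test.is_gpu_available()` or the project's `test_utils.is_gpu_available()`), and the expected values are hypothetical placeholders, not the real attention means.

```python
def is_gpu_available():
    # Stub for illustration only. A real test would query the framework,
    # e.g. bool(tf.config.list_physical_devices("GPU")).
    return False


def expected_attention_mean():
    # CPU and GPU kernels can round differently (the precision issue
    # discussed in this PR), so the test keeps one expected constant
    # per device. Both values below are hypothetical placeholders.
    if is_gpu_available():
        return 0.124  # hypothetical GPU expectation
    return 0.125      # hypothetical CPU expectation


# The assertion then compares against the device-appropriate constant.
assert abs(expected_attention_mean() - 0.125) < 1e-6
```

The design point is simply that the tolerance stays tight while the expected constant varies by device, rather than loosening the tolerance enough to cover both kernels.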

@guillaumekln
Contributor Author

See #1085.

@guillaumekln guillaumekln deleted the update-expected-mean-after-precision-changes branch June 9, 2020 08:16

Successfully merging this pull request may close these issues.

AttentionWrapperTest results failing on nightlies

3 participants