Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Fixing flaky test #1720

Open
wants to merge 2 commits into
base: develop
Choose a base branch
from
Open

Fixing flaky test #1720

wants to merge 2 commits into from

Conversation

crawlingcub
Copy link

This is a fix for the issue mentioned in #1708. It seems the -infs in the output were caused due to over-approximation for the bernoulli distribution. If I understand correctly, the num_gauss_hermite_points controls the level of approximation using the gaussian quadrature. If this number is high, some of the values during the computation of densities in variational_expectations is approximated as 1.0 (p). Hence, when you compute the log density for bernoulli here, it results in log(1-p) = log(1-1) = -inf.

I don't think this is a bug, but just a result of over-approximation as specified in the test. I re-run the test several times using 20 and it always passes. Please let me know if this seems reasonable or if you have any other thoughts/suggestions!

Thanks!

@codecov
Copy link

codecov bot commented Sep 28, 2021

Codecov Report

Merging #1720 (23b5ae2) into develop (ed6acf5) will decrease coverage by 0.09%.
The diff coverage is n/a.

Impacted file tree graph

@@             Coverage Diff             @@
##           develop    #1720      +/-   ##
===========================================
- Coverage    97.02%   96.93%   -0.10%     
===========================================
  Files           92       92              
  Lines         4407     4407              
===========================================
- Hits          4276     4272       -4     
- Misses         131      135       +4     
Impacted Files Coverage Δ
gpflow/likelihoods/base.py 95.29% <0.00%> (-2.36%) ⬇️

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update ed6acf5...23b5ae2. Read the comment docs.

@crawlingcub
Copy link
Author

It seems reducing the num_gauss_hermite_points may sometimes cause the assertions to fail. I have updated the tolerances in the assertions to account for this behavior.

@crawlingcub
Copy link
Author

Adding @st--
please let us know if these changes look ok. Here I am assuming that getting -inf when computing log densities with high num_gauss_hermite_points is due to over-approximations as some values are clamped at 1.0 (=> log (1-p) = -inf). Let me know if that makes sense.

@st--
Copy link
Member

st-- commented Oct 26, 2021

Hi @crawlingcub, thanks for looking into this (and bearing with slow replies!).:)

I'm a bit hesitant to merge changes that reduce the tolerances. Would it not be better to have a tighter tolerance, and a ~5% stochastic failure rate (and fixing the seed so it lands in the other 95%)? Willing to be convinced otherwise (but you still need to convince me why this change is important).

Are there alternative ways we could fix this? E.g. changing the tf.exp in line 54 to a softplus (log1pexp)?

@st-- st-- added the bugfix label Oct 26, 2021
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
Projects
None yet
Development

Successfully merging this pull request may close these issues.

None yet

2 participants