Adding the Log transformation approach from Osborne et al. 2012 #18

Merged: 15 commits into main from log_correction on Oct 21, 2021

Conversation

@theogf (Owner) commented Mar 2, 2021

Based on the idea that we should approximate log(integrand(x)) instead of integrand(x) directly.
See: Osborne, M. et al., Active Learning of Model Evidence Using Bayesian Quadrature. In Advances in Neural Information Processing Systems, 2012.

This also fixes the issue where the variance was returned as a standard deviation.

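For readers unfamiliar with the approach, here is a minimal, self-contained sketch of the log-transform idea (illustrative only, not code from this PR or the package API): fit a GP to log(integrand(x)) and integrate the exponential of its posterior mean, which keeps the modelled integrand positive. The kernel, lengthscale, grid, and all names below are assumptions made for the example.

```julia
using LinearAlgebra

# Squared-exponential kernel (lengthscale ℓ is an arbitrary choice here).
rbf(x, y; ℓ=1.0) = exp(-abs2(x - y) / (2ℓ^2))

# Toy integrand: an unnormalised Gaussian "likelihood" on [-3, 3].
integrand(x) = exp(-x^2)

# Train a zero-mean GP on log(integrand) rather than on integrand itself.
xtrain = collect(range(-3, 3; length=20))
ytrain = log.(integrand.(xtrain))
K = [rbf(a, b) for a in xtrain, b in xtrain] + 1e-8I   # jitter for stability
α = K \ ytrain
logmean(x) = dot([rbf(x, b) for b in xtrain], α)       # posterior mean of log-integrand

# Exponentiating the posterior mean keeps the approximated integrand positive;
# the full Osborne et al. treatment adds a linearised variance correction,
# which this sketch skips.
xgrid = range(-3, 3; length=500)
Z_log = sum(exp.(logmean.(xgrid))) * step(xgrid)   # simple Riemann sum on a grid
Z_direct = sum(integrand.(xgrid)) * step(xgrid)    # reference value on the same grid
@show Z_log Z_direct   # both should be close to √π ≈ 1.7725
```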
Review comment on src/models.jl (outdated, resolved)
@johannesgiersdorf (Collaborator) commented

@theogf I could add the tests later if you find they make sense (atol=1e-1 is not so precise). Otherwise I think we can merge.

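As a rough, hypothetical illustration of what such a test could look like with the loose atol = 1e-1 mentioned above (reusing the Z_log estimate from the sketch earlier; this is not the test code discussed in the PR):

```julia
using Test

@testset "log-transform quadrature (illustrative)" begin
    # ∫ exp(-x²) dx over ℝ is √π; the tail mass outside [-3, 3] is negligible.
    @test isapprox(Z_log, sqrt(π); atol=1e-1)
end
```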
@theogf (Owner, Author) commented Mar 26, 2021

Actually, I am quite uncertain about this implementation. The paper leaves out a lot of details, and I think I may have overlooked some of them. A group of students was supposed to work on the paper, so I wanted to wait for their code to see whether they did the same thing as I did.

@theogf merged commit 5e0b759 into main on Oct 21, 2021
@theogf deleted the log_correction branch on October 21, 2021 at 11:20