irls_poisson_fast.cpp results in slightly different score values compared to the same model computed with glm.

Steps to Reproduce

Compare mycache.mle with modglm in test-build_score_cache_mle.R.

Current Bug Behaviour

> mycache.mle$mlik
[1] -1418.438      -Inf
> logLik(modglm)
'log Lik.' -1410.645 (df=2)

Analogous for the AIC and BIC scores.
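If the -Inf entry stems from the large-eta overflow noted under Possible Solutions, it can be reproduced in isolation. The following standalone C++ sketch is not taken from irls_poisson_fast.cpp; it only assumes the standard Poisson log-likelihood contribution y*eta - exp(eta) - lgamma(y+1):

```cpp
// Standalone illustration, not code from irls_poisson_fast.cpp.
// The canonical Poisson log-likelihood contribution for one observation is
//   y*eta - exp(eta) - lgamma(y + 1)   with eta = log(mu).
// Once eta exceeds ~709, exp(eta) overflows to +inf in double precision and
// the whole term collapses to -inf, which would then propagate into the score.
#include <cmath>
#include <cstdio>

static double poisson_loglik_term(double y, double eta) {
    return y * eta - std::exp(eta) - std::lgamma(y + 1.0);
}

int main() {
    std::printf("eta = 20:  %g\n", poisson_loglik_term(3.0, 20.0));   // large negative but finite
    std::printf("eta = 800: %g\n", poisson_loglik_term(3.0, 800.0));  // -inf: exp(800) overflows
    return 0;
}
```

Any score assembled from such a term (mlik, AIC, BIC) would then inherit the -Inf.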

Expected Behaviour

I'm unsure whether this variation in score values is expected.

Relevant Logs

This was temporarily fixed with an increased tolerance to pass the tests.

Possible Solutions

Double-check the IRLS Poisson fast algorithm. It has been shown that numerical overflow is not handled properly for large values of eta. It is unclear whether eta should ever be that large or whether this was only triggered by a faulty test. If it is the latter, consider catching such cases properly upstream and investigate why glm did not raise a warning.
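As a starting point for the double-check, here is a hedged sketch of one way to keep mu = exp(eta) finite; it is hypothetical, not the current code in irls_poisson_fast.cpp, and names such as safe_mu and ETA_MAX are illustrative only:

```cpp
// Hypothetical guard, not the current code in irls_poisson_fast.cpp:
// cap eta before exponentiating so the IRLS mean mu = exp(eta) stays finite.
#include <algorithm>
#include <cmath>

// exp() overflows a double for arguments above roughly 709.78; stay below that.
constexpr double ETA_MAX = 700.0;

inline double safe_mu(double eta) {
    // Clamp eta into a range where exp() cannot overflow. Emitting a warning
    // (or returning an error code) when clamping triggers would make the
    // problem visible instead of silently producing -Inf scores.
    const double eta_clamped = std::clamp(eta, -ETA_MAX, ETA_MAX);
    return std::exp(eta_clamped);
}
```

Clamping silently changes the fitted values, so raising a warning or error at this point, or catching oversized eta upstream as suggested above, may be preferable.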