Simplify sampling when independent variables are not changed #73

Open
jong42 opened this issue Sep 25, 2019 · 1 comment
@jong42
Collaborator

jong42 commented Sep 25, 2019

Previously it was possible to set an arbitrary number of posterior samples, so drawing without replacement made sense whenever the number of posterior samples was higher or lower than the number of training data points. However, an error was later discovered that could appear in certain models when the sizes of the training data and the posterior samples differed (see commit ed00453). Since then, the number of posterior samples has been fixed to the size of the training data, which makes some of the previous sampling strategies redundant.
If we no longer change the independent variables, the whole workaround with the shared variables could be dropped.
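
To make the redundancy concrete, here is a minimal sketch (not the project's actual code; the function name and signature are hypothetical) of the sampling strategies described above. When the pool of posterior samples could be larger or smaller than the training set, rows had to be drawn with or without replacement to align the sizes; with the sample count fixed to the training size, both branches collapse and no resampling is needed:

```python
import numpy as np

def align_posterior_samples(posterior, n_train, rng=None):
    """Illustrative sketch: align a pool of posterior samples with n_train.

    If the pool is larger than n_train, subsample without replacement;
    if it is smaller, draw with replacement. When the pool size equals
    n_train (the now-guaranteed case), the samples are returned as-is.
    """
    rng = np.random.default_rng(rng)
    n_post = posterior.shape[0]
    if n_post == n_train:
        # The only case that occurs once samples are fixed to the
        # training size: no resampling strategy is needed at all.
        return posterior
    # Draw with replacement only when the pool is too small to
    # supply n_train distinct rows.
    replace = n_post < n_train
    idx = rng.choice(n_post, size=n_train, replace=replace)
    return posterior[idx]
```

Under the new fixed-size regime only the first branch is ever taken, which is why the remaining strategies (and, if the independent variables are never swapped, the shared-variable workaround) become dead weight.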

@jong42 jong42 added the PPL label Sep 25, 2019
@jong42 jong42 self-assigned this Sep 25, 2019
@nandaloo
Member

I understand that the sampling could now be simplified, but I see a serious problem with limiting the number of posterior samples so drastically.
The quality of almost every query against the posterior distribution depends on the number of posterior samples, since they are the only way of accessing the distribution. We really should find a way to allow arbitrarily many posterior samples.
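
The dependence of query quality on the sample count can be illustrated with a small, hypothetical Monte Carlo example (not tied to any model in this project): estimating a tail probability from posterior samples. The standard error of such an estimate shrinks roughly like 1/sqrt(n), so capping the number of posterior samples at the training-set size also caps the achievable accuracy of every downstream query:

```python
import numpy as np

# Hypothetical posterior: a standard normal. We query P(X > 1) with
# increasing numbers of posterior samples; the estimation error is
# governed by the Monte Carlo standard error ~ 1/sqrt(n).
rng = np.random.default_rng(0)
true_p = 0.15866  # exact P(X > 1) for N(0, 1), to 5 decimal places

for n in (100, 10_000, 1_000_000):
    samples = rng.standard_normal(n)
    est = np.mean(samples > 1.0)
    print(f"n={n:>9}: estimate={est:.4f}, |error|={abs(est - true_p):.4f}")
```

With n fixed to a small training set, the error floor of the first line is the best any query can do, which is the concern raised above.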
