
Question about GMM::probability_of #3

Open
eroehrig opened this issue May 9, 2018 · 3 comments

Comments

eroehrig commented May 9, 2018

fast-gmm/src/gmm.cc

Lines 232 to 235 in 5b6940d

real_t GMM::probability_of(std::vector<real_t> &x) {
real_t prob = 0;
for (int i = 0; i < nr_mixtures; i ++) {
prob *= weights[i] * gaussians[i]->probability_of(x);

It should be prob += weights[i] * gaussians[i]->probability_of(x), shouldn't it?

zxytim (Owner) commented May 10, 2018

No.
Probabilities are multiplicative, while log-probabilities are additive.
See

prob += log_probability_of(x);

eroehrig (Author) commented

I think the probabilities of the different Gaussians are additive (see Wikipedia).
The example you gave is the sum of log-probabilities over more than one datapoint. That refers to

fast-gmm/src/gmm.cc

Lines 215 to 220 in 5b6940d

real_t GMM::log_probability_of(std::vector<real_t> &x) {
real_t prob = 0;
for (int i = 0; i < nr_mixtures; i ++) {
prob += weights[i] * gaussians[i]->probability_of(x);
}
return log(prob);

where it is done correctly, because the probabilities are summed up before the log operation. GMM::probability_of should be identical except for the final log.

AlexanderFabisch commented

A Gaussian mixture model is a weighted sum of Gaussians. I would say + is correct here.
