
LearnNSE sbkt close to zero #124

Closed
paulorla opened this issue Jan 23, 2018 · 1 comment

Comments

@paulorla

Dear @ALL,

In the LearnNSE implementation shipped with MOA 2017.06, the sbkt variable can get very close to zero. When that happens, 1.0 / sbkt blows up toward infinity, and the ensemble weight is computed as the log of an effectively infinite value.

this.ensembleWeights.add(Math.log(1.0 / sbkt));

This led to problems on the Gaussian benchmark suggested by the original author of Learn++.NSE: http://users.rowan.edu/~polikar/research/NSE/
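To illustrate the failure mode (a standalone sketch, not MOA code): as sbkt approaches zero, the weight `Math.log(1.0 / sbkt)` grows without bound, and at exactly zero it becomes infinite, which then corrupts the ensemble's weighted vote.

```java
public class SbktDemo {
    public static void main(String[] args) {
        // A healthy value gives a moderate weight.
        System.out.println(Math.log(1.0 / 0.5));

        // A near-zero sbkt gives a huge weight that dominates the vote.
        System.out.println(Math.log(1.0 / 1e-300));

        // At exactly zero, 1.0 / 0.0 is +Infinity and so is the log.
        System.out.println(Math.log(1.0 / 0.0)); // Infinity
    }
}
```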

As a workaround, one of the original authors of Learn++.NSE checks whether sbkt is smaller than 0.01; if so, the value is set to 0.01.
This can be seen in: https://github.com/gditzler/IncrementalLearning/blob/master/src/learn_nse.m

Check the condition:

if net.beta(net.t, net.t) < net.threshold
    net.beta(net.t, net.t) = net.threshold;
end

It seems to solve the problem when implemented in the MOA version of the LearnNSE.
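Ported to Java, the fix amounts to clamping sbkt to the threshold before taking the log. The class and method names below are hypothetical; only the 0.01 threshold and the `Math.log(1.0 / sbkt)` expression come from the sources above.

```java
// Hypothetical sketch of the workaround for MOA's LearnNSE:
// clamp sbkt to a small floor before computing the ensemble weight,
// mirroring net.threshold = 0.01 in the reference learn_nse.m.
public class LearnNseWeightFix {
    static final double THRESHOLD = 0.01; // value used in learn_nse.m

    static double ensembleWeight(double sbkt) {
        if (sbkt < THRESHOLD) {
            sbkt = THRESHOLD; // avoid log(1/0) = Infinity
        }
        return Math.log(1.0 / sbkt);
    }

    public static void main(String[] args) {
        // Near-zero sbkt is now capped at log(1/0.01) = log(100).
        System.out.println(ensembleWeight(1e-12));
        // Values above the threshold are unaffected.
        System.out.println(ensembleWeight(0.5));
    }
}
```

With this guard the weight is bounded above by log(100) ≈ 4.6, so a single near-perfect classifier can no longer dominate the ensemble with an unbounded weight.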

Best regards.

@abifet
Collaborator

abifet commented Mar 3, 2018

Thanks! Can you propose a pull request with this fix?

@abifet abifet closed this as completed Mar 9, 2021