Regarding the log prior (phi_0 in the paper) #7
Good catch. Some additional background (some lines are removed for easier reading; line numbers from BoostSRL's current master branch are marked in the comments):
public class ConditionalModelPerPredicate implements Serializable {
/**
* Prior log probability i.e. \psi_0
*/
private double log_prior = -1.8; // Line 49
public void setLog_prior(double logPrior) { // Line 589
log_prior = logPrior;
}

Setting the log prior to 0 in this repository may be left over from when regression was implemented. The BoostSRL codebase sets the log prior to 0 when a regression model is learned:
if (cmdArgs.isLearnRegression()) {
rdn.setLog_prior(0); // 171
}
None of the unit tests in this repository check this value. For regression we may want to do something similar to the Java codebase where the prior can be adjusted, maybe as a function in

- __logPrior__ = log(0.5/float(1-0.5))
+ __logPrior__ = -1.8
+
+ def setLogPrior(prior):
+     __logPrior__ = prior

I looked back through the original RDN-Boost paper but didn't find this specific value referenced there. @boost-starai or @gkunapuli may know where this originated from.
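As a minimal sketch of that suggestion (assuming __logPrior__ is a module-level variable, as the diff above implies), the setter would also need a global statement so the assignment rebinds the module-level name rather than creating a function-local one:

# Module-level default; -1.8 matches BoostSRL's psi_0,
# while log(0.5/float(1-0.5)) gives 0.
__logPrior__ = -1.8

def setLogPrior(prior):
    # Rebind the module-level __logPrior__; without 'global' this
    # assignment would only create a local variable inside the function.
    global __logPrior__
    __logPrior__ = prior

Regression learning could then call setLogPrior(0), mirroring rdn.setLog_prior(0) in BoostSRL.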
-1.8 is the log prior for 0.5,0.5.
About the function performInference in boosting.py, during inference:

logPrior = log(0.5/float(1-0.5)) = 0

However, in the Java version this is -1.8 (see the quick check below).

About the functions setPos and setNeg in util.py, during learning:

However, in the Java version:
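For reference, under the usual log-odds mapping psi_0 = log(p / (1 - p)), the two defaults imply different prior probabilities. A quick illustrative check (the helper names here are hypothetical, not from either codebase):

from math import exp, log

def log_odds(p):
    # psi_0 = log(p / (1 - p))
    return log(p / (1.0 - p))

def sigmoid(psi):
    # Inverse mapping: the prior probability implied by a log prior.
    return exp(psi) / (1.0 + exp(psi))

print(log_odds(0.5))   # 0.0    -- the value computed in boosting.py
print(sigmoid(-1.8))   # ~0.142 -- the probability implied by -1.8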