Conversation

albahnsen
Contributor

Related to #6
Add verbose in 4 levels:

`verbose=0` (default): prints nothing, as is the case now
`verbose=1`: print the number of the clf that is being fitted
`verbose=2`: print info about the parameters of the clf that is being fitted
`verbose>2`: change the verbose of the underlying clf to `self.verbose - 2`
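The four levels above could be dispatched in the fit loop roughly like this. This is a minimal illustrative sketch with made-up class and attribute names, not the actual mlxtend implementation:

```python
class VerboseEnsembleSketch:
    """Illustrative only: demonstrates the four verbose levels."""

    def __init__(self, clfs, verbose=0):
        self.clfs = clfs
        self.verbose = verbose

    def fit(self, X, y):
        if self.verbose > 0:
            print("Fitting %d classifiers..." % len(self.clfs))
        for i, clf in enumerate(self.clfs, start=1):
            if self.verbose > 0:
                # level 1: report which classifier is being fitted
                print("Fitting clf%d: %s (%d/%d)"
                      % (i, type(clf).__name__.lower(), i, len(self.clfs)))
            if self.verbose > 2 and hasattr(clf, "verbose"):
                # level >2: forward the remaining verbosity to the estimator
                clf.verbose = self.verbose - 2
            if self.verbose > 1:
                # level 2: print the estimator's parameters
                print(clf)
            clf.fit(X, y)
        return self
```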

Example:

>>> import numpy as np
>>> from sklearn.linear_model import LogisticRegression
>>> from sklearn.naive_bayes import GaussianNB
>>> from sklearn.ensemble import RandomForestClassifier
>>> from mlxtend.sklearn import EnsembleClassifier
>>> clf1 = LogisticRegression(random_state=1)
>>> clf2 = RandomForestClassifier(random_state=1)
>>> clf3 = GaussianNB()
>>> X = np.array([[-1, -1], [-2, -1], [-3, -2], [1, 1], [2, 1], [3, 2]])
>>> y = np.array([1, 1, 1, 2, 2, 2])
>>> eclf1 = EnsembleClassifier(clfs=[clf1, clf2, clf3], voting='hard', verbose=3)
>>> eclf1 = eclf1.fit(X, y)
>>> print(eclf1.predict(X))
Fitting 3 classifiers...
Fitting clf1: logisticregression (1/3)
LogisticRegression(C=1.0, class_weight=None, dual=False, fit_intercept=True,
          intercept_scaling=1, max_iter=100, multi_class='ovr',
          penalty='l2', random_state=1, solver='liblinear', tol=0.0001,
          verbose=0)
[LibLinear]iter  1 act 2.675e+00 pre 2.380e+00 delta 7.534e-01 f 4.159e+00 |g| 7.211e+00 CG   1
iter  2 act 2.681e-01 pre 2.350e-01 delta 7.534e-01 f 1.484e+00 |g| 1.577e+00 CG   1
iter  3 act 1.622e-02 pre 1.555e-02 delta 7.534e-01 f 1.216e+00 |g| 3.205e-01 CG   1
iter  4 act 4.876e-04 pre 4.874e-04 delta 7.534e-01 f 1.200e+00 |g| 3.435e-02 CG   2
Fitting clf2: randomforestclassifier (2/3)
RandomForestClassifier(bootstrap=True, class_weight=None, criterion='gini',
            max_depth=None, max_features='auto', max_leaf_nodes=None,
            min_samples_leaf=1, min_samples_split=2,
            min_weight_fraction_leaf=0.0, n_estimators=10, n_jobs=1,
            oob_score=False, random_state=1, verbose=0, warm_start=False)
[Parallel(n_jobs=1)]: Done   1 jobs       | elapsed:    0.0s
[Parallel(n_jobs=1)]: Done  10 out of  10 | elapsed:    0.0s finished
[Parallel(n_jobs=1)]: Done   1 jobs       | elapsed:    0.0s
[Parallel(n_jobs=1)]: Done  10 out of  10 | elapsed:    0.0s finished
Fitting clf3: gaussiannb (3/3)
GaussianNB()
[1 1 1 2 2 2]

@albahnsen
Contributor Author

@rasbt let me know what you think. I realized it was important to have different verbose levels for the ensemble function and the underlying clf. Other than that, it's what we discussed in #6.

@coveralls

Coverage Status

Coverage decreased (-0.29%) to 63.71% when pulling 51e751d on albahnsen:master into d895a04 on rasbt:master.

@rasbt
Owner

rasbt commented May 27, 2015

@albahnsen That looks pretty good, thanks! I think it only needs a tiny fix for the print statement before I can merge it.

@albahnsen
Contributor Author

@rasbt happy to help 😄. Which fix are you referring to?

@rasbt
Owner

rasbt commented May 27, 2015

@albahnsen You can find the error when you click on the "Details" button for the travis-ci check. I think you just forgot the parentheses needed to make it compatible with Python 3:

 File "/home/travis/build/rasbt/mlxtend/mlxtend/sklearn/ensemble.py", line 128
    print _name_estimators((clf,))[0][1]
                      ^
SyntaxError: invalid syntax
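For reference, the fix amounts to turning the Python 2 print statement into a function call. A minimal sketch, where `print_estimator` and the `pair` tuple are placeholders standing in for `_name_estimators((clf,))[0]`, not actual mlxtend code:

```python
from __future__ import print_function  # makes print a function on Python 2 as well


def print_estimator(pair):
    # `pair` mimics the (name, estimator) tuple from _name_estimators((clf,))[0];
    # the fix is simply print(...) with parentheses instead of the
    # Python 2-only statement form `print pair[1]`
    print(pair[1])


print_estimator(("gaussiannb", "GaussianNB()"))
```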

@coveralls

Coverage Status

Coverage decreased (-0.38%) to 63.62% when pulling 6bd0dce on albahnsen:master into d895a04 on rasbt:master.

@rasbt rasbt merged commit 6bd0dce into rasbt:master May 28, 2015
@rasbt
Owner

rasbt commented May 28, 2015

Looks great, thanks a lot! I just made some tiny changes so that the verbose parameter is set on clones of the individual classifiers rather than on the originals.
Also, I swapped the order of

if self.verbose > 1:

and

if self.verbose > 2: 

so that the verbose parameter is shown correctly when printing out the params, e.g.,

>>> eclf1 = EnsembleClassifier(clfs=[clf1, clf2, clf3], voting='hard', verbose=3)
>>> eclf1 = eclf1.fit(X, y)
>>> print(eclf1.predict(X))


Fitting 3 classifiers...
Fitting clf1: logisticregression (1/3)
LogisticRegression(...
      verbose=1)
...
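Both changes can be sketched like this. A minimal illustration only, with `copy.deepcopy` standing in for sklearn's `clone` and `DummyClf` a made-up stand-in classifier; this is not the actual mlxtend code:

```python
import copy


def fit_clone(clf, X, y, verbose=0):
    # fit a copy so the caller's original estimator keeps its own verbose value
    clf = copy.deepcopy(clf)
    # check `verbose > 2` *before* `verbose > 1` so the printed params
    # already show the adjusted value (e.g. verbose=1 when verbose=3)
    if verbose > 2 and hasattr(clf, "verbose"):
        clf.verbose = verbose - 2
    if verbose > 1:
        print(clf)
    return clf.fit(X, y)


class DummyClf:
    """Stand-in classifier for demonstration."""
    verbose = 0

    def fit(self, X, y):
        return self

    def __repr__(self):
        return "DummyClf(verbose=%d)" % self.verbose


original = DummyClf()
fitted = fit_clone(original, [[0], [1]], [0, 1], verbose=3)
```

After the call, `original.verbose` is still 0 while `fitted.verbose` is 1, so the printed repr shows the forwarded value rather than the default.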

@albahnsen
Contributor Author

That's great. Thanks.
