
cross_val changes #29 (Merged)

merged 2 commits from cross_val into master, Jul 8, 2017

Conversation

@rgupta90 (Collaborator) commented Jul 8, 2017

I am not sure whether commenting out the previous code or removing it is the better approach, so I have left it commented out for now.
I also kept self._training_features and self._training_labels assigned, since they are used by functions in other Python files that are called from here.

@coveralls

Coverage Status

Coverage increased (+2.08%) to 75.179% when pulling 75ccde7 on cross_val into 7ba5f63 on master.

@lacava (Owner) left a comment

I think we can get rid of self._training_features. It is a legacy holdover and isn't used anymore.

few/few.py Outdated
@@ -213,7 +215,7 @@ def fit(self, features, labels):
     initial_estimator = copy.deepcopy(self.ml.fit(x_t,y_t))
     # self._best_estimator = copy.deepcopy(self.ml.fit(x_t,y_t))

-    self._best_score = self.ml.score(x_v,y_v)
+    self._best_score = np.amax(cross_val_score(self.ml,x_t,y_t,cv=3))
@lacava (Owner): use mean

@rgupta90 (Collaborator Author): will do
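The reviewer's point, sketched on toy fold scores (the values below are made up for illustration, not from the PR): np.amax reports only the single luckiest fold, which overstates performance, while np.mean gives the conventional cross-validation estimate.

```python
import numpy as np

# Hypothetical scores from 3-fold cross-validation (illustrative only)
fold_scores = np.array([0.70, 0.75, 0.95])

# np.amax keeps just the best fold, an optimistically biased estimate
best_fold = np.amax(fold_scores)   # 0.95

# np.mean averages across folds, the standard CV performance estimate
mean_score = np.mean(fold_scores)  # 0.80
```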

few/few.py Outdated
#tmp_score = self.ml.score(self.transform(
# x_v,self.pop.individuals)[:,self.valid_loc()],
# y_v)
tmp_score = np.amax(cross_val_score(self.ml,x_t,y_t,cv=3))
@lacava (Owner): use mean

@rgupta90 (Collaborator Author): will do

few/few.py Outdated
@@ -213,7 +215,7 @@ def fit(self, features, labels):
     initial_estimator = copy.deepcopy(self.ml.fit(x_t,y_t))
     # self._best_estimator = copy.deepcopy(self.ml.fit(x_t,y_t))

-    self._best_score = self.ml.score(x_v,y_v)
+    self._best_score = np.amax(cross_val_score(self.ml,x_t,y_t,cv=3))
@lacava (Owner):

I don't think you need to have cv=3 in there, that is the default

@rgupta90 (Collaborator Author):
yeah, realized that later..will do
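A minimal sketch of the call after both review suggestions (mean instead of amax, cv left at the default). The estimator and dataset below are stand-ins, not the PR's actual self.ml and x_t/y_t:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Stand-in estimator and data for illustration
X, y = load_iris(return_X_y=True)
ml = LogisticRegression(max_iter=1000)

# Omitting cv= falls back to scikit-learn's default fold count
# (3-fold in the 2017 releases this PR targeted, 5-fold since 0.22),
# so passing cv=3 explicitly was redundant at the time
score = np.mean(cross_val_score(ml, X, y))
```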

@rgupta90 (Collaborator Author) commented Jul 8, 2017

self._training_features is used in population.py in line 235 which is called in few.py line 50

few.py line 50
self.pop = self.init_pop(x_t.shape[0])

population.py line 235
for i,p in it.zip_longest(
range(self._training_features.shape[1]),
pop.individuals,fillvalue=None):
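A self-contained sketch of the zip_longest pattern quoted above, with toy stand-ins for the feature columns and pop.individuals:

```python
import itertools as it

# Toy stand-ins: four feature columns but only two individuals
n_feature_columns = 4
individuals = ['ind_a', 'ind_b']

# fillvalue=None pads the shorter sequence, so every feature column
# still gets a slot even when there are fewer individuals than columns
pairs = list(it.zip_longest(range(n_feature_columns), individuals,
                            fillvalue=None))
# pairs == [(0, 'ind_a'), (1, 'ind_b'), (2, None), (3, None)]
```

This is why _training_features must stay assigned: the loop's length is driven by its column count.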

@coveralls

Coverage Status

Coverage increased (+2.08%) to 75.179% when pulling 39e9323 on cross_val into 7ba5f63 on master.

@lacava (Owner) commented Jul 8, 2017

OK, let's leave _training_features in for now then.

@lacava lacava merged commit 28bed90 into master Jul 8, 2017
@lacava lacava deleted the cross_val branch October 6, 2017 18:37