
Initialise "success" variable to fix warnings #840

Merged
1 commit merged into mlpack:master on Feb 9, 2017

Conversation

rishabhkumar296
Contributor

No description provided.

@rishabhkumar296
Contributor Author

Any suggestions on why the Travis CI build fails?

@rcurtin
Member

rcurtin commented Dec 27, 2016

Don't worry about the Travis failure; it is not related to your fix here. If you can fix the issue I pointed out, I am happy to merge this. Please feel free to add your name to the list of contributors in src/mlpack/core.hpp and COPYRIGHT.txt.

@@ -43,8 +43,6 @@ size_t RPlusTreeDescentHeuristic::ChooseDescentNode(TreeType* node,
   node->Child(bestIndex).Bound();
   bound |= node->Dataset().col(point);

-  success = true;

rcurtin (Member) commented on this diff:

The change that you've made means that during this loop, success is not reinitialized to true each iteration. So this introduces a bug.
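
To make the point concrete, here is a minimal, self-contained C++ sketch. It is not the actual mlpack source; the ChooseChild and FitsInChild names and the container types are purely illustrative. It shows why the flag must be reset at the top of each iteration rather than only initialised once at its declaration:

  #include <cstddef>
  #include <vector>

  // Illustrative stand-in for the real per-child bound/overlap check.
  bool FitsInChild(const std::vector<int>& child, const int point)
  {
    for (const int value : child)
      if (value == point)
        return false;
    return true;
  }

  // Pick the first child that can accept the point.
  std::size_t ChooseChild(const std::vector<std::vector<int>>& children,
                          const int point)
  {
    bool success = false;  // Initialised at declaration: silences the warning.
    std::size_t bestIndex = 0;

    for (bestIndex = 0; bestIndex < children.size(); bestIndex++)
    {
      // The flag still has to be reset on every iteration.  If this
      // assignment were deleted, a 'false' left over from a previous child
      // would wrongly disqualify every later child.
      success = true;

      if (!FitsInChild(children[bestIndex], point))
        success = false;

      if (success)
        break;
    }

    return bestIndex;
  }

Keeping the per-iteration assignment and adding an initialiser only at the declaration silences the uninitialised-variable warning without changing behaviour, which appears to be the shape of the change that was eventually merged.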

@rcurtin
Member

rcurtin commented Feb 9, 2017

I think you force-pushed, so I did not see that there was a change. This looks better; thanks for the contribution. If you'd like to add your name to the list of contributors (even though this is a small contribution!) in src/mlpack/core.hpp and COPYRIGHT.txt, I'll merge that in too.

@rcurtin rcurtin merged commit 4b23012 into mlpack:master Feb 9, 2017
@rcurtin rcurtin added this to the mlpack 2.2.0 milestone Mar 17, 2017