Add mutator and accessor functions for some layers #2121

Merged: 4 commits merged into mlpack:master on Dec 24, 2019

Conversation

@sreenikSS (Contributor) opened this pull request:

  1. Add accessor and mutator functions for the padding dimensions of the Convolution layer.
  2. Add accessor functions for obtaining the input and output dimensions of the LinearNoBias layer.

Both follow the usual mlpack accessor/mutator convention; see the sketch below.
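For context, mlpack exposes layer hyperparameters through paired methods: a const getter plus a non-const overload returning a mutable reference. The following is a minimal, self-contained sketch of that pattern applied to a padding dimension; the class and member names here are illustrative stand-ins, not the exact ones added by this PR.

```cpp
#include <cstddef>
#include <iostream>

// Toy stand-in for a Convolution-style layer; not the actual mlpack class.
class ToyConvolution
{
 public:
  //! Get the padding width.
  std::size_t const& PadW() const { return padW; }
  //! Modify the padding width.
  std::size_t& PadW() { return padW; }

 private:
  //! Locally-stored padding width.
  std::size_t padW = 0;
};

int main()
{
  ToyConvolution conv;
  conv.PadW() = 2;                  // mutator: assign through the reference
  std::cout << conv.PadW() << "\n"; // accessor: read the value back
}
```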

@zoq (Member) left a comment:

Looks good to me.

```diff
@@ -205,7 +205,7 @@ class AtrousConvolution
   //! Modify the output height.
   size_t& OutputHeight() { return outputHeight; }

-  //! Get the input size
+  //! Get the input size.
```
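The convention on display in this diff: the non-const `OutputHeight()` overload returns a `size_t&`, so a caller writes through the same name it reads from, e.g. `layer.OutputHeight() = 32;` (`layer` being a hypothetical instance of this class). The toy sketch above follows the same pattern.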
A project Member commented on this line:

Nice catch!

@sreenikSS (Contributor, Author) replied:

Haha, this was pointed out by @walragatver in a previous discussion.

@sreenikSS (Contributor, Author) commented:

@zoq thanks for approving.

I'd appreciate an opinion on something I've been thinking about: I might need a few more changes in the mlpack code to make it fully compatible with the model converter.

In that case I would have to open another PR (leaving multiple closed PRs doing parts of the same work). Alternatively, would you suggest that I keep the changes on my branch, `make install` that branch into a local directory, point my include path at that directory while I work on the model converter, and then, once no more changes are required, push all of them to this one PR?

To summarize, is opening multiple PRs doing small parts of one big chunk of work considered good practice?

@mlpack-bot (bot) left a comment:

Second approval provided automatically after 24 hours. 👍

@rcurtin (Member) commented on Dec 24, 2019:

> To summarize, is opening multiple PRs doing small parts of one big chunk of work considered good practice?

Up to you: it makes the work easier to review and merge, but it's also a little more overhead for you. Personally, I'm fine with either approach; I'm not sure whether others have different opinions.

@rcurtin (Member) left a comment:

Looks good to me also. 👍

@rcurtin merged commit fa8463f into mlpack:master on Dec 24, 2019.
@zoq (Member) commented on Dec 24, 2019:

Agreed, it makes the review process easier. If you are going for multiple PRs, it would be great if you could lay out the things you would like to change, so that we can get an overview.
