
Valid and Same padding option in convolution layers. #1988

Merged: 11 commits merged into mlpack:master on Nov 14, 2019

Conversation

walragatver (Contributor) commented Aug 26, 2019:

Hi,
I have modified the convolutional layer, but I have a doubt: should we take all four padding dimensions as arguments in convolution.hpp? Currently we only take two dimensions as arguments in the constructor, and I have kept it that way for now. I think taking all four padding dimensions would be good, but it may break backward compatibility. Let me know what you think.

@@ -74,7 +75,8 @@ class Convolution
     const size_t padW = 0,
     const size_t padH = 0,
     const size_t inputWidth = 0,
-    const size_t inputHeight = 0);
+    const size_t inputHeight = 0,
+    const std::string paddingType = "None");
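The backward-compatibility concern can be illustrated with a standalone sketch (hypothetical names, not mlpack's actual class): because the new paddingType parameter is added at the end with a default value, call sites written against the old signature still compile unchanged.

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// Hypothetical class mirroring the diff above: the new trailing
// paddingType parameter defaults to "None", so existing two-argument
// calls keep working exactly as before.
class Conv
{
 public:
  Conv(const size_t padW = 0,
       const size_t padH = 0,
       const std::string paddingType = "None") :
      padW(padW), padH(padH), paddingType(paddingType) { }

  size_t padW, padH;
  std::string paddingType;
};
```

An old-style call such as `Conv oldStyle(1, 1);` still compiles and sees `paddingType == "None"`, while new code can write `Conv newStyle(0, 0, "same");`.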

zoq (Member) commented Aug 26, 2019:

I'm wondering if implementing another template argument (policy) would be more consistent with the existing mlpack code, let me know what you think. I don't have a preference here.

walragatver (Author, Contributor) commented Aug 28, 2019:

It looks like we cannot use string literals as template arguments. Should I create new classes for the policies, like we did for GAN with the StandardGan and DCGAN classes?

walragatver (Author, Contributor) commented Aug 29, 2019:

Currently I am keeping it the same. It looks like I will need to create classes in order to use this as a template argument. Let me know if that's incorrect and I will change it.

zoq (Member) commented Aug 30, 2019:

I think a string is fine, and even closer to tf/keras.

 {
-  inputPaddedTemp.set_size(inputTemp.n_rows + padW * 2,
-      inputTemp.n_cols + padH * 2, inputTemp.n_slices);
+  inputPaddedTemp.set_size(inputTemp.n_rows + padWLeft + padWRight,

zoq (Member) commented Aug 26, 2019:

It would be great if we could add a test as well.

walragatver (Author, Contributor) commented Aug 31, 2019:

Actually, I am doing this for the Inception layer, so I think we will find out whether it is wrong or not. Logically it seems correct. Let me know what you think.

zoq (Member) commented Sep 5, 2019:

Sorry for the slow response. Ideally we can write an independent test for the adjusted conv operation, similar to BOOST_AUTO_TEST_CASE(ValidConvolution2DTest).
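For context, such an independent check could look like the following standalone sketch (plain C++ with asserts rather than mlpack's Boost.Test harness, and hypothetical names): a naive "valid" 2D convolution, implemented as cross-correlation the way deep-learning libraries typically do, whose output is (W - F + 1) x (H - F + 1).

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Naive "valid" 2D convolution (cross-correlation) over row-major data.
// No padding is applied, so the output shrinks by kernelSize - 1 in
// each dimension.
std::vector<double> ValidConv2D(const std::vector<double>& in,
                                size_t inW, size_t inH,
                                const std::vector<double>& k,
                                size_t kW, size_t kH)
{
  const size_t outW = inW - kW + 1;
  const size_t outH = inH - kH + 1;
  std::vector<double> out(outW * outH, 0.0);
  for (size_t y = 0; y < outH; ++y)
    for (size_t x = 0; x < outW; ++x)
      for (size_t j = 0; j < kH; ++j)
        for (size_t i = 0; i < kW; ++i)
          out[y * outW + x] += in[(y + j) * inW + (x + i)] * k[j * kW + i];
  return out;
}
```

For example, a 3x3 input of ones convolved with a 2x2 kernel of ones yields a 2x2 output in which every entry is 4.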

@zoq zoq removed the s: unanswered label Aug 26, 2019
zoq (Member) commented Aug 26, 2019:

We could implement another constructor that takes the other dimensions as well; that way we could stay backward compatible.

walragatver (Contributor, Author) commented Sep 24, 2019:

Hi @zoq, I will add a test soon. Sorry for the delay.

@walragatver walragatver force-pushed the walragatver:valid_same branch from cc7e7bc to 9c83152 Sep 26, 2019
walragatver (Contributor, Author) commented Sep 26, 2019:

Hi @zoq,
I have added the test. I am thinking of working on transposed convolution in a separate PR. I have added the tests in the ann_layer_test.cpp file; let me know if that's not the correct place.

zoq (Member) commented Sep 26, 2019:

Okay, will take a look, thanks.

@walragatver walragatver requested a review from zoq Oct 1, 2019
walragatver (Contributor, Author) commented Oct 4, 2019:

Also, can we add this to the next milestone? As @sreenikSS pointed out, it's a necessity. I think only a review is required for this as well.

@rcurtin rcurtin added this to the mlpack 3.2.2 milestone Oct 5, 2019
@ShikharJ ShikharJ added this to To Review in Shikhar's Board Oct 13, 2019
zoq (Member) approved these changes Oct 29, 2019 and left a comment:

Sorry for the slow response. The implementation looks good to me; we could add one or two more tests, but if you would like to test this in combination with the Inception layer, that is fine with me as well.

>::InitializeSamePadding()
{
  /*
   * Using O = (W - F + 2P) / s + 1;

zoq (Member) commented Oct 29, 2019:

Great, that is really helpful.
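As a sanity check on that formula (a standalone sketch with hypothetical names, not mlpack's actual InitializeSamePadding): with O = (W - F + 2P) / s + 1, requiring O == W at stride 1 gives a total padding of F - 1, which is exactly the "same" padding condition; an odd total leaves one extra cell on one side, matching the padWLeft/padWRight split in the diff above.

```cpp
#include <cassert>
#include <cstddef>

// O = (W - F + 2P) / s + 1: the output-size formula from the comment.
size_t ConvOutputSize(const size_t W, const size_t F,
                      const size_t P, const size_t s)
{
  return (W - F + 2 * P) / s + 1;
}

// "Same" padding for stride 1: solving O == W for the total padding
// gives 2P = F - 1; an odd total puts the extra cell on the right.
void SamePadding(const size_t F, size_t& padLeft, size_t& padRight)
{
  const size_t total = F - 1;
  padLeft = total / 2;
  padRight = total - padLeft;
}
```

For instance, ConvOutputSize(7, 3, 0, 1) is 5 ("valid" shrinks the input), while ConvOutputSize(7, 3, 1, 1) is 7 ("same" preserves it).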

  input = arma::linspace<arma::colvec>(0, 48, 49);
  module1.Parameters() = arma::mat(9 + 1, 1, arma::fill::zeros);
  module1.Reset();
  module1.Forward(std::move(input), std::move(output));

zoq (Member) commented Oct 29, 2019:

It would be a good idea to check the output as well, but if you are going to test this in combination with the Inception layer, this is fine as well.

walragatver (Author, Contributor) commented Oct 30, 2019:

Yes, you are correct. I am now checking that the sum of all values of the output is zero. Let me know if anything else needs to be added.

mlpack-bot approved these changes Oct 30, 2019 and left a comment:

Second approval provided automatically after 24 hours. 👍

@walragatver walragatver force-pushed the walragatver:valid_same branch from 9c83152 to f9cfaa1 Oct 30, 2019
@walragatver walragatver force-pushed the walragatver:valid_same branch from f9cfaa1 to c9df628 Nov 13, 2019
walragatver (Contributor, Author) commented Nov 13, 2019:

Hi @zoq,
It looks like the build is failing because of this. The build was passing on my machine because I was using C++14. What should we do now? Can we upgrade the C++ version used for building? I am not sure about what I am suggesting; let me know what you think.
https://stackoverflow.com/questions/32084706/why-initialization-doesnt-work-for-tuple

rcurtin (Member) commented Nov 14, 2019:

@walragatver it's a little bit uglier, but you could do std::tuple<size_t, size_t>(1, 1) and that should work. 👍 Given how old some of the compilers are for still-in-use distros like RHEL6, I don't think that we should change to require C++14 quite yet.
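The underlying issue, per the linked Stack Overflow question: in C++11, std::tuple's element-wise constructor is explicit, so copy-list-initialization such as `return {1, 1};` does not compile; later standard/library revisions made it conditionally explicit, which is why the code built under C++14. The suggested workaround names the constructor and is valid C++11, as this small sketch (hypothetical function name) shows:

```cpp
#include <cassert>
#include <cstddef>
#include <tuple>

// Under C++11, `return {1, 1};` fails here because std::tuple's
// element-wise constructor is explicit. Naming the constructor,
// as suggested above, compiles under C++11 as well.
std::tuple<size_t, size_t> Padding()
{
  // return {1, 1};                         // error under C++11
  return std::tuple<size_t, size_t>(1, 1);  // fine under C++11
}
```

Callers then unpack the result with std::get, e.g. `std::get<0>(Padding())`.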

@walragatver walragatver force-pushed the walragatver:valid_same branch from c9df628 to 12db67f Nov 14, 2019
walragatver added 2 commits Nov 14, 2019
@walragatver walragatver merged commit d041945 into mlpack:master Nov 14, 2019
5 of 6 checks passed
continuous-integration/appveyor/pr: AppVeyor build failed
LaTeX Documentation Checks: Build finished.
Memory Checks: Build finished.
Static Code Analysis Checks: Build finished.
Style Checks: Build finished.
continuous-integration/travis-ci/pr: The Travis CI build passed
@zoq zoq removed the s: needs review label Nov 14, 2019
@walragatver walragatver deleted the walragatver:valid_same branch Nov 26, 2019
3 participants