Adding Copy Ctor to ANN #519
Conversation
Ooh, this is a really neat idea; it totally eliminates the function parameter. It would also make serialization easier. The only drawback I see so far is that any layer that currently holds an optimizer has to implement three more functions. In #513, @stereomatchingkiss suggested switching from a reference to pointers to hold the function to be optimized, primarily to support serialization for the existing layers. Since @stereomatchingkiss worked on that problem, I would like to hear his opinion about this.
This idea looks pretty cool to me too; it makes the code more intuitive. Regarding the C++14 features, could we postpone using them? There are a lot of C++14 features that VC2013 does not support, and although VC2015 is out, CUDA does not support VC2015 yet.
Okay, great. I think we should merge this code first and then head over to #516. We will have to slightly modify #516; I can do that if no one else wants to.
The test should build with a C++11 compiler because of the conditional-compilation directives Joseph used. But I agree, we should avoid C++14 features for the moment; we can easily rewrite the test. I can do that once the code is merged.
The reason for using C++14 is automatic deduction of the return type in the unit tests. — Joseph Chakravarti Mariadassou
I agree with @stereomatchingkiss; we should hold off on C++14 support for a few years (probably) because of lacking compiler support and stability. The decltype solution is maybe a bit ugly, though. Maybe there is a better option? (I'm not sure; I haven't thought about it much.)
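For readers following along, here is a minimal sketch of the C++14-vs-C++11 trade-off being discussed. In C++14, a function's return type can be deduced from a plain `auto`; the C++11 "decltype solution" instead spells the type out with a trailing return type. `MakeTestPair` is an illustrative stand-in, not code from this pull request.

```cpp
#include <utility>

// C++14 version (requires -std=c++14 or later):
//   auto MakeTestPair() { return std::make_pair(1, 2.0); }

// C++11 workaround: trailing return type via decltype. This is the
// "decltype solution" referred to above; it repeats the expression,
// which is why it can look a bit ugly.
auto MakeTestPair() -> decltype(std::make_pair(1, 2.0))
{
  return std::make_pair(1, 2.0);
}
```

The decltype form compiles under C++11 at the cost of duplicating the returned expression in the signature.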
We can explicitly define the network type; since we are using a small network for the test, this should be fairly easy. I can do that once we merge this pull request. It looks like the only modification left is updating the RNN structure.
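A hedged sketch of what "explicitly define the network type" could look like: instead of relying on C++14 return-type deduction, name the layer-tuple type up front so everything builds under C++11. The layer types and alias below are illustrative placeholders, not mlpack's actual classes.

```cpp
#include <tuple>

// Placeholder layer types standing in for the real network layers.
struct LinearLayer {};
struct SigmoidLayer {};

// Explicitly named network type for a small test network; no
// return-type deduction is needed anywhere.
using SmallNetworkType = std::tuple<LinearLayer, SigmoidLayer, LinearLayer>;

SmallNetworkType BuildSmallNetwork()
{
  return SmallNetworkType();
}
```

Because the test network is small, writing the type out by hand stays manageable.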
In 69f97e5, f6c27ed, and 43f0f13 we removed the optimizer from each layer in favor of a single optimizer in the base class (FFN, CNN, RNN). Not only does this change make the layers simpler, it also lets us use the existing mlpack optimizers for the neural network code. Since this pull request works with the "one optimizer for each layer" design, it doesn't fully apply anymore. However, I would like to merge the copy Ctor test, so maybe you can open another pull request? If not, I'll extract the changes from this pull request myself. Please comment if something doesn't make sense and I'll reopen it.
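The refactor described above can be sketched as follows. All names here are illustrative stand-ins for the real FFN/CNN/RNN and optimizer classes: the point is only that the layer no longer owns an optimizer, and the network holds a single one that updates every layer's weights.

```cpp
#include <cstddef>
#include <vector>

// A layer now holds only its parameters; no per-layer optimizer member.
struct Layer
{
  std::vector<double> weights;
};

// A single, simple gradient-descent optimizer (illustrative only).
struct Optimizer
{
  double stepSize;

  void Update(std::vector<double>& w, const std::vector<double>& g) const
  {
    for (std::size_t i = 0; i < w.size(); ++i)
      w[i] -= stepSize * g[i];
  }
};

// The network base class owns one optimizer and applies it to all layers.
struct Network
{
  std::vector<Layer> layers;
  Optimizer optimizer;

  void Update(const std::vector<std::vector<double>>& gradients)
  {
    for (std::size_t i = 0; i < layers.size(); ++i)
      optimizer.Update(layers[i].weights, gradients[i]);
  }
};
```

One consequence of this design is that copying or serializing a layer no longer has to deal with optimizer state, which is part of why the per-layer approach was dropped.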
All the original unit tests ran to completion. Added a few more to test the copy constructor for FFN (feed forward) and CNN (convolutional) networks.