
Swap boost::variant with vtable. #2777

Merged
merged 344 commits into from Apr 29, 2022

Conversation

zoq
Member

@zoq zoq commented Dec 21, 2020

I updated the abstract class and also updated the Linear layer as an example. There are various layers we have to update, so if anybody would like to work on some of the layers I listed below, comment on the PR. Unfortunately I can't enable commit permission for a specific branch. So to get the changes in, you can just fork the repository as usual, create a new feature branch, and make the changes; but instead of opening another PR, just post the link to the branch here and I'll cherry-pick the commits.

Steps:

  1. Inherit from the Layer class; each layer should implement the functions that are relevant for its layer-specific computations and inherit the rest from the base class.
  2. Rename InputDataType to InputType and OutputDataType to OutputType; renaming the types for the input and output data makes the interface more consistent with the rest of the codebase.
  3. Use InputType and OutputType instead of arma::mat or arma::Mat<eT>; to make the layer work with the abstract class, we have to follow its interface accordingly.
  4. Provide a default layer type to hide some of the template functionality that could be confusing for users who aren't familiar with templates. So instead of using Linear<> all the time, a user can just use Linear. This is a result of "The future of mlpack", round two! #2524 (comment).
  5. Update the tests to use the updated interface.
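The steps above can be sketched roughly as follows. This is an illustrative sketch only, not mlpack's actual interface: the class and method names (Scale, RunForward) are hypothetical, and std::vector<double> stands in for the real InputType/OutputType matrix types.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Base class with virtual methods dispatched through the vtable.
// Layers override only what they need and inherit the rest.
class Layer
{
 public:
  virtual ~Layer() { }

  // Default pass-through implementations.
  virtual void Forward(const std::vector<double>& input,
                       std::vector<double>& output) { output = input; }
  virtual void Backward(const std::vector<double>& gy,
                        std::vector<double>& g) { g = gy; }
};

// A toy layer that overrides Forward() with a layer-specific computation.
class Scale : public Layer
{
 public:
  explicit Scale(const double factor) : factor(factor) { }

  void Forward(const std::vector<double>& input,
               std::vector<double>& output) override
  {
    output.resize(input.size());
    for (std::size_t i = 0; i < input.size(); ++i)
      output[i] = factor * input[i];
  }

 private:
  double factor;
};

// The network class would hold Layer pointers and dispatch virtually,
// with no boost::apply_visitor machinery involved.
std::vector<double> RunForward(Layer& layer, const std::vector<double>& input)
{
  std::vector<double> output;
  layer.Forward(input, output);
  return output;
}
```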

Example: check out the Linear layer.

Here is a list of layers we have to update:

I left out the base layer since I'm not sure yet if it makes sense to implement it as an independent class.


Building upon the work from @Aakash-kaushik, we can get a first impression of the advantage of using boost::variant compared with a virtual inheritance approach (#2647).

Note that we stripped basically everything out except the FFN class and the Linear, FlexibleReLU, and LogSoftMax layers, which gives us a first impression of the timings we can expect from a virtual inheritance approach.

I tested two scenarios, but used the same network for each:

FFN<> model;
model.Add<Linear<> >(trainData.n_rows, 128);
model.Add<FlexibleReLU<> >();
model.Add<Linear<> >(128, 256);
model.Add<Linear<> >(256, 256);
model.Add<Linear<> >(256, 256);
model.Add<Linear<> >(256, 256);
model.Add<Linear<> >(256, 512);
model.Add<Linear<> >(512, 2048);
model.Add<Linear<> >(2048, 512);
model.Add<Linear<> >(512, 8);
model.Add<Linear<> >(8, 3);
model.Add<LogSoftMax<> >();
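The per-trial numbers below could be collected with a simple wall-clock loop along these lines. This is a sketch under assumptions: TimeTrials and the workload callback are illustrative names, not part of mlpack or the benchmark harness actually used.

```cpp
#include <chrono>
#include <cstddef>
#include <functional>
#include <numeric>
#include <vector>

// Run the given workload `trials` times, record wall-clock seconds per
// trial, and return the average (the "elapsed time averaged" line).
double TimeTrials(const std::function<void()>& trainOnce,
                  const std::size_t trials,
                  std::vector<double>& elapsed)
{
  elapsed.clear();
  for (std::size_t t = 0; t < trials; ++t)
  {
    const auto start = std::chrono::steady_clock::now();
    trainOnce(); // One full run of the benchmark workload.
    const auto stop = std::chrono::steady_clock::now();
    elapsed.push_back(std::chrono::duration<double>(stop - start).count());
  }

  return std::accumulate(elapsed.begin(), elapsed.end(), 0.0) /
      (double) elapsed.size();
}
```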

Scenario - 1

batch-size: 1
iterations: 10000
trials: 10

vtable - DEBUG=ON

mlpack version: mlpack git-aa6d2b1aa
armadillo version: 9.200.7 (Carpe Noctem)
Filters: FFVanillaNetworkTest
elapsed time: 494.485s
elapsed time: 503.777s
elapsed time: 496.802s
elapsed time: 499.928s
elapsed time: 502.504s
elapsed time: 495.735s
elapsed time: 495.745s
elapsed time: 505.284s
elapsed time: 495.32s
elapsed time: 495.209s
--------------------------------------
elapsed time averaged(10): 498.479s

boost::variant - DEBUG=ON

mlpack version: mlpack git-4d01fe5e9
armadillo version: 9.200.7 (Carpe Noctem)
Filters: FFVanillaNetworkTest
elapsed time: 496.419s
elapsed time: 495.27s
elapsed time: 494.769s
elapsed time: 494.922s
elapsed time: 497.729s
elapsed time: 497.464s
elapsed time: 498.024s
elapsed time: 501.722s
elapsed time: 500.59s
elapsed time: 497.925s
--------------------------------------                                                                                                                                                                                                                                                                                       
elapsed time averaged (10): 497.483s   

vtable - DEBUG=OFF

mlpack version: mlpack git-aa6d2b1aa
armadillo version: 9.200.7 (Carpe Noctem)
Filters: FFVanillaNetworkTest
elapsed time: 199.713s
elapsed time: 205.177s
elapsed time: 200.135s
elapsed time: 200.179s
elapsed time: 205.792s
elapsed time: 198.293s
elapsed time: 198.535s
elapsed time: 206.635s
elapsed time: 198.263s
elapsed time: 198.521s
--------------------------------------
elapsed time averaged(10): 201.124s

boost::variant - DEBUG=OFF

mlpack version: mlpack git-4d01fe5e9
armadillo version: 9.200.7 (Carpe Noctem)
Filters: FFVanillaNetworkTest
elapsed time: 198.645s
elapsed time: 194.854s
elapsed time: 194.748s
elapsed time: 194.983s
elapsed time: 197.42s
elapsed time: 196.864s
elapsed time: 197.454s
elapsed time: 204.318s
elapsed time: 201.076s
elapsed time: 200.549s
--------------------------------------
elapsed time averaged (10): 198.091s

Scenario - 2

batch-size: 32
iterations: 10000
trials: 10

vtable - DEBUG=ON

mlpack version: mlpack git-aa6d2b1aa
armadillo version: 9.200.7 (Carpe Noctem)
Filters: FFVanillaNetworkTest
elapsed time: 70.4116s
elapsed time: 70.5631s
elapsed time: 70.682s
elapsed time: 70.5635s
elapsed time: 71.2245s
elapsed time: 71.1649s
elapsed time: 71.4714s
elapsed time: 71.2688s
elapsed time: 71.3348s
elapsed time: 71.3406s
--------------------------------------
elapsed time averaged(10): 71.0025s

boost::variant - DEBUG=ON

mlpack version: mlpack git-4d01fe5e9
armadillo version: 9.200.7 (Carpe Noctem)
Filters: FFVanillaNetworkTest
elapsed time: 70.3247s
elapsed time: 70.5059s
elapsed time: 70.5368s
elapsed time: 70.5208s
elapsed time: 70.4539s
elapsed time: 70.788s
elapsed time: 70.7692s
elapsed time: 70.9473s
elapsed time: 70.9146s
elapsed time: 70.7278s
--------------------------------------
elapsed time averaged (10): 70.6489s

vtable - DEBUG=OFF

mlpack version: mlpack git-aa6d2b1aa
armadillo version: 9.200.7 (Carpe Noctem)
Filters: FFVanillaNetworkTest
elapsed time: 59.7968s
elapsed time: 59.4626s
elapsed time: 59.9147s
elapsed time: 59.9682s
elapsed time: 60.5511s
elapsed time: 60.2109s
elapsed time: 60.7782s
elapsed time: 60.4981s
elapsed time: 60.719s
elapsed time: 60.7632s
--------------------------------------
elapsed time averaged(10): 60.2663s

boost::variant - DEBUG=OFF

mlpack version: mlpack git-4d01fe5e9
armadillo version: 9.200.7 (Carpe Noctem)
Filters: FFVanillaNetworkTest
elapsed time: 60.8466s
elapsed time: 61.0629s
elapsed time: 61.1269s
elapsed time: 60.7426s
elapsed time: 60.8178s
elapsed time: 60.7287s
elapsed time: 60.864s
elapsed time: 60.8982s
elapsed time: 60.9232s
elapsed time: 60.8519s
--------------------------------------
elapsed time averaged (10): 60.8863s

Looking at the timings, boost::variant doesn't provide the speedup I thought it would; on top of that, the little speedup we would gain with boost::variant is marginal in comparison to the actual computation.

@zoq zoq changed the title Swap boost::variant with vtable. [WIP] Swap boost::variant with vtable. Dec 21, 2020
@shrit
Member

shrit commented Dec 22, 2020

Totally agreed; in some cases there is no difference in speed at all, and the mean values are very similar.
We can say that boost::variant and vtables perform about the same; in addition, we are losing a lot of compilation time with boost, which would never happen with vtables.
As a result, boost adds more disadvantages and reduces overall speed if we count the compilation time along with the running time.
Thank you @zoq for the comparison 👍

@zoq
Member Author

zoq commented Dec 22, 2020

Totally agreed, there is no difference in speed at all in some cases, the mean values are very similar.
We can say that boost::variant and vtables are the same, in addition, we are losing a lot of compilation time with boost that will never happen in vtables.
As a result, boost is adding more disadvantages and reducing the speed if we count the compilation time with the running time.
Thank you @zoq for the comparison 👍

Agreed, I don't think it makes sense to stick with boost::variant at this point. The numbers will change slightly once the vtable gets bigger, but I don't think the effect will be huge.

From my side, I can clean up the interface first so it's easier to collaborate on the PR and incorporate the other layers as well, unless someone has a strong opinion against the current approach.

@Aakash-kaushik
Contributor

Hey, this is more of a question, but couldn't the Layer class be made abstract by declaring the functions that don't actually return anything as pure virtual? That way no object of the base class could be instantiated, and the functions which do return something would still act as the base case.

Specifics: talking about layer.hpp and functions like Forward, Backward, Gradient and Reset.

@rcurtin
Member

rcurtin commented Dec 22, 2020

Fascinating results! Thanks @zoq and @Aakash-kaushik for making this simulation happen. I agree with the other conclusions here: boost::variant might provide a speedup (I am not sure if it is statistically significant), but even if it does, that speedup is outweighed by the massive slowdown that we think it causes at compile time. It will be a big effort, but I would support (and help with) refactoring. I know that we use variant in a few other places, specifically the NSModel, RSModel, RAModel, FastMKSModel, and KDEModel classes. So those will need to be refactored too, and I can handle that part. (There's no need to do any timing simulations there because any function would only be called once or twice per run of a binding. I only used variant there to avoid inheritance, which is a decision I now regret because I unintentionally made more work for us now. :))

@Aakash-kaushik
Contributor

Also, this looks like it might be a lot of work, and I would like to help in the refactoring process, so let me know if I can help somewhere.

@zoq
Member Author

zoq commented Dec 22, 2020

Hey this is more of a doubt but can't the Layer class be made abstract by declaring those functions purely virtual which don't actually return anything, that way no object of base class could be initialized and the function which do return something will still act as the base case.

Specifics: talking about layer.hpp and functions like Forward, Backward, Gradient and Reset.

If we make the functions pure virtual, each layer has to implement them. Right now each layer implements the Forward and Backward methods, but the loss layers use a different interface (const arma::mat&, const arma::mat& instead of const arma::mat&, arma::mat&), so we would have to change the interface of the loss function layers. In addition, not every layer implements the Gradient or Reset method (almost every activation layer doesn't), so the only function that I think could be pure virtual is the Backward function. So I'm not sure it makes sense to go with a pure virtual method class; what do you think?
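The trade-off being discussed can be sketched like this (illustrative only, with scalar stand-ins for the matrix types and hypothetical class names): a single pure virtual Backward() is enough to make the base class abstract, while Gradient() and Reset() keep default no-op bodies so layers that don't need them aren't forced to override.

```cpp
#include <type_traits>

// Base class: Backward() is pure virtual, so `Layer l;` no longer
// compiles; Gradient() and Reset() have default no-op bodies.
class Layer
{
 public:
  virtual ~Layer() { }

  // Every layer must implement this.
  virtual void Backward(const double gy, double& g) = 0;

  // Only layers with parameters/state override these.
  virtual void Gradient(const double /* error */) { }
  virtual void Reset() { }
};

// An activation-like layer: overrides Backward() only.
class Identity : public Layer
{
 public:
  void Backward(const double gy, double& g) override { g = gy; }
};

// The base class is abstract (not constructible); the derived one is not.
bool BaseIsAbstract()
{
  return !std::is_constructible<Layer>::value &&
      std::is_constructible<Identity>::value;
}
```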

@zoq
Member Author

zoq commented Dec 22, 2020

Also this looks like this might be a lot of work and I would like to help in the refactoring process so let me know if i can help somewhere.

Absolutely, we do have a bunch of layers, so any help would be great. My current plan is to clean up the interface first, and then I can add a list of layers that we have to modify. I will check if I can provide commit rights to my branch so you can push changes directly.

@Aakash-kaushik
Contributor

Aakash-kaushik commented Dec 22, 2020

Hey this is more of a doubt but can't the Layer class be made abstract by declaring those functions purely virtual which don't actually return anything, that way no object of base class could be initialized and the function which do return something will still act as the base case.
Specifics: talking about layer.hpp and functions like Forward, Backward, Gradient and Reset.

If we make the functions purely virtual, each layer has to implement the function; right now each layer implements the Forward and Backward method, but the loss layer uses a different interface -> const arma::mat&, const arma::mat& instead of const arma::mat&, arma::mat& so we would have to change the interface of the loss function layers. In addition not every layer implements the Gradient or Reset method (almost every activation layer), so the only function that I think could be purely virtual is the Backward function. So I'm not sure it makes sense to go with a purely virtual method class, what do you think?

So my main purpose in introducing a pure virtual function in the base class is to ensure that a base class object is not created even by mistake, and if someone somehow does that, the compiler would throw a very understandable error. Even if we just make a single function pure virtual, which you suggested to be the Backward function, I think that would make the class abstract and achieve the purpose. I am not sure if there are other ways to do this, so your feedback would be really great on this.

@Aakash-kaushik
Contributor

Also this looks like this might be a lot of work and I would like to help in the refactoring process so let me know if i can help somewhere.

Absolutely, we do have a bunch of layers so any help would be great. My current plan is to clean-up the interface first and then I can add a list of layers that we have to modify. I will check if I can provide commit rights to my branch, so you could directly push changes.

That would be really awesome. Do let me know when you do that.

@zoq
Member Author

zoq commented Dec 23, 2020

I'll clean up and add the remaining methods for the FFN class next.

@zoq
Member Author

zoq commented Dec 23, 2020

So my main purpose to introduce a purely virtual function in the base class is to ensure that even by mistake a base class object is not created and if someone somehow does that the compiler would throw a very much understandable error, and even if we can just make a single function purely virtual which you suggested to be the Backward function i think that would make the class abstract and achieve the purpose. I am not sure if there are other ways to do this so your feedback would be really great on this.

Sounds reasonable to me. My suggestion would be to update all the layers first so we know what functions we need, and then we can check which methods we can make pure virtual; maybe we can add a warning or so if a method from the abstract class is called.

@zoq zoq added this to the mlpack 4.0.0 milestone Dec 24, 2020
@Aakash-kaushik
Contributor

So my main purpose to introduce a purely virtual function in the base class is to ensure that even by mistake a base class object is not created and if someone somehow does that the compiler would throw a very much understandable error, and even if we can just make a single function purely virtual which you suggested to be the Backward function i think that would make the class abstract and achieve the purpose. I am not sure if there are other ways to do this so your feedback would be really great on this.

Sounds reasonable to me, my suggestion would be to update all the layers first so we know what functions we need, and then we can check which method we can make pure-virtual, maybe we can add a warning or so if a method from the abstract is called.

Sure, that seems good to me. Also, for now I will take up adaptive_max_pooling.hpp and adaptive_mean_pooling.hpp.

@zoq
Member Author

zoq commented Dec 24, 2020

So my main purpose to introduce a purely virtual function in the base class is to ensure that even by mistake a base class object is not created and if someone somehow does that the compiler would throw a very much understandable error, and even if we can just make a single function purely virtual which you suggested to be the Backward function i think that would make the class abstract and achieve the purpose. I am not sure if there are other ways to do this so your feedback would be really great on this.

Sounds reasonable to me, my suggestion would be to update all the layers first so we know what functions we need, and then we can check which method we can make pure-virtual, maybe we can add a warning or so if a method from the abstract is called.

Sure that seems good to me, also for now i will take up adaptive_max_pooling.hpp and adaptive_mean_pooling.hpp.

Awesome, updated the issue.


//! Locally-stored output parameter object.
OutputDataType outputParameter;

//! Locally-stored mast object.
Contributor

Written mast instead of mask.

Member Author

Nice catch.

@Aakash-kaushik
Contributor

Aakash-kaushik commented Dec 25, 2020

Hey, I made a couple of other changes and also updated the two layers that I took; can you take a look here?

Next I will be taking:

  1. max_pooling.hpp
  2. mean_pooling.hpp
  3. add.hpp
  4. add_merge.hpp
  5. alpha_dropout.hpp
  6. atrous_convolution.hpp
  7. batch_norm.hpp

@zoq
Member Author

zoq commented Dec 26, 2020

Hey made couple of other changes and also updated the two layers that i took, Can you take a look here.

Next I will be taking:

1. max_pooling.hpp
2. mean_pooling.hpp
3. add.hpp
4. add_merge.hpp
5. alpha_dropout.hpp
6. atrous_convolution.hpp
7. batch_norm.hpp

Updated the issue, and cherry-picked the commits from your repo.

@Aakash-kaushik
Contributor

Hey made couple of other changes and also updated the two layers that i took, Can you take a look here.
Next I will be taking:

1. max_pooling.hpp
2. mean_pooling.hpp
3. add.hpp
4. add_merge.hpp
5. alpha_dropout.hpp
6. atrous_convolution.hpp
7. batch_norm.hpp

Updated the issue, and cherry-picked the commits from your repo.

Hi, thank you so much. Also, I made some changes specific to me, such as turning bindings off in CMake and turning profiling and debugging on by default; I think we can take a look at those last.

@zoq
Member Author

zoq commented Apr 28, 2022

I think I can; I just wanted to wait for the merge conflict to get resolved first.

@zoq
Member Author

zoq commented Apr 28, 2022

Hm, for some reason the memory job is not working, but I can see we use -DDOWNLOAD_DEPENDENCIES=ON; I'll look into it.

@zoq
Member Author

zoq commented Apr 28, 2022

I think I can, just wanted to wait for the merge conflict to get resolved first.

Never mind, you are right, I can't. But yeah, phenomenal work on this one; super excited to have it merged.

@rcurtin
Member

rcurtin commented Apr 28, 2022

Thanks! Now the real fun begins 😄 (but hopefully with more reasonable diffs!)

Member

@rcurtin rcurtin left a comment

Maybe mlpack-bot needs an approval with a comment?

@mlpack-bot mlpack-bot bot left a comment

Second approval provided automatically after 24 hours. 👍

@rcurtin rcurtin merged commit f7cd038 into mlpack:master Apr 29, 2022
@rcurtin
Member

rcurtin commented Apr 29, 2022

Ok, in it goes! I'll start opening follow-up issues to be resolved tomorrow.

shubham1206agra added a commit to shubham1206agra/mlpack that referenced this pull request May 10, 2022
commit 54c6ebe03a07d7c32db46a6a06a03e8b821da4f2
Merge: 775a3b55f b406fc150
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun May 1 13:24:24 2022 -0400

    Merge pull request #3200 from shubham1206agra/go-cli-fix

    Go Build Fix

commit 775a3b55f73eb595c03baf78e6901c9e21a59aaa
Merge: 8e72ed698 54e291443
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 30 10:41:45 2022 -0400

    Merge pull request #3198 from shubham1206agra/py-cli-fix

    Python Build Fix

commit 8e72ed6986b6898b8b452682aaab80694e22e61b
Merge: f7cd03866 1c1182301
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 30 10:40:59 2022 -0400

    Merge pull request #3199 from shubham1206agra/r-cli-fix

    R Build Fix

commit b406fc15069054fee73263c450da7cedf3ed0d2c
Author: shubham1206agra <tt1191044@iitd.ac.in>
Date:   Fri Apr 29 10:42:32 2022 +0530

    changes according to suggestion

commit 54e2914430ed4862ecdbde557790e828d9e6a7af
Author: Shubham Agrawal <58412969+shubham1206agra@users.noreply.github.com>
Date:   Fri Apr 29 10:20:14 2022 +0530

    Update src/mlpack/bindings/python/copy_artifacts.py

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit f7cd03866077b7813c5a2ba352cc86aab28e7806
Merge: 065fcee29 3eb8ae67e
Author: Ryan Curtin <ryan@ratml.org>
Date:   Thu Apr 28 19:00:58 2022 -0700

    Merge pull request #2777 from zoq/ann-vtable

    Swap boost::variant with vtable.

commit ad4569213a5d96625e0ced566451a8d64ea0c2bd
Author: shubham1206agra <tt1191044@iitd.ac.in>
Date:   Thu Apr 28 19:32:14 2022 +0530

    temp sol to version issue

commit 1bfa385663f1f6f97cb8a17591b7f97a8c0b8829
Author: shubham1206agra <tt1191044@iitd.ac.in>
Date:   Thu Apr 28 18:48:03 2022 +0530

    initial fix by disabling go modules

commit 1c1182301e0e44421bd1588ae2e3d038bd34e25e
Author: shubham1206agra <tt1191044@iitd.ac.in>
Date:   Thu Apr 28 12:46:18 2022 +0530

    force install pkgbuild

commit 3cfbb4e65d9822df8a25c5625d189ac734665c7b
Author: shubham1206agra <tt1191044@iitd.ac.in>
Date:   Thu Apr 28 12:36:51 2022 +0530

    cleanup

commit feab906927dca5a88a19ec00dea281263e4be35d
Author: shubham1206agra <tt1191044@iitd.ac.in>
Date:   Thu Apr 28 11:28:32 2022 +0530

    missing '/' added

commit 577cc2d6929d6b22d671f314a49756815d99b4c1
Author: shubham1206agra <tt1191044@iitd.ac.in>
Date:   Thu Apr 28 10:33:02 2022 +0530

    new directory structure using glob

commit 3eb8ae67eceeb1c1ad226687641b4748a573e3df
Merge: 6f98ab7bc 065fcee29
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Apr 27 20:51:46 2022 -0400

    Merge remote-tracking branch 'origin/master' into ann-vtable

commit 6f98ab7bc9749b04565c512f7eb59c406beb7826
Author: Eshaan Agarwal <eshaan060202@gmail.com>
Date:   Wed Apr 13 20:13:06 2022 +0530

    Fix style issues

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit ac3097e875f42c95650634530026acde77dace63
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Wed Apr 13 18:22:03 2022 +0530

    fix error in size_t cast

    Signed-off-by: eshaanagarwal <eshaan060202@gmail.com>

commit d03aafacf618977ed9de4ef2a2ad70e0394e4daf
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Tue Apr 5 13:02:00 2022 +0530

    add parameter documentation in size checks

commit e55609d5beeb626ec4d7cfe0f95e16059b050779
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Tue Apr 5 00:51:27 2022 +0530

    Add transpose parameter in size check

commit a485da2b715bdd67e0ec7848f845be116db16a05
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Thu Mar 31 04:22:53 2022 +0530

    fixed issues in styling

commit dff01492ed44f117ab1d7f0be518547da280dd44
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Mon Mar 14 22:25:27 2022 +0530

    fixed styling issues

commit ca50361083791425908aea6606902ca7230413eb
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Fri Mar 11 22:50:03 2022 +0530

    fix build issue by removing row vector assert condition

commit 5b7d36db8586bdbf7e70b623906c86759867750a
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Fri Mar 11 20:15:07 2022 +0530

    fix failed build

commit 614f924a4ef0c67894dff5b1f41e37fda55c03e9
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Fri Mar 11 19:50:54 2022 +0530

    fixed redundancy in size-checks

commit f67c566a5ffa21afb60d460fe813c363da67924a
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Fri Mar 11 19:30:22 2022 +0530

    fix matrix completion size-checks

commit e1e2b5308b4dbeaddb8838e20658b3bf63a66165
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Fri Mar 11 12:19:37 2022 +0530

    remove incorrect checks in adaboost

commit e5d701df32f02289b9aea2209b5b1944bee9a2c8
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Thu Mar 10 16:30:49 2022 +0530

    fix size checks

commit 2656b300f9ef1984d2a2b58add8ce4b8004908aa
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Thu Mar 10 12:24:15 2022 +0530

    fix styling issue

    Signed-off-by: eshaanagarwal <eshaan060202@gmail.com>

commit ef4d293fedd338f847483a2246462ec7a8ed62cb
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Wed Mar 9 02:33:08 2022 +0530

    Add: size checks for kmeans and linear regression

commit 640dd0cde815aa24a0ddf2f0ff7cccf6eec5c8f0
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Tue Mar 8 02:27:38 2022 +0530

    Add : Size checks for adaboost and matix completion

    Signed-off-by: eshaanagarwal <eshaan060202@gmail.com>

commit 66bc9cbe008ded17638cec58c186ac0487980007
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 15 21:29:30 2022 -0400

    Huh, I guess it is a new year.

commit 7b0c1e157bd413f1e58c40e7ad2f982d5561bba7
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 15 21:29:15 2022 -0400

    Update HISTORY.

commit e4be17defe01e980f7ffe1859020fee021de7641
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 15 21:26:42 2022 -0400

    Add test for KFoldCV and Perceptron.

commit a70437ffc30627cf69a981ae10618cd61fccc287
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 15 21:26:28 2022 -0400

    Add constructor to Perceptron for weighted data for KFoldCV.

commit 9b86271002ba1c5ab2cc99694e83d54c369a2b07
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 15 21:26:08 2022 -0400

    Make Classify() set the output predictions' size.

commit a35fc3b994de138b99b82453e98be84afedfd5a8
Author: Yashwants19 <Yashwants19@users.noreply.github.com>
Date:   Sun Apr 17 10:02:24 2022 +0000

    Upgrade Catch to 2.13.9

commit a1bb763729f00475a3904db4d798c2a81767c19a
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Apr 16 21:40:15 2022 +0200

    Update src/mlpack/core/data/save_image.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 57cec6f081c16be729d9a35e302847bd5d18e743
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Apr 16 16:20:14 2022 +0100

    Apply @rcurtin modification to check STB version

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 8e34862cf62c5c610d98dd1ac9768307e11830b2
Author: Omar Shrit <omar@shrit.me>
Date:   Sun Mar 13 21:28:32 2022 +0000

    Let us see inital try for STB test

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit afcc862ced00728ec0a27251f07a51e1cd08adb6
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Feb 3 21:46:40 2022 +0000

    Adding the missing starts, shitty regexp

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 0c076cca19e7e03e5aacd3cf199cec349f66407d
Author: Omar Shrit <omar@shrit.me>
Date:   Tue Jan 25 12:09:08 2022 +0000

    Finish this PR

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit b26f2d2a15d652427dfd6ef1542343ecbfe6f620
Author: Omar Shrit <omar@shrit.me>
Date:   Mon Jan 24 22:15:55 2022 +0000

    Refactor save_image into save_image_impl finally done!!

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 8fffbd55fa5b92425a3e202c3bf7faeadcb69d0f
Author: Omar Shrit <omar@shrit.me>
Date:   Mon Jan 24 21:20:12 2022 +0000

    Adjust namespace of mlpack::Log

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 37367c7ec5514ffd423317ddd5ddeca250761680
Author: Omar Shrit <omar@shrit.me>
Date:   Mon Jan 24 19:51:55 2022 +0100

    Update src/mlpack/core/data/image_info_impl.hpp

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit 3cc2d250b588aed384881ae01d0ef36aaf57d07a
Author: Omar Shrit <omar@shrit.me>
Date:   Mon Jan 24 19:51:49 2022 +0100

    Update src/mlpack/core/data/image_info_impl.hpp

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit 55c6cfd7df62646e29bd73102af533b127738586
Author: Omar Shrit <omar@shrit.me>
Date:   Sun Jan 23 19:38:29 2022 +0000

    Apply @rcurtin patch to fix the STB issue.

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 8cdb53a313af3b8aaddc891ebf514445d7ec58ee
Author: Omar Shrit <omar@shrit.me>
Date:   Sun Jan 16 18:20:04 2022 +0000

    Commenting all #defines that are causing the problems

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit e2e000b06f8c08937aac345700f0e0927b4e59a1
Author: Omar Shrit <omar@shrit.me>
Date:   Sun Jan 16 16:21:27 2022 +0000

    Move constructor to implementation

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 981f57c4be46a855eaef81a38ac513ead28931ce
Author: Omar Shrit <omar@shrit.me>
Date:   Sun Jan 16 16:03:30 2022 +0000

    Compiling locally, adding all modifications

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 4d29f5f9c1a85e2808bf0526e3386e665a184c64
Author: Omar Shrit <omar@shrit.me>
Date:   Sun Jan 16 12:59:31 2022 +0000

    Provide more details for namespace in src/mlpack/methods/det/dtree_impl.hpp

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit e8a76f8f354eb03d2a8b8658dbc5b756baf0f941
Author: Omar Shrit <omar@shrit.me>
Date:   Sun Jan 16 12:59:03 2022 +0000

    Remove mlpack namspace from src/mlpack/methods/hmm/hmm_util_impl.hpp

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit 9dd15d08655e0c65755912725652271c82f4b182
Author: Omar Shrit <omar@shrit.me>
Date:   Sun Jan 16 12:58:50 2022 +0000

    remove mlpack namespace from src/mlpack/methods/reinforcement_learning/q_networks/categorical_dqn.hpp

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit aa057ccb8b519c110a1c0a43a9576da65df9b266
Author: Omar Shrit <omar@shrit.me>
Date:   Sun Jan 16 12:58:38 2022 +0000

    Remove mlpack namespace from src/mlpack/methods/hmm/hmm_util_impl.hpp

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit 61cc69b1a5db9e39d9f87af984166e0ec7961ead
Author: Omar Shrit <omar@shrit.me>
Date:   Sun Jan 16 12:58:22 2022 +0000

    Remove mlpack namespace from src/mlpack/methods/reinforcement_learning/q_networks/dueling_dqn.hpp

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit 31f9545685df3492b1bdfafee70a226c29e082af
Author: Omar Shrit <omar@shrit.me>
Date:   Sun Jan 16 12:58:06 2022 +0000

    Remove mlpack namespace from src/mlpack/methods/reinforcement_learning/q_networks/simple_dqn.hpp

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit 665711b9cee4f3ea4186ae4c71230d18c8f2c872
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 16 15:33:51 2021 +0100

    Remove additional line in src/mlpack/core/kernels/pspectrum_string_kernel.hpp

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit 895c885b44fbcca2f07353ec0284d506688759fb
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 16 15:33:08 2021 +0100

    Fix indentation in src/mlpack/core/data/detect_file_type.hpp

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit cf6f60aee899e924718265756a2f46cf4c5b6412
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 16 15:32:55 2021 +0100

    Add forgetten dot in src/mlpack/core/math/random_basis.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 21e045928a384ec9a02455eac66b5c6f49aec1ff
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 16 15:32:38 2021 +0100

    Add spaces in src/mlpack/core/math/lin_alg_impl.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 1f79e4fc8346d88db22895653c36f5c892bc5fe6
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 16 15:32:16 2021 +0100

    Remove additional line in src/mlpack/core/math/lin_alg_impl.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 6dc2d63af66cd3c13d75456c4c9f369ccadc2c30
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 16 15:32:00 2021 +0100

    Fix indentation in src/mlpack/core/math/lin_alg_impl.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit ed07d9f05afbfa1bc9b5a145fca092cd9aba0933
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 16 15:31:41 2021 +0100

    Fix indentation in src/mlpack/core/math/lin_alg_impl.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 047dc47b3b44a819df0d4b430cf81ed3d20ecfa1
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 16 15:31:24 2021 +0100

    Add parentheses in src/mlpack/core/math/lin_alg_impl.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 2dae84947a81412a2f51efc333a4e29f1d063fa0
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 16 15:31:00 2021 +0100

    Fix style in src/mlpack/core/kernels/epanechnikov_kernel_impl.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 8666165293d5e5e5a54872954a52147ceb43da50
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 16 15:30:43 2021 +0100

    Update style in src/mlpack/core/kernels/epanechnikov_kernel_impl.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 6254b4b49a93c6e14b63d6943bfa198fb1a1135a
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 2 23:54:27 2021 +0000

    Add forgotten layer name

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 407b563330cd1f81020f85198f14d7a3a4e5cccd
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 2 22:54:06 2021 +0000

    Clean all using namespaces from the headers.

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 4a713818abe2a891f4118d67f70f3c7cb2462acd
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 2 19:40:02 2021 +0000

    Also fix the random_basis_impl

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 99a485b6be4cbe9662398ab23f507923cd041fe3
Author: Omar Shrit <omar@shrit.me>
Date:   Thu Dec 2 19:39:05 2021 +0000

    Partly clean the namespace. Let us see if this resolves the macOS issues

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 986bdb6bdd5b07aa5b9e209b498a89ecd386dc1b
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 20 22:03:31 2021 +0000

    Adjust namespace in preprocess

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit c3691bdf657281f2574e875feb751aaad1712ec5
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 20 20:33:55 2021 +0000

    Remove the mlpack:: namespace

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit adbedb857bed36229d6637aea0499a844afa455a
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 20 19:56:14 2021 +0000

    Adding missing math/random headers

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit d246f942863aed5899234857e49d4f9d444edbab
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 20 19:29:07 2021 +0000

    Fix the gmm error

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 10ed5c1c965cd09c83ebcf25c5ecf57fd6bcecce
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 20 17:06:11 2021 +0000

    Let us see if this resolves the binding issues

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 8b0c8038dd4d45dd2efd3d28f515e8bcc674987c
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 13 19:52:19 2021 +0000

    Adding a missing std string

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit ff2f37c4e7a37121bf74d94b78a169c11c905a63
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 13 19:40:23 2021 +0000

    Finish the pspectrum_string_kernel

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit c6bf80178a260a63d0a0ddcb20a6d312f434e2af
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 13 19:39:32 2021 +0000

    Deleting epanechnikov_kernel and adding pspectrum_string_kernel

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 09f508c6de1641aa047eec5e584d7c09a7237a5d
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 13 19:33:05 2021 +0000

    Finishing the epanechnikov_kernel

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 61d01a39491d487196373a67358d0b6c7a519b40
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 13 19:23:57 2021 +0000

    remove epanechnikov_kernel

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit bd84bb2d0dd13728a20be5315e9a28ac46af607c
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 13 19:15:58 2021 +0000

    inline random_basis

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 90b80ea766b80849599ed7a18c0c87e2aaf1e3d2
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 13 19:11:32 2021 +0000

    Adding missing headers

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit e0c5ac34fbc43ba246eeb77a92dd08a528381906
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 13 18:56:00 2021 +0000

    move the impl from .cpp to .impl

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 85a53b486f45ecc9caf5f5f2e30cf8df402d6ebd
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 13 18:43:17 2021 +0000

    Fix the compilation warning related to include implementations

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit c1d3833ff155c289fddaf45195ee92ed971b4ad7
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 13 18:24:15 2021 +0000

    Fix comments and compilation bugs

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit b037f4f69fff9c40454859a245a743680ab0ff65
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 13 18:12:02 2021 +0000

    Finish inlining the data dir

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit 0941f650383a1e710152459d21f32a6abcf2d233
Author: Omar Shrit <omar@shrit.me>
Date:   Sat Nov 13 18:04:01 2021 +0000

    Make save image header only

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit ba84188163d891764a84551b5c2827708875d484
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Wed Mar 16 23:23:01 2022 -0400

    Prettify C++ code used to check for atomic linkage.

commit f9cc2f6bfa65454f49b82fa16c21c94eb7d085cb
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Tue Mar 15 22:14:12 2022 -0400

    If we use MSVC no need to check for atomic.

commit 119da780ed487cd10f6ab0872d2b88bae8edc169
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Sun Feb 20 21:15:50 2022 -0500

    Check if libatomic is bundled.

commit 96f4d9aea6c4b43d20c4302df650a4ac70767ea8
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Wed Feb 16 21:42:44 2022 -0500

    Check if atomics need -latomic linking.
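
    The commit above adds a build-time probe for whether atomics require
    linking against libatomic. A minimal C++ probe program of the kind such
    a CMake check compiles might look like the sketch below; the exact
    source used by mlpack's check is an assumption here, not quoted from
    the repository. On some targets (e.g. 32-bit ARM), 64-bit atomics only
    link when `-latomic` is added.

    ```cpp
    #include <atomic>
    #include <cstdint>
    #include <iostream>

    int main()
    {
      // A 64-bit atomic: on some platforms the fetch_add below is not
      // inlined by the compiler and pulls in libatomic at link time.
      std::atomic<std::uint64_t> counter{0};
      counter.fetch_add(41);
      ++counter;
      std::cout << counter.load() << std::endl;
      return 0;
    }
    ```

    If this program links without extra flags, no `-latomic` is needed;
    otherwise the build system appends it and retries.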

commit 47f94fbaee9e67ed5871774ce0df953345b115c0
Author: zoq <zoq@users.noreply.github.com>
Date:   Thu Apr 14 10:05:15 2022 +0000

    Upgrade Boost Version in CMake script.

commit c31961bc46fb4fb0ac91221953620138168c2d8e
Author: LiuZhuojin <zhuojinliu.cs@gmail.com>
Date:   Thu Apr 14 09:53:55 2022 +0800

    Update contributor list

commit 2d77be64d9504d8d54058b43455935d1e7ce6fcc
Author: LiuZhuojin <zhuojinliu.cs@gmail.com>
Date:   Wed Apr 13 09:33:03 2022 +0800

    Replace boost::heap::priority_queue with std::vector

commit aa4a8b3922420a8cef248e4392b0a5d09af8a3e6
Author: LiuZhuojin <zhuojinliu.cs@gmail.com>
Date:   Wed Apr 13 00:40:08 2022 +0800

    Add vector and queue to standard includes

commit d9145891a7bee9147004cc92efe3756422662f32
Author: LiuZhuojin <zhuojinliu.cs@gmail.com>
Date:   Wed Apr 13 00:38:40 2022 +0800

    Replace boost::heap::priority_queue with std::vector

commit 58518ab811c144bb4e0f4edb5601c352df9bfba9
Author: LiuZhuojin <zhuojinliu.cs@gmail.com>
Date:   Wed Apr 13 00:27:28 2022 +0800

    Replace boost::heap::priority_queue with std::vector

commit bc30907f5a57a3652bd5778bb1604cf27482af63
Author: LiuZhuojin <zhuojinliu.cs@gmail.com>
Date:   Sat Apr 9 15:04:31 2022 +0800

    Replace boost::heap::priority_queue with std::vector

commit e117627c7f70f024576389fa372c1b1d6866c7aa
Author: LiuZhuojin <zhuojinliu.cs@gmail.com>
Date:   Sat Apr 9 14:58:28 2022 +0800

    Replace boost::heap::priority_queue with std::vector

commit 4d962ebab6242b117eac634d81bb446d538952f7
Author: LiuZhuojin <zhuojinliu.cs@gmail.com>
Date:   Fri Apr 8 13:53:10 2022 +0800

    Replace boost::heap::priority_queue with std::vector
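
    The series of commits above drops `boost::heap::priority_queue` in
    favor of a plain `std::vector`. Independently of mlpack's actual call
    sites (which are not shown here), one standard-library way to keep
    heap semantics on a `std::vector` is `std::push_heap`/`std::pop_heap`:

    ```cpp
    #include <algorithm>
    #include <iostream>
    #include <vector>

    int main()
    {
      std::vector<int> heap;
      for (int v : {3, 1, 4, 1, 5})
      {
        heap.push_back(v);
        std::push_heap(heap.begin(), heap.end());  // max-heap by default
      }

      // Pop elements in descending order.
      while (!heap.empty())
      {
        std::pop_heap(heap.begin(), heap.end());   // move max to the back
        std::cout << heap.back() << ' ';
        heap.pop_back();
      }
      std::cout << std::endl;
      return 0;
    }
    ```

    This removes the Boost dependency while preserving priority-queue
    ordering; a `std::greater<>` comparator would give a min-heap instead.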

commit 65afadc4afbcd8ac054f19de48ca6978d5855058
Author: Yashwants19 <Yashwants19@users.noreply.github.com>
Date:   Fri Apr 1 10:00:54 2022 +0000

    Upgrade CLI11 to 2.2.0

commit 5875d3625adeb460568d23d3ce9d29cb53ad0a3c
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Apr 27 16:10:15 2022 -0400

    Apply suggestions from code review

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit fef34ea8db04060479255608d2e226468d8b5b13
Author: shubham1206agra <tt1191044@iitd.ac.in>
Date:   Wed Apr 27 19:25:31 2022 +0530

    trying something else

commit 64e1f75b65f8710a1f293c473470ab0991107c69
Author: shubham1206agra <tt1191044@iitd.ac.in>
Date:   Wed Apr 27 19:11:02 2022 +0530

    trying config file

commit 065fcee296caa289af68d59b0eafec6658bc38ce
Merge: 56d3f1636 01fa1241c
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Apr 18 15:43:03 2022 -0700

    Merge pull request #3164 from eshaanagarwal/size-checks

    Added Size checks for Matrix Completion, Kmeans and Linear Regression

commit 56d3f16368b62cbe174062d0445f524cebf0c54a
Merge: 2994570a0 152094dbb
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Apr 18 06:01:10 2022 -0700

    Merge pull request #3190 from mlpack/fix-perceptron-cv

    Add weighted data constructor to `Perceptron`

commit 2994570a05997fa1607d5702196fdd48be266af7
Merge: fdc7af7a6 c47cebdd8
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Apr 18 05:55:12 2022 -0700

    Merge pull request #3191 from mlpack/catch-header-updates-2.13.9

    Upgrade Catch to 2.13.9

commit c47cebdd822490b90f8aa88c0cfbb80a4c14c237
Author: Yashwants19 <Yashwants19@users.noreply.github.com>
Date:   Sun Apr 17 10:02:24 2022 +0000

    Upgrade Catch to 2.13.9

commit 1645bb22eba65ba22b3ea5d4ce5769397d41aa9d
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 16 21:05:15 2022 -0400

    Use `typename` instead of `class` for consistency.

commit d8a1c27b7f9efd7844f0dec0519d2f16fb595615
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 16 20:55:50 2022 -0400

    Add serialization to GlorotInit.

commit 152094dbbea68d59ee3491ce0b60e9218896c93f
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 15 21:29:30 2022 -0400

    Huh, I guess it is a new year.

commit c17e9c2f0f0efbb3ff9729907d1ac3ca6809da06
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 15 21:29:15 2022 -0400

    Update HISTORY.

commit 9a6551a1b5aab9d4e82a52b861438901c085616d
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 15 21:26:42 2022 -0400

    Add test for KFoldCV and Perceptron.

commit 3133a4abbded4be9b440cdb315c802bb17c60d07
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 15 21:26:28 2022 -0400

    Add constructor to Perceptron for weighted data for KFoldCV.

commit 80e65d8a5d35780dfba21690c891c1258c89265a
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 15 21:26:08 2022 -0400

    Make Classify() set the output predictions' size.

commit 01fa1241cee30b9e2cd47d61137dc733994a74af
Author: Eshaan Agarwal <eshaan060202@gmail.com>
Date:   Wed Apr 13 20:13:06 2022 +0530

    Fix style issues

    Co-authored-by: Ryan Curtin <ryan@ratml.org>

commit cb35eba204c328cda8864aa742f2d3ea6c6b294d
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Wed Apr 13 18:22:03 2022 +0530

    fix error in size_t cast

    Signed-off-by: eshaanagarwal <eshaan060202@gmail.com>

commit f12d714c50f124045173d9c5544a35aa5e052f02
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Apr 11 13:34:43 2022 -0400

    Comment on the weightsPtr parameter.

commit 44b30019f4c32555311fd1a3d4f3213fa7e1cb8d
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Apr 11 13:32:39 2022 -0400

    Add another padding test for Convolution.

commit 1a9f7ccd5bfd82b1653d30e23cb95c9fd6fc84d0
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Apr 11 11:56:39 2022 -0400

    Return correct type of Weights().

commit bacd18db556064061c8c68faa68e1e73cdfd0f31
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Apr 11 11:53:54 2022 -0400

    Update src/mlpack/methods/ann/layer/concatenate_impl.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 3dfe5a000d8f774f0dc63936269e871875f4ddea
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Apr 11 11:53:40 2022 -0400

    Update src/mlpack/methods/ann/layer/concatenate.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 6b47a9cd6752fa6b7ef4e4e319ebdc31e756b5b3
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Apr 11 11:53:33 2022 -0400

    Update src/mlpack/methods/ann/layer/alpha_dropout.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 77203b84cf26d85a6cff79bc8af57f84a0014871
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Apr 11 11:53:25 2022 -0400

    Update src/mlpack/methods/ann/layer/alpha_dropout.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 845a5a0aed34910f8d7ef147ca8a726218963a16
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Tue Apr 5 13:02:00 2022 +0530

    add parameter documentation in size checks

commit 07be7313a75d0c0676ec8cb7a845a51d2fbdc837
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Apr 4 22:23:36 2022 -0400

    Use network.Forward() to avoid the extra output copy.

commit 0710d9f1113074fdb7d000aae3d1dcb82f325e27
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Apr 4 22:05:35 2022 -0400

    Update src/mlpack/methods/ann/ffn.hpp

    Co-authored-by: Marcus Edel <marcus.edel@fu-berlin.de>

commit 0e9f05adbc9ec75c5604050b6012ba690f4d9ba8
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Tue Apr 5 00:51:27 2022 +0530

    Add transpose parameter in size check

commit 2cd38b3c9e5ecdf9438cb4c88850efc236868b76
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 22:02:45 2022 -0400

    Some additional cleanups.

commit c38625f8d46db5b6adce8895e57e8103685ffc7f
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 22:01:13 2022 -0400

    Change 'rho' to 'bpttSteps' for clarity.

commit 6eecba768ae6abb6f2a37c78f5cca185c7b1a990
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 21:55:50 2022 -0400

    Include numeric header for std::accumulate.

commit 12287174e463226075f2a9eddd0cd3c7995bd3a1
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 21:29:51 2022 -0400

    Fix TODOs in LSTM layer.

commit 71ecddf8f644a675e0666d2f870e8b4ed82c3e61
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 20:49:03 2022 -0400

    Some cleanups of the RNN class.

commit 83a178dffa17c76d43df42ca7e1ba3e12c0a9213
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 16:53:25 2022 -0400

    Some additional type fixes.

commit f5b3885998501c0ba406c8cd851805c3c34bdbfd
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 16:30:02 2022 -0400

    Remove unused typedef.

commit eae99cfacaec22b7c743034664d96b12e56f269d
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 16:27:29 2022 -0400

    Simplify Shuffle() implementation.
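
    This is not mlpack's actual `Shuffle()` (which reorders Armadillo
    matrices/cubes of predictors and responses), but a minimal
    standard-library sketch of the usual pattern behind such an
    implementation: build a random permutation of point indices and
    reorder the data through it.

    ```cpp
    #include <algorithm>
    #include <iostream>
    #include <numeric>
    #include <random>
    #include <vector>

    int main()
    {
      // Labels for five hypothetical data points.
      std::vector<int> labels{10, 11, 12, 13, 14};

      // Random permutation of point indices; predictors and responses
      // are then both reordered through the same permutation.
      std::vector<std::size_t> ordering(labels.size());
      std::iota(ordering.begin(), ordering.end(), 0);
      std::shuffle(ordering.begin(), ordering.end(), std::mt19937(42));

      std::vector<int> shuffled;
      for (std::size_t i : ordering)
        shuffled.push_back(labels[i]);

      // Sanity check: the shuffled labels are a permutation of the
      // originals (sorting recovers the original sequence).
      std::sort(shuffled.begin(), shuffled.end());
      for (int v : shuffled)
        std::cout << v << ' ';
      std::cout << std::endl;
      return 0;
    }
    ```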

commit 250ab5cdd0400525cbfe9f5edb1724821ed6bcdb
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 16:24:42 2022 -0400

    Remove unused Swap() function declaration.

commit 14324209171a8fb919731199e78d93e27cfa3dea
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 16:23:28 2022 -0400

    Use MakeAlias() in the RNN implementation.

commit 03045b023569c2ca106a08bda97130754c905e92
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 16:08:59 2022 -0400

    Fix line wrap.

commit fe8a288613a19542f924e56c9d2a12cce9873a38
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 16:07:01 2022 -0400

    Minor fixes to tests.

commit 44a18de28cb71b07b3f042520eb49cdadcd7abed
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 15:58:36 2022 -0400

    Clarify RNN comment.

commit 10d04c4153b8cf25eec88e8e005b95b35e5ff128
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 15:57:02 2022 -0400

    Add clarifying comment about where MakeAlias() is used.

commit 19c60244a798658515ce8e22a5c4babf6dd761e9
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 15:56:06 2022 -0400

    More CEREAL_NVP() fixes.

commit d45d8787b93f8cac374972104f3e513363369030
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 15:54:58 2022 -0400

    Fix serialization: use CEREAL_NVP().

commit 194211feec5b5d81eaa34a18ae84088b3462a439
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 15:53:45 2022 -0400

    Fixes for RBFType implementation comments.

commit 9c33d6b780cdc7e42db6f92d2d478deff3a118d9
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 15:52:02 2022 -0400

    Fix header guard name.

commit 880c95bee5f7ea7086a344b946b36c4b8d4cdfb0
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 15:50:50 2022 -0400

    Fix typo.

commit 2fdd4b823b8133f006749339817d9697393aded5
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 15:50:27 2022 -0400

    Add explanatory comment.

commit 416dfc7c4aa18a4a605352b0fd54337f23eb1b7d
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 15:48:36 2022 -0400

    Fix inaccurate comment.

commit 19563e3f63f584ab7ba57afab6a041914a02465e
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 14:49:37 2022 -0400

    Update comment.

commit ff3717bc53e9675f604c3c90be1eb0f0dc7de1ab
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 14:41:01 2022 -0400

    Fix typo in comment.

commit ac42ea450b8e5730c05e2c16102729b0b302a08d
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 14:40:39 2022 -0400

    Clarify implementations.

commit 20c7b5a836cd1ea88d18ec03ef0fc1b46b5843a7
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 14:37:01 2022 -0400

    Remove unused method.

commit 4fb0dd49659c5b08261cc56d0fb73114dc5bcc35
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 14:36:36 2022 -0400

    Fix includes.

commit 5253b1427a5b4e72ed0416e28e82a1082cdbade9
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 14:28:42 2022 -0400

    Clarify comment.

commit f0aceda809a00a34df32a107090181ad91c8ee32
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 14:27:38 2022 -0400

    Fix serialization for base layer.

commit 2413a1cdf51c195e42691a344266f9c1bf5daf5f
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 14:26:38 2022 -0400

    Add a comment.

commit 2f0a9419eff2113fb657823359cce4e772056846
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 14:26:05 2022 -0400

    Use MakeAlias() instead.

commit c599ecfd1ac633860437155db4f099560a183fd9
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 13:23:32 2022 -0400

    Change BaseLayer names for consistency.

commit 6d69ea45f10f6724561d2dd2a20d58e8d32ba29a
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 13:18:45 2022 -0400

    Clear member that is not serialized on loading.

commit 8f52887fb5b48b6c061fcd602db97aefaa44b424
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 13:16:29 2022 -0400

    Remove inaccurate comments (we did not drop these).

commit cb952537e2d00a8cf7145d411ff640d836d1f631
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 13:15:56 2022 -0400

    Re-add removed files from CMake.

commit c75d03ff6fe60370096a79cdfd1be2f360b67e1a
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 13:14:53 2022 -0400

    Fix line spacing.

commit 8b2e57d32e8d6c107b073e8f46bb5f174732198e
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 13:13:46 2022 -0400

    Add a clarifying comment.

commit 454c746491b62741a548022bda86ebedaf7abb29
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 13:12:52 2022 -0400

    Remove duplicated `training` member.

commit c6443a43d830b601fe3d6f35ad25d7ffa3cba7a5
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 13:05:37 2022 -0400

    Add clarifying comment.

commit 6258a8a9fc69772d5c095aef0ca5689636709461
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 13:02:25 2022 -0400

    Clean up copy and move operators.

commit 07fc52fba3ef21aed85a45df78cfe32da1a2a2b8
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 12:51:28 2022 -0400

    Some clarifying comments.

commit 0ed1971ce042368414d8e135d77dad257d0fa27f
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 12:50:24 2022 -0400

    Some cleanups of Forward() and Backward().

commit cade46d605d9337fc35896f6423a66c193ec245a
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 12:31:27 2022 -0400

    Some extra paranoia about `Network()`.

commit 20bf26651c844115369954eee3315521c627bedd
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Apr 3 12:18:34 2022 -0400

    Use a separate file for forward declarations.

commit 56ef5ded8a693829f55aaaa993969245dfe969c5
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 21:38:09 2022 -0400

    Oops, revert unintentional CMake changes.

commit a1e3a8dbef1366ba7dbd59f16065406d13e3cc9c
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 21:28:20 2022 -0400

    Fix minor merge issues.

commit f52a977dacec04fb3ae13ac6cf973a618bd614bd
Merge: 835499fdb c4bb721ef
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 19:15:59 2022 -0400

    Merge remote-tracking branch 'origin/master' into ann-vtable

commit 835499fdb458b9fe1640dfee56964a29b61e55a9
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 18:39:16 2022 -0400

    Fix comments and remove working comments.

commit 2950e1ff9db3764c60c4df374a10458b75a0bf00
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 18:36:29 2022 -0400

    Add comments to RecurrentLayer implementation.

commit 23d27c9b5dd7b37bd658cbf05f57ced99d99553d
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 18:20:59 2022 -0400

    Minor style fixes.

commit e542fd8404457bd27e12a79137c4185fe9fa01d1
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 18:19:44 2022 -0400

    Comment default typedefs.

commit 59ca9428c6955a18fbfbcdebb2769b544b67d337
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 18:16:28 2022 -0400

    Add standardized comment about MatType.

commit cc4e3aec28d8cfa76ad0fe5dc3e36f5d4fd5b8fb
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 17:52:48 2022 -0400

    Use 'Type' and typedef conventions for loss functions.

commit 232ce2a3425cebd672c64bb43cfff0d5f84ece28
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 15:37:55 2022 -0400

    Adapt CustomLayer for testing to have only one template parameter.

commit 9a9b6334a6915844a35418c67e2dad5488c6bd0d
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 15:37:41 2022 -0400

    Adapt tests for FFNs and RNNs having only one template parameter for type.

commit 9fe42a2d7b94b04c3f6760ad06ce03371957d14a
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 15:37:21 2022 -0400

    Adapt to use only one template parameter.

commit 6ff20e912b2358cb715f3438baf7325e5c8a4e35
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 15:36:52 2022 -0400

    Adapt to use only one template parameter for types.

commit 48685d34830a25d14e2f7cb46451a48f0d593e7d
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Apr 2 15:36:34 2022 -0400

    Adapt to use only one template parameter.

commit d79df403822b57c5b7e7fc1bf83673ad48011404
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 1 19:42:59 2022 -0400

    Don't allow calling Parameters() on a layer with no weights.

commit 6cdd49ab8882800ab0c54b785067d1dca8945598
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 1 18:21:50 2022 -0400

    Some early attempts to adapt some tests.

commit 0d73c5e1e24adbf255d2586c7b08cf549920239a
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 1 18:20:50 2022 -0400

    Early attempts at refactoring Q-learning code.

commit a446c67f5a66108ff0a7ff1af544ab9f61c692a2
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Apr 1 17:53:47 2022 -0400

    Re-enable RLComponentsTest.

commit 263d1ddd15b9d19d0b8b378764521a309deced44
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Mar 30 22:07:37 2022 -0400

    Allow the Rho parameter to be reset (it should also be renamed).

commit bcbeebcf27fe7371b0c4de756fc1881967c8eba0
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Mar 30 22:07:27 2022 -0400

    Adapt CustomLayer.

commit 53d903d1c43620fbe1a94ba8d6d07d4c4565d0f6
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Mar 30 21:17:29 2022 -0400

    Re-add the callback tests.

commit f34513341fc2ba6a6929235346ee91006d82a449
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Mar 30 21:15:36 2022 -0400

    We don't have an arbitrary type; move() should be used here.

commit 5a2f58f5598ac64c8f1e7d4ae0a81640995d3287
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Mar 30 20:03:55 2022 -0400

    Re-enable some of the activation function tests.

commit 0155643e72931e6b59d47a17a9791a63a89d1b34
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Mar 30 19:36:28 2022 -0400

    Forward() should not worry about whether we are in single mode (that's only relevant for training).

commit b3b5c6827a685262ea12897564668dd356042c46
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Thu Mar 31 04:22:53 2022 +0530

    fixed issues in styling

commit ed424ae4a1670379803672f3aa130da207ff2fb3
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Mar 29 21:23:37 2022 -0400

    Remove other bits of boost::visitor things.

commit be9a3882ad568ef55ef49a3979226a65ccd2978e
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Mar 29 21:23:28 2022 -0400

    Comment out RBMNetworkTests.

commit f1f5fc7fa280574aee3f3daedbf4b042675542c7
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Mar 29 21:23:22 2022 -0400

    Fix includes for moved files.

commit 112016ccf18d116dc901825c247cb41995c8a09d
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Mar 29 21:22:59 2022 -0400

    Remove boost usage.
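
    The PR's central change, replacing `boost::variant` visitors with
    virtual dispatch, can be illustrated with a deliberately simplified
    sketch. Here a plain `std::vector<double>` stands in for Armadillo
    matrices, and `ScaleLayer` is a hypothetical layer, not one from
    mlpack; the point is only the dispatch mechanism.

    ```cpp
    #include <iostream>
    #include <memory>
    #include <vector>

    // Simplified stand-in for a matrix type (mlpack uses arma::mat).
    using MatType = std::vector<double>;

    // Abstract base class with a vtable; each call to Forward() is
    // resolved virtually instead of via a boost::variant visitor.
    class Layer
    {
     public:
      virtual ~Layer() { }
      virtual void Forward(const MatType& input, MatType& output) = 0;
    };

    // A trivial layer that scales its input, standing in for e.g. Linear.
    class ScaleLayer : public Layer
    {
     public:
      explicit ScaleLayer(double factor) : factor(factor) { }

      void Forward(const MatType& input, MatType& output) override
      {
        output.clear();
        for (double v : input)
          output.push_back(factor * v);
      }

     private:
      double factor;
    };

    int main()
    {
      // The network holds heterogeneous layers through base pointers.
      std::vector<std::unique_ptr<Layer>> network;
      network.push_back(std::make_unique<ScaleLayer>(2.0));
      network.push_back(std::make_unique<ScaleLayer>(0.5));

      MatType data{1.0, 2.0, 3.0}, tmp;
      for (auto& layer : network)
      {
        layer->Forward(data, tmp);  // vtable dispatch per layer
        data = tmp;
      }
      for (double v : data)
        std::cout << v << ' ';      // scaled by 2, then by 0.5
      std::cout << std::endl;
      return 0;
    }
    ```

    Compared with the variant approach, adding a new layer type no longer
    requires touching a central type list or visitor; it only requires
    deriving from the base class.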

commit 2da71b1f36ecbbda72a9531607d0d2af76c62038
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Mar 29 21:22:46 2022 -0400

    Move some files that aren't yet adapted or are unneeded.

commit 1998227d9ce759d477155f875b7d8bcba56f0507
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Mar 29 19:58:28 2022 -0400

    Fix header guard name.

commit f2a1b79e993c3d9c8ec529f06e304532487b13bb
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Mar 27 14:16:58 2022 -0400

    Fix incorrect step reference.

commit 7fa7eb15f0306c2709a5a403c5b9d5838763af2c
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Mar 27 12:16:33 2022 -0400

    Don't forget virtual destructor for LSTMType.

commit a501389238b455bf0b51b114acf20124896d06f9
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Mar 27 12:16:14 2022 -0400

    Make sure to be able to serialize RecurrentLayer.

commit 4c8364c16962208d8f8d0d16db0a002b529a13b3
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Mar 27 12:15:52 2022 -0400

    No need to serialize the training-time-only predictors and responses.

commit c219a57da34952dc33629834c99825eb16a0889a
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Mar 27 12:15:38 2022 -0400

    Reference local `results` instead of class-wide `responses`.

commit 9b7b7f875d50e1e54e86bc8e305e3f5facfba7fe
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Mar 27 12:15:20 2022 -0400

    Add copy and move operator implementation.

commit accce42b35996d1a3dcc5fa239d12d5df8c93cf1
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Mar 27 12:14:55 2022 -0400

    Oops, add RecurrentLayer to the repository.

commit df613499f3cb3238c2aba204a20fa32c732aa714
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Mar 25 11:59:49 2022 -0400

    Adapt test to use multiple epochs.

commit aa0ce85656745bd6d9a59a24302b1b2985b84eb6
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Mar 25 11:58:43 2022 -0400

    Fix alias computation.

commit 20b109fe42ae46cb4117962050075cf55737603d
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Mar 25 11:57:57 2022 -0400

    Set step correctly for forward pass.

commit 9ff1fb11f5e1539bcee2e51036ff6b6152e0ba33
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Mar 23 22:40:52 2022 -0400

    Test that the RNN and FFN give the same output for only one time step.

commit 66d929f92380644fd826c200e04177c0eec31fa9
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Mar 23 22:40:42 2022 -0400

    Handle series with only one time step.

commit 0c97b42d807d5739cd9a791833347d446a544d98
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Mar 23 17:58:28 2022 -0400

    Be sure to serialize rho and single too.

commit 76992e86d6be1ff91170a73c9489d845d358967f
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Mar 22 21:26:19 2022 -0400

    Add serialization for some more initializations.

commit d6e200ff37014105af2c9b09d2821f2bc11ace2f
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Mar 22 21:26:09 2022 -0400

    Fix include issue (this may not be the right fix).

commit 080dcfccb470251badda157887873d97d1b3405f
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Mar 22 21:25:59 2022 -0400

    Correct NumFunctions() implementation.

commit 4816d7ccfe1a7b36cd76c60db19a635678cc1c6a
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Mar 19 23:48:10 2022 -0400

    Some other minor bugfixes; now the tests pass.

commit 80efe31e5bb42f3f50ee33aa4d40364ba25215ad
Author: Ryan Curtin <ryan@ratml.org>
Date:   Thu Mar 17 22:06:01 2022 -0400

    Fix a few bugs---now LSTMBatchSizeTest works!
    (That doesn't mean that RNNs actually work...)

commit eab019541d44a23c6dd4dbb123cc3d463759dd05
Author: Ryan Curtin <ryan@ratml.org>
Date:   Thu Mar 17 21:12:19 2022 -0400

    Comment out tests that don't work yet.

commit 9ff8fd401d4fada458313152c55e041622d37fc6
Author: Ryan Curtin <ryan@ratml.org>
Date:   Thu Mar 17 21:11:47 2022 -0400

    Clean up (in some places) RNN implementation.

commit 5683066a660c948ef7b5363467191693ea3739ce
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Mon Mar 14 22:25:27 2022 +0530

    fixed styling issues

commit 68e40a6b23b4fcea525f4b7692c28195ac1aa67d
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Fri Mar 11 22:50:03 2022 +0530

    fix build issue by removing row vector assert condition

commit ca353778a444bc826daed20c10438d4fe4277c89
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Fri Mar 11 20:15:07 2022 +0530

    fix failed build

commit 6883c30cb263f3226f664c58cebe70f124e64e97
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Fri Mar 11 19:50:54 2022 +0530

    fixed redundancy in size-checks

commit 2685d6909743c650137b1e31558c5b818e0d6b01
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Fri Mar 11 19:30:22 2022 +0530

    fix matrix completion size-checks

commit f4db66192bf2d9210265f02fcce3b0b0b27653ac
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Fri Mar 11 12:19:37 2022 +0530

    remove incorrect checks in adaboost

commit 05d74baffd841e09a607bf3b03e721d442d29e59
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Thu Mar 10 16:30:49 2022 +0530

    fix size checks

commit 54f65060158fd883bb43fcb72d7062ce60c26cd3
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Thu Mar 10 12:24:15 2022 +0530

    fix styling issue

    Signed-off-by: eshaanagarwal <eshaan060202@gmail.com>

commit 8714c77a959e708a6100d6c1a4813c086185bcff
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Wed Mar 9 02:33:08 2022 +0530

    Add: size checks for kmeans and linear regression

commit 0a0681f2a578188f2b0ce0b878a3923e7accfcb6
Author: eshaanagarwal <eshaan060202@gmail.com>
Date:   Tue Mar 8 02:27:38 2022 +0530

    Add: Size checks for adaboost and matrix completion

    Signed-off-by: eshaanagarwal <eshaan060202@gmail.com>

commit bd08f9a92ec85a01b422d6c2e82329e5c3a4dc48
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Feb 16 22:18:54 2022 -0500

    Remove debugging output.

commit d8b72a862bbda1853613f3cd3e0a1a041a6fc27e
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Wed Feb 9 22:14:32 2022 -0500

    Some RNN refactoring.

commit 93fecce9c80aa13e4e940e6e94069923681dded0
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Feb 15 23:00:08 2022 -0500

    Significant cleanup of all layers.

    The ones not in not_adapted/ (other than the LSTM) are ready for review.

commit ce12cd3e25a9f53aec04f5c1f2161b9b42a57bdc
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Feb 9 22:44:07 2022 -0500

    Remove files from CMakeLists.txt.

commit ee117f355b491cbcfe4cb3b27ca18fe6a079b0a3
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Feb 9 22:43:38 2022 -0500

    Remove layer_traits.hpp since it's no longer needed.

commit 7b969f3014d8c575decf2436a99052c75fd06837
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Feb 9 22:41:54 2022 -0500

    Move two layers that got missed.

commit e2502cbb9c9b42a6787c9d24fdfc4f5063e26258
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Feb 9 22:41:09 2022 -0500

    Move unadapted layers into a separate directory (for organization).

commit f72340106ce78ec632925f8198fc7d6102b78f54
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Feb 9 22:09:08 2022 -0500

    Add some documentation about what will happen.

commit 637b93ddc048e23c74500fe62bc2e7ba7ef7a9bd
Merge: 2065e17df cf190e11f
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Feb 9 22:05:56 2022 -0500

    Merge remote-tracking branch 'origin/master' into ann-vtable

commit 2065e17df8467d83b91295b623a8b8810a14c2a3
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Feb 9 21:45:29 2022 -0500

    Add a test for uneven stride.

commit c532632ce970c6294ec151cd29d9cf6ae015788a
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Feb 5 20:41:43 2022 -0500

    Remove comments that turn out to be unnecessary to address.

commit 5e09ed16715e4b914e6dd172f093523ae1ca2033
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Feb 5 18:08:06 2022 -0500

    Fix implementations of NaiveConvolution for stride and dilation.

commit 5cd0e38eb6cbf5f0316e0f6ee7376b1ea783ad30
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Feb 5 18:07:39 2022 -0500

    Add tests for different strides and dilations.

commit 8c1a8185408328a0ab474c0dd9dc69186cfe5878
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Feb 2 22:55:35 2022 -0500

    Fix stride usage for backwards pass.

commit 110ed75b6f81022b94ea0f29db42b5bf9be881e7
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Feb 1 18:06:55 2022 -0500

    Fix some minor issues and merge problems.

commit a77d29021436e956eb6cd2e106e6a546f7172fdf
Author: Ryan Curtin <ryan@ratml.org>
Date:   Thu Jan 27 16:28:22 2022 -0500

    Add/fix final set of copy/move constructors/operators.

commit 11cc9a122b55095446662ad598617dba8356a2e1
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jan 26 22:29:45 2022 -0500

    Add and fix a bunch more copy/move constructors/operators.

commit ec06bcbbffc4f6cdb49af7c269b5acd85db5db87
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Jan 25 22:44:34 2022 -0500

    Start implementing copy and move constructors correctly.

commit f8421110394df1b8f1abdbf3826de683f5990cde
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Jan 25 22:44:25 2022 -0500

    Set sizes correctly for test.

commit 3f5e06e9c5fbe2b5c3d00f968f2a930bebbfd2e8
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Jan 22 11:20:35 2022 -0500

    Restructure PaddingTest for slight behavior changes.

commit abb1450102bdd748c229eedc15f6887dcb86d429
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Jan 21 23:32:35 2022 -0500

    Fix minor bugs in shape computation.

commit d600ece415fc5462956670462971d6c2b60e0164
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Jan 21 23:03:24 2022 -0500

    Fix failing padding tests.

commit 94b3294a5d250eb65ec175d4e09741bc5b31683b
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Jan 21 14:40:53 2022 -0500

    Fix some merge issues.

commit c3caed04907c0938d0e7e8becede9af7357a97f2
Merge: 09a28caa9 3264cd87e
Author: Ryan Curtin <ryan@ratml.org>
Date:   Thu Jan 20 20:23:26 2022 -0500

    Merge remote-tracking branch 'origin/master' into ann-vtable

commit 09a28caa902b5e38010f6bd103363cd28317b51e
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jan 19 17:41:18 2022 -0500

    Some minor bugfixes to FFN.  Some need further cleanup.

commit 96f97421f2e3cf938b6fadc9726ff25c2a0fd822
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jan 19 17:16:57 2022 -0500

    Fix bug in Gradient() implementation.

commit 6dc0596ad6b58204857c16a2eb04851ed76f0d0b
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jan 19 17:16:44 2022 -0500

    Consider bias term in gradient.

commit 9a71a61aac3e960a79481128ac96bdba82a5b085
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Jan 7 18:16:18 2022 -0500

    Fix setting of inputDimensionsAreSet.

commit a4ee2545f6f915634bb738b6051870afd88e8032
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Jan 4 20:12:34 2022 -0500

    Add a sanity check test.  It passes.

commit f5c4b9586070497008ce5e47a3ec8d6034558e17
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Nov 28 21:47:40 2021 -0500

    Remove debugging output.

commit d8fef7eed6b09594261f2505e0e2f5a0b356cfc0
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Nov 28 21:47:17 2021 -0500

    Adapt to new mlpack 4 conventions.

commit bda6e0969f3ee43dffb9e48735655b5af914c7d7
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Nov 28 21:46:59 2021 -0500

    Fix failing tests.

commit 8d1c05c36f1ec99f2ae37e2d7d2fc6cb94166f12
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Nov 28 21:46:44 2021 -0500

    Update to mlpack 4 conventions.

commit ead043c9b88945213ae408848c2b78189d28d13e
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Nov 28 21:45:41 2021 -0500

    Oops, make sure to add the bias in the forward pass.

commit eae19f7c28b11c63a360856463b5b71f16dda839
Author: Omar Shrit <omar@shrit.me>
Date:   Tue Nov 16 17:43:09 2021 +0000

    Finishing the AlphaDropout layer; tests are passing

    Signed-off-by: Omar Shrit <omar@shrit.me>

commit c19e39a152f36beea3b322c32327ec764c18897f
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Nov 14 22:30:39 2021 -0500

    Turns out the MaxPooling adaptation I did was wrong---this seems more correct.

commit a1fd4829a167591a35ea489ba4c7f67308977044
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Nov 14 22:30:15 2021 -0500

    Redo Convolution layer implementation.  I think this is right but not 100% sure.

commit ca010221e9432ba8f9aedb8b221e48dfae7a60c7
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Nov 14 22:29:26 2021 -0500

    Enable LeakyReLU layer.

commit 49954574e998ee136d5541d58e98436a3bcd8d35
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Nov 1 19:17:24 2021 -0400

    Add gradient test for Convolution layer... but it seems to work?

commit be5b4e5e016a23afe7d1856dee500710c04084e2
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Oct 20 17:46:03 2021 -0400

    Fix shape of input to reflect the number of input maps.

commit 19f06709d47ec9f2023649260337cf77a21dd640
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Oct 19 10:18:14 2021 -0400

    Use MakeAlias() to avoid accidental copies.

commit 081c32593d7e97b07439b7fb3f8eac0d3d007031
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Sep 28 11:35:41 2021 -0400

    Bias should be one per output map.

commit 58f1718be1c7893c9e23ee592c42d70268f655d6
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Sep 28 11:22:07 2021 -0400

    Huh, it seems like this fixes ConvolutionLayerPaddingTest.

commit 6fff4457c6f724bb60f1b2141fe2f20efd02c58c
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Sep 20 11:09:36 2021 -0400

    Fix two more tests by making sure the inputs and outputs are right.

commit ad2cb74d0553affb513ebe23be2d77f105fb02c7
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Sep 19 00:10:20 2021 -0400

    Fix some tests.

commit 051205d78f8a82ff90cba7b693050aac84d08ff5
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Sep 15 22:12:20 2021 -0400

    Now at least the tests don't segfault. :)

commit 1152c75d2341c21ccce9a69fe9eaf0807370291e
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Sep 15 13:59:43 2021 -0400

    Update some ann_layer tests.

commit f05590b8785da7c173bc08ca18ecd79f0b04b2ab
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Sep 15 13:59:29 2021 -0400

    Fix convolution.

commit 1cbdaa8ac5fb9b4067b64838e7b3eda8d27f4564
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Sep 15 13:58:09 2021 -0400

    Fix bug for uninitialized output.

commit 3a00d82a23608980312d21b6a803e75bef665ee9
Author: Ryan Curtin <ryan@ratml.org>
Date:   Thu Sep 9 12:02:31 2021 -0400

    Some cleanups for the convolution layer.

commit 72c0f58a27dc8160391407bc8324ea28d7145ad6
Author: Ryan Curtin <ryan@ratml.org>
Date:   Thu Sep 9 12:02:06 2021 -0400

    Fix some incorrect dimension usages.

commit a46542b25f0b8af1dd87cf3a13812554fbe16f51
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Sep 7 18:05:04 2021 -0400

    Too much writing Julia...

commit 06f7b34d7374acad4cbc4cf0432f35172beff5d2
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Sep 3 18:43:21 2021 -0400

    Fix max pooling bug.

commit c9df713091ee2f51f211a56e8d49ec5a7f2a8f29
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Aug 21 10:44:43 2021 -0400

    First attempt at refactoring Padding, MaxPooling, and Convolution.

commit 498afa0ab77f00f5d802e2d1d72bd5486bf682c0
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat Aug 7 11:15:14 2021 -0400

    Fix various bugs in the MultiLayer implementation.

commit 8105bb8583a3050174f22fb29f16ca36e81a256a
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Aug 4 21:19:54 2021 -0400

    Refactor to use MultiLayer inside an FFN.

commit 5f7a9cd684d9a1fd3b48ad8662655c92f60d2e82
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Mon Aug 2 18:19:58 2021 +0200

    Filter some 'unused' layers.

commit 8a4e5b9bdfe1554ccb95d6983d5bbd64f76b1ad9
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Aug 1 21:42:09 2021 -0400

    Refactor Highway (and fix MultiLayer).

commit 0375e7d57562f912c569c1130f5e0b079c6d8474
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Aug 1 21:42:00 2021 -0400

    Extra paranoia to avoid including boost::visitor...

commit d3972c16da12291b85954e7105e1ee5c23186959
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Aug 1 21:41:49 2021 -0400

    Just make sure boost isn't included...

commit 57884df9f6423a3b7bf2be7ba9b6c432e142f800
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Aug 1 21:41:33 2021 -0400

    Split out into convenience function.

commit d136ac3df302151a4a73c658252365ddd454fa56
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Aug 1 21:41:00 2021 -0400

    Hey, this is no longer needed! :)

commit be7eeb134d0cc4fd6e5ee74e3c736e6a2c4aa777
Merge: ce3c3bcee e3f4654a8
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Jul 23 16:45:08 2021 -0400

    Merge branch 'ann-vtable-attempt' into HEAD

commit e3f4654a8ab4c885d232ad35d15e3e3ca1a5243b
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Jul 23 16:43:46 2021 -0400

    Update RBF<> layer so tests pass.

commit ce3c3bcee6b08c9c3d79fed1c31b367897c37775
Merge: 34cf419bb 927fabff8
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Tue Jul 20 08:56:37 2021 -0400

    Merge pull request #2 from rcurtin/ann-vtable-attempt

    Further refactoring of ANN to remove boost::visitor.

commit 927fabff8afdc2480180905546dbe5ae038fc316
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Jul 13 17:53:33 2021 -0400

    Adapt the last commented test in FeedforwardNetworkTest.

commit ed8881d2b6200e54828f0efbd2b7b2840838ee71
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Jul 13 17:48:46 2021 -0400

    Uncomment another test.

commit b05736d42ccc7b99f33e9fee8f9157a18d71e09f
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Jul 13 17:48:34 2021 -0400

    Make sure Parameters() returns the correct thing.

commit 522ebd11fe8734dfdeae63902275215fc9f894ed
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Jul 13 17:48:27 2021 -0400

    Adapt AddType<>.

commit c8c0797c9d26f5edda44007b3da247ebbf1f3582
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Jul 13 17:19:57 2021 -0400

    Adapt a few more layers, and uncomment some more tests.

commit 2b7429594f523588763a4cfbee75eea109ac78a1
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Jul 13 16:47:06 2021 -0400

    Oops, I didn't really need to refactor this, but it might work.

commit ab67249f9fc04be089bead0e064f6462d328c653
Author: Ryan Curtin <ryan@ratml.org>
Date:   Thu Jul 8 17:49:57 2021 -0400

    Fix additional warnings.

commit a5bb31b82421d9481ea254b754e11a79aedf9eb0
Author: Ryan Curtin <ryan@ratml.org>
Date:   Thu Jul 8 17:38:04 2021 -0400

    Remove debugging output.

commit 188759042cc83e26429014387273a982dac3a3d0
Author: Ryan Curtin <ryan@ratml.org>
Date:   Thu Jul 8 17:37:42 2021 -0400

    Fix a compilation warning.

commit 130890c85d3c178d6fd119d10046fe84a01a6735
Author: Ryan Curtin <ryan@ratml.org>
Date:   Thu Jul 8 17:36:09 2021 -0400

    Some additional refactoring and cleanups.

    Notably, the adapted layers no longer need an input size.

commit 32b19eb7efe99daf2b783f0aeec9e5432521a341
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jul 7 19:08:32 2021 -0400

    Refactor Reparametrization layer.

commit d180cc36393c1bfea4bd66d8a9df0efb05c8e082
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jul 7 19:08:12 2021 -0400

    Serialize output dimensions also.

commit a87672665caaf8d04ce4e8a95b8c799035cff049
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jul 7 18:24:15 2021 -0400

    Remove unnecessary copy/move constructor/operators.

commit af68997a6cdf0fcdea18db218fbd0a98ddf77b3f
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jul 7 18:22:30 2021 -0400

    This function should be const.

commit 60fe292da9f0f2e328d8c2110b27799cf843510d
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jul 7 18:22:10 2021 -0400

    These are all the default versions anyway (but don't consider inheritance...).

commit 30e4ff741b6bd16cb756d5205d744f36d0a11f44
Author: Ryan Curtin <ryan@ratml.org>
Date:   Tue Jul 6 16:22:37 2021 -0400

    Update comments.

commit 7f7f56481f951383edff9cc1c8097fc710e7096c
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Jul 5 21:53:33 2021 -0400

    Remove unnecessary functions.

commit 30aca349cff4054bf66671882297e000ba585c5c
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Jul 5 19:26:55 2021 -0400

    Fix train/test modes.

commit ffba7a966c069aef7f6675d4abc971a4636d7a35
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Jul 5 18:52:20 2021 -0400

    Remove unnecessary utilities.

commit a3dd3739c631c9a9ce6a2ab91e74b9231ae90d46
Author: Ryan Curtin <ryan@ratml.org>
Date:   Mon Jul 5 18:51:18 2021 -0400

    Change 'deterministic' to 'training'.

commit afa76d6ae8c503ce5d823fc17251bc015f20f653
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sun Jul 4 16:05:47 2021 -0400

    Make sure that we save layerOutputs.back() in case we need it later...

commit 2875194721172dfdd0f7ac4f1a746fb85b3dc750
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jun 23 19:58:38 2021 -0400

    Fix FFNReturnModel test.

commit 292bfbef9ca1e6b6d0b5ff26b8afaba435d9f28e
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jun 23 19:32:34 2021 -0400

    Serialize deterministic in whatever state it is currently in---no assumptions.

commit a30e15fca7afef8b1eac736cc8531a744297430b
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jun 23 19:32:20 2021 -0400

    Remove unnecessary output.

commit 5d27a4d43ed4399247075891c36bd026a2dbb71c
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jun 23 19:24:36 2021 -0400

    Add the 'MultiLayer', although maybe we can just use the FFN class itself?

commit c2b54b7af68eb38eac11f08bedffd6342dbf7ef0
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jun 23 19:24:21 2021 -0400

    Set the size correctly in Predict() and fix a few other errors.

commit 0c307e89dee16541a828756967e646daebe74dcd
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Jun 18 14:11:47 2021 -0400

    Initialize totalInputSize and totalOutputSize in the right place.

commit d2fd462a8c8f52c3a89098173860f30c6dd058f7
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Jun 18 13:51:46 2021 -0400

    Use aliases for layer outputs and deltas.

commit 064cb7b2966e0072f795c06fe7799bfcdd9e772e
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri Jun 18 12:42:29 2021 -0400

    Okay, this passes FFVanillaNetworkTest!

commit 9a521c8080d6bdaddd4674b7c74052b5b3c66403
Author: Ryan Curtin <ryan@ratml.org>
Date:   Wed Jun 16 14:07:23 2021 -0400

    Step 1: something compiles at all.

commit 5220de7d143e911bd5dc5c91e78ba80d9c22bdfc
Author: Ryan Curtin <ryan@ratml.org>
Date:   Fri May 28 12:52:21 2021 -0400

    Fix some minor compilation issues.

commit f8123469e9e1aa82e25c4b3a692944a72c662df4
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat May 22 05:27:37 2021 -0400

    Add serialization file.

commit 1b3ea01b45f5059df3dfffbe009d588a8f3adab9
Author: Ryan Curtin <ryan@ratml.org>
Date:   Sat May 22 05:19:41 2021 -0400

    In-progress, does not quite compile yet.

commit 34cf419bb86658f137ee2414879e4be9e514d9f3
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Sun Jan 31 19:05:00 2021 +0100

    Update FFN tests to use the base layer class.

commit b700f8d311df2e04a3646cf4b752f16185da939b
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Sun Jan 31 04:36:35 2021 +0100

    Update FFN copy/move constructor tests to use the layer base class.

commit eafac8609db48d6b4e2002add74cae1255e58c3d
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Sat Jan 30 04:39:08 2021 +0100

    Add Clone() function which handles polymorphism correctly.
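
    The Clone() idea can be sketched as follows. This is a minimal illustration of
    the virtual-clone pattern the commit describes, not mlpack's actual Layer
    interface; the class and method names below (other than Clone()) are
    hypothetical.

    ```cpp
    #include <cassert>
    #include <memory>
    #include <vector>

    // Minimal base class with a virtual Clone(): copying through a base
    // pointer dispatches to the derived type, so a heterogeneous container
    // of Layer* can be deep-copied correctly.
    class Layer
    {
     public:
      virtual ~Layer() { }
      virtual Layer* Clone() const = 0;
      virtual void Forward(const std::vector<double>& input,
                           std::vector<double>& output) = 0;
    };

    // Example derived layer: passes its input through unchanged.
    class Identity : public Layer
    {
     public:
      Layer* Clone() const override { return new Identity(*this); }
      void Forward(const std::vector<double>& input,
                   std::vector<double>& output) override
      {
        output = input;
      }
    };

    int main()
    {
      // Copy a layer through the base-class pointer; the copy is a
      // fully functional Identity, not a sliced Layer.
      std::unique_ptr<Layer> original(new Identity());
      std::unique_ptr<Layer> copy(original->Clone());

      std::vector<double> in{1.0, 2.0}, out;
      copy->Forward(in, out);
      assert(out == in);
      return 0;
    }
    ```

    Without a virtual Clone(), a plain copy through `Layer*` would slice off the
    derived state, which is why the base class needs to delegate copying to the
    derived type.
    
    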

commit 0c4db57106e72bee2bfb9e392e744073fda79c6e
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Fri Jan 29 10:14:53 2021 +0530

    typo fix

commit 21057324f3f001b5a3fb633d3ce8b0e5d519d478
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Thu Jan 28 16:21:27 2021 +0530

    add ResetCell and Reward methods to base class and add method to push layer to ffn model

commit 43e7639e614ffada15121ae3f2f9d09590f0f6c7
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Tue Jan 26 00:47:55 2021 +0100

    Use layer base class for the network initialization.

commit fadaaa59a27188377122959b651bdd1a14e41b49
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Mon Jan 25 00:04:26 2021 +0100

    Update layer to use updated layer base class interface.

commit 3cdd972be8ecf413eb40ceef9a526d676d984382
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Sun Jan 24 23:17:26 2021 +0100

    Adjust deterministic parameter interface.

commit 5b453b712a193fca5dc8161b3891a4bc1bc78c33
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Sun Jan 24 23:14:04 2021 +0100

    Add utility functions to update layer parameters and states.

commit c641bc527c1034a54a6574e5d56b621083d36c7c
Author: Marcus Edel <marcus.edel@fu-berlin.de>
Date:   Sun Jan 24 23:13:09 2021 +0100

    Restructure FFN class to use the layer base class.

commit 9c29e5b6031b41a796ede96bad2852021afa3caa
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Tue Jan 19 21:17:32 2021 +0530

    update weight_norm layer to use abstract class, and some other fixes

commit 5d62339e02e48440124776ae7d300a3b4715ca92
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Sun Jan 17 18:04:32 2021 +0530

    update fast_lstm to use abstract class (without unit test)

commit f2f84991ce434bf9a45aa82b1f84533409102631
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Sun Jan 17 17:30:36 2021 +0530

    migrate vr_class_reward to loss_functions

commit ecd2fae814a59bd23fe615037c7bd696c590ddc9
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Sat Jan 16 10:32:18 2021 +0530

    update reinforce_normal and reparametrization layer to use abstract class

commit 826be5b404357521ea3c1643d94eba96cf12b383
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Fri Jan 15 23:29:21 2021 +0530

    update virtual_batch_norm to support abstract class

commit f694114805e295bcb3a2358322e5f1875e7d68e5
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Fri Jan 15 23:14:22 2021 +0530

    update select, subview, padding and transposed convolution to use abstract class

commit dfec0f787d90c70c9a64e61dc0f2caf09fb1e365
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Fri Jan 15 18:28:43 2021 +0530

    updated sequential layer to use abstract class

commit c6675f4ba33272d389719a6d26f14443cae0bbb8
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Fri Jan 15 16:34:30 2021 +0530

    update concat and highway layers to use abstract class

commit 0da7330f012525585502d7962c4ff0c51b2dd2b3
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Fri Jan 15 09:11:57 2021 +0530

    update positional_encoding and multiply_merge layers to use abstract class

commit c30d9ed85c98063885acc8200907e5243d1dba7b
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Fri Jan 15 07:18:52 2021 +0530

    slight style fixes and lexicographical ordering of layer_types

commit cfdb84173aa6a2eedaf1f105f9a96db90d0ca2d4
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Thu Jan 14 11:22:58 2021 +0530

    some corrections and updating layer_norm, max_pooling and mean_pooling

commit 97d865df79f7dad9d0bcd97480af2e6e7f477040
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Thu Jan 14 10:25:47 2021 +0530

    correction in convolution and update join and glimpse layer to use abstract class

commit 270aab791eee2efc58e27c17ad06d81982764278
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Thu Jan 14 10:05:39 2021 +0530

    update glimpse layer to use abstract class method

commit 28d56b1693d8245c241b3dfe9fcf4a8fb7595988
Author: Mrityunjay Tripathi <mrityunjay2668@gmail.com>
Date:   Wed Jan 13 10:52:22 2021 +0530

    corrected documentation and updated convolution layer

commit 09e299534615628b02e79aa92fe18ce…