Tests for Stochastic Optimization #894
I would like to work on this one. I have started reading the paper; I will create a PR soon.
I read the paper and mostly understood it. So I can build an error surface by combining these basic shapes, right? And when I run, say, SGD on it, which Evaluate function should be called (does it depend on xs <= x <= xe)? After that we evaluate the loss on this local function and then take a step along the gradient of this local function, is that it?
You are right. Right now we have some simple functions to test the optimization algorithms, like: https://github.com/mlpack/mlpack/blob/master/src/mlpack/core/optimizers/sgd/test_function.hpp which are great, but this ticket would allow us to define a bunch of different test functions just by combining different function generators. Say you'd like to test on a non-convex Gaussian line function; all you need is a Line function generator and another generator that creates the non-convex part of the Gaussian function.
@zoq Can I work on this one? No one has made a PR yet.
@sinjan25 you can work on this too. I will be implementing the LineFunction shape prototypes; maybe you could do a non-convex shape prototype.
@zoq @kris-singh Alright! I will then be working on a generator that creates the non-convex part of the Gaussian function.
@zoq I implemented the line function. I now want to know how we would write a decomposable function, which would activate the right local function based on the given x1, so we would need to check every time. Can this checking be done at compile time, or how should I go about implementing the decomposable function? Also, in the SGD implementation we are given the iterate (which can basically be training points or functions), but what I don't understand is: if these are functions, then what are the points these functions are evaluated on?
@zoq can you answer this? Thanks.
We have to provide a Function that is able to combine different generators and that provides the same interface as e.g. the RosenbrockFunction. x1 is a single input point or vector that has to be specified at build time. Right now we use GetInitialPoint(), which comes with each function and provides some input for the evaluation. I'm not sure what you mean by the checking of x1; can you elaborate further?
Hi! |
Hello @shikharbhardwaj there are a couple of different and interesting generators that could be implemented, so feel free to work on this issue. |
Thanks for the reply @zoq. How exactly would the generator interface look? The line function interface you posted generalises the line function over the arbitrary number of dimensions the user wishes to use. The way I was thinking about the problem was to define the function prototypes in a single dimension and then use something to combine them across dimensions. Any thoughts?
Covered by: #1151 |
mlpack implements a number of gradient-based optimization algorithms; each method is tested on a few simple functions to check that it works as expected on those particular functions.
Since implementing meaningful tests is probably the most underrated task, which often takes more time than implementing the method itself, it's 'fine' for now that the optimization algorithms are only shown to solve some simple functions. The goal of this ticket is to implement a collection of additional functions to test the optimization algorithms.
In "Unit Tests for Stochastic Optimization", Tom Schaul et al. proposed a bunch of really interesting basic functions and combinations that we could use as a basis for this ticket. A starting point could be the implementation of a simple generator, e.g. a Line function generator, that can be combined with others to create new functions: