
ARD and Gradient Search Vector Learning #649

Merged
8 commits merged into shogun-toolbox:master on Jul 18, 2012

Conversation

puffin444
Contributor

This patch adds support for gradient search over vector-valued parameters. One application is the Linear and
Gaussian ARD kernels. Results appear to be similar to those of GPML.

  1. In GaussianARD and LinearARD, when I calculate the kernel gradient, should I check every time
    whether the left- and right-hand features are NULL, or is that already done in the superclass?
  2. What do you think of these two classes? Do you think I can consolidate code through inheritance? They are very similar except for compute() and the gradient calculation.
  3. Right now ModelSelectionParameters can generate a ParameterCombination through
    get_single_combination, but not yet through get_combinations, mostly due to point (4) below. Heiko, do you have any suggestions on integrating vectors/sg_vectors into the get_combinations function?
  4. Right now the ModelSelectionParameters framework builds the vectors in a ParameterCombination by pointing at a vector allocated somewhere else. The problem with this approach is that calling get_single_combination twice yields two trees that share exactly the same vector, so it is impossible to generate two trees with different vector values. Should I try to allocate new vectors in get_single_combination?
  5. The sigma gradient has a large effect on the overall gradient at small parameter values, which sometimes throws off the check_gradient() function. Is it possible to scale the delta and error_tol values by the actual values of the parameters?

puffin444 added 6 commits July 14, 2012 16:48
Conflicts:
	src/shogun/evaluation/GradientResult.h
	src/shogun/kernel/GaussianARDKernel.cpp
	src/shogun/kernel/LinearARDKernel.cpp
	src/shogun/kernel/LinearARDKernel.h
	src/shogun/modelselection/GradientModelSelection.cpp
	src/shogun/modelselection/ModelSelectionParameters.cpp
	src/shogun/modelselection/ModelSelectionParameters.h

using namespace shogun;

void print_message(FILE* target, const char* str)
{
	fprintf(target, "%s", str);
}
Member


By the way, if you init shogun with init_with_defaults() you don't need this method; take a look at the other examples.

@karlnapf
Member

1.) I think these checks hurt nobody.
2.) Yes, I think inheritance might be good here; it reduces code and therefore the potential for errors.
3.) Not yet, I will think about it.
4.) Same, let's talk about this soon.
5.) I don't really get what you are asking here :)

puffin444 added 2 commits July 17, 2012 15:57
Conflicts:
	examples/undocumented/libshogun/regression_gaussian_process_ard.cpp
	src/shogun/evaluation/GradientResult.h
	src/shogun/kernel/GaussianARDKernel.cpp
	src/shogun/kernel/GaussianARDKernel.h
	src/shogun/kernel/LinearARDKernel.cpp
	src/shogun/kernel/LinearARDKernel.h
	src/shogun/modelselection/GradientModelSelection.cpp
@puffin444
Contributor Author

Okay. Cleaned some stuff up. What do you think?

karlnapf added a commit that referenced this pull request Jul 18, 2012
ARD and Gradient Search Vector Learning
@karlnapf karlnapf merged commit 1e888a0 into shogun-toolbox:master Jul 18, 2012
@karlnapf
Member

I think it's nice!
