Example notebook: kernel ridge regression #320
Conversation
# $$
# \mathbf{w} = (\mathrm{X}^\top \mathrm{X} + \lambda \mathbb{1})^{-1} \mathrm{X}^\top \mathbf{y}
# $$
# using the [matrix inversion lemma](https://tlienart.github.io/pub/csml/mtheory/matinvlem.html#basic_lemmas)
Maybe rather link to the Wikipedia article about the Woodbury matrix identity?
I still think it would be more natural to link to Wikipedia?
I found the blog post more instructive/helpful than the Wikipedia article (and the latter is straightforward to find for anyone who wants it).
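For reference, the step the lemma is invoked for can be sketched as follows (a sketch using the push-through form of the matrix inversion lemma, in the notation of the formula above, with $\mathrm{K} = \mathrm{X}\mathrm{X}^\top$ as defined later in the notebook):

```latex
% Push-through identity (special case of the matrix inversion lemma):
% (X^T X + \lambda 1)^{-1} X^T = X^T (X X^T + \lambda 1)^{-1}
\mathbf{w}
  = (\mathrm{X}^\top \mathrm{X} + \lambda \mathbb{1})^{-1} \mathrm{X}^\top \mathbf{y}
  = \mathrm{X}^\top (\mathrm{X}\mathrm{X}^\top + \lambda \mathbb{1})^{-1} \mathbf{y}
  = \mathrm{X}^\top (\mathrm{K} + \lambda \mathbb{1})^{-1} \mathbf{y}
```

This is the step that turns the $d \times d$ solve in feature space into an $n \times n$ solve involving only the Gram matrix.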
…tions.jl into st/examples--kernel-ridge-regression
I have no objection to this being merged once CI is made to pass.
else
    title = string(nameof(typeof(kernel)))
end
scatter(x_train, y_train; label=nothing)
I didn't know that `label=nothing` works; I always use `label=""` to hide the label for a specific series. However, it seems you want to remove the labels for all series here, in which case I guess

- scatter(x_train, y_train; label=nothing)
+ scatter(x_train, y_train; legend=false)

is simpler.
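For context, the two options under discussion can be sketched as follows (a minimal Plots.jl sketch with made-up stand-in data; `x_train`/`y_train` mimic the notebook's variables):

```julia
using Plots

# Hypothetical stand-in data for the notebook's training set.
x_train = randn(10)
y_train = sin.(x_train)

# label="" (or label=nothing) hides the label of this one series only;
# any other series would still appear in the legend.
scatter(x_train, y_train; label="")

# legend=false hides the legend for the whole plot, so per-series
# label keywords become unnecessary.
scatter(x_train, y_train; legend=false)
```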
p = plot!(
    x_test,
    y_pred;
    label=nothing,
This is not needed if `legend=false`:

- label=nothing,
Co-authored-by: David Widmann <devmotion@users.noreply.github.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
This looks really good! Great job!
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
…iaGaussianProcesses/KernelFunctions.jl into st/examples--kernel-ridge-regression
##
# ## Kernel ridge regression
# Instead of constructing the feature matrix explicitly, we can use *kernels* to replace inner products of feature vectors with a kernel evaluation: $\langle \phi(x), \phi(x') \rangle = k(x, x')$ or $\mathrm{X} \mathrm{X}^\top = \mathrm{K}$, where $\mathrm{K}_{ij} = k(x_i, x_j)$.
Here, talking about $\mathrm{X}\mathrm{X}^\top = \mathrm{K}$ is a bit misleading; it does not sound like we are using any feature mapping. Maybe $\phi(\mathrm{X})\phi(\mathrm{X})^\top$?
Well, `X` is the "extended" `x` with feature columns; each column of `X` is one $\phi$ applied to the input `x`.
But it's a bit ambiguous, you're right. Maybe call the "featurized X" $\tilde{X}$ instead?
NB: `XX' = K` is exact for the `LinearKernel` `k(x,x') = xx'` :)
With no constant :)
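To make the point above concrete, here is a hedged, stdlib-only sketch (hand-rolled kernels on toy data, not the KernelFunctions.jl API from the PR): for the constant-free linear kernel the Gram matrix really is `X X'` exactly, and the kernelized ridge solution only ever touches `K`:

```julia
using LinearAlgebra

# Toy 1-D inputs and targets (hypothetical, for illustration only).
x = [-1.0, 0.0, 1.0, 2.0]
y = sin.(x)

# Linear kernel with no constant: k(x, x′) = x * x′, so the Gram matrix
# equals X X' exactly, as noted in the thread above.
klin(a, b) = a * b
K_lin = [klin(a, b) for a in x, b in x]
@assert K_lin ≈ x * x'

# Kernel ridge regression with a squared-exponential kernel:
# solve α = (K + λ1)⁻¹ y, then predict f(x⋆) = Σᵢ αᵢ k(x⋆, xᵢ).
krbf(a, b; ℓ=1.0) = exp(-abs2(a - b) / (2ℓ^2))
λ = 1e-3
K = [krbf(a, b) for a in x, b in x]
α = (K + λ * I) \ y
predict(x⋆) = sum(α[i] * krbf(x⋆, x[i]) for i in eachindex(x))
```

With the small ridge `λ`, `predict` nearly interpolates the training targets, without ever forming an explicit feature matrix.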