hi. I want to learn a diffusion map with a custom kernel.
I have my own Gram matrix (I can't explicitly build the feature vectors...).
Could you add this, or can I make a PR, to allow this to take as input a Gram matrix whose element (i, j) is k(x_i, x_j), with k the kernel?
```julia
function fit(::Type{DiffMap}, X::AbstractMatrix{T}; maxoutdim::Int=2, t::Int=1,
             α::Real=0.0, ɛ::Real=1.0, custom_gram::Bool=false) where {T<:Real}
    if !custom_gram
        # build the Gaussian kernel from X, as before
        sumX = sum(X.^2, dims=1)
        K = exp.(-(transpose(sumX) .+ sumX .- 2*transpose(X) * X) ./ convert(T, ɛ))
    else
        # X is already the Gram matrix, with K[i, j] = k(x_i, x_j)
        K = X
        @assert size(K, 1) == size(K, 2) "Gram matrix must be square!"
        @assert all(K .>= 0.0) "Gram matrix entries must be nonnegative!"
        @assert K == K' "Gram matrix must be symmetric!"
    end
    # ... (rest of fit unchanged)
```
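To illustrate the intent of the proposed `custom_gram` keyword (the flag is the proposal here, not existing API; `my_graph_kernel_gram_matrix` is a hypothetical helper standing in for however you build your Gram matrix), usage might look like:

```julia
using ManifoldLearning

# hypothetical call, assuming the proposed `custom_gram` keyword is merged:
K = my_graph_kernel_gram_matrix()   # n×n, symmetric, nonnegative
M = fit(DiffMap, K; maxoutdim=2, custom_gram=true)
```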
thx, I wanted to change this for our research project anyway, so glad I could try to help out. Let me know if you want me to change anything or have improvements you want me to make.
[We are using it for the case where the Gram matrix comes from a graph kernel, so the X input argument does not work for us.]
https://github.com/wildart/ManifoldLearning.jl/blob/master/src/diffmaps.jl#L66
something like the below?
https://inside.mines.edu/~whereman/talks/delaPorte-Herbst-Hereman-vanderWalt-DiffusionMaps-PRASA2008.pdf
going off of the above reference.
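To make the connection to that reference concrete, here is a self-contained sketch of the diffusion-map normalization it describes, starting from a precomputed symmetric, nonnegative Gram matrix (function name and defaults are my own, not the package's):

```julia
using LinearAlgebra

# Sketch of diffusion coordinates from a Gram/kernel matrix K,
# following the α-normalization in de la Porte et al. (2008).
function diffusion_coords(K::AbstractMatrix{<:Real}; α::Real=0.5, t::Int=1, d::Int=2)
    # α-normalization: Kα = D^{-α} K D^{-α}, with D = diag(row sums of K)
    D  = vec(sum(K, dims=2))
    Kα = K ./ ((D .^ α) .* (D .^ α)')
    # row sums of the α-normalized kernel
    Dα = vec(sum(Kα, dims=2))
    # S = Dα^{-1/2} Kα Dα^{-1/2} is a symmetric conjugate of the
    # row-normalized Markov matrix M = Dα^{-1} Kα (same eigenvalues)
    S = Kα ./ (sqrt.(Dα) .* sqrt.(Dα)')
    λ, V = eigen(Symmetric(S))
    idx = sortperm(λ; rev=true)          # largest eigenvalues first
    λ, V = λ[idx], V[:, idx]
    # recover M's right eigenvectors and drop the trivial constant one
    Ψ = V ./ sqrt.(Dα)
    return (λ[2:d+1] .^ t)' .* Ψ[:, 2:d+1]   # n × d diffusion coordinates
end
```

Working through the symmetric conjugate rather than `M` directly keeps the eigenproblem real and symmetric, which is both faster and numerically safer.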
edit: I now understand that it will normalize the rows :) thx!