I was comparing Variational Logistic Regression and the Relevance Vector Classifier, and although RVC seems to be the more complicated model, it is much faster to fit than Variational Bayesian Logistic Regression. Are there any implementation problems?
The Relevance Vector Classifier uses the Laplace approximation, which is much faster than the Local Variational Approximation used in Variational Logistic Regression (though also less accurate).
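For intuition, here is a minimal sketch (not the library's actual code) of the Laplace approximation for Bayesian logistic regression: Newton iterations find the MAP weights, and the inverse Hessian at the mode serves as the Gaussian posterior covariance. Only the d-dimensional weight vector is optimized, with no per-observation latent variables; the prior precision `alpha` and iteration count are illustrative choices.

```python
import numpy as np

def laplace_logistic(X, y, alpha=1.0, n_iter=25):
    """Toy Laplace approximation for Bayesian logistic regression.

    X: (n, d) design matrix, y: labels in {0, 1},
    alpha: precision of the isotropic Gaussian prior on the weights.
    Returns (posterior mean, posterior covariance).
    """
    n, d = X.shape
    w = np.zeros(d)
    H = alpha * np.eye(d)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))                    # sigmoid predictions
        g = X.T @ (p - y) + alpha * w                       # grad of neg. log posterior
        H = (X.T * (p * (1.0 - p))) @ X + alpha * np.eye(d) # Hessian at current w
        w = w - np.linalg.solve(H, g)                       # Newton step
    S = np.linalg.inv(H)  # Gaussian covariance = inverse Hessian at the mode
    return w, S
```

The whole cost is a handful of d-by-d Newton steps, which is why it scales well with the number of samples.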
The main difference (apart from the ARD prior) is that Variational Logistic Regression needs to optimize a latent local variational parameter for EACH OBSERVATION, so it is necessarily slow on large datasets.
As general advice: prefer the Laplace approximation when you have a large number of samples; for smaller datasets the Local Variational Approximation is preferable.
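To make the per-observation cost concrete, here is a hedged sketch (again, not the library's code) of the Jaakkola-Jordan local variational updates for Bayesian logistic regression: every pass re-optimizes one variational parameter `xi_n` per data point, on top of the d-by-d posterior update.

```python
import numpy as np

def lam(xi):
    # Jaakkola-Jordan bound coefficient: lambda(xi) = tanh(xi/2) / (4 xi)
    return np.tanh(xi / 2.0) / (4.0 * xi)

def fit_vb_logistic(X, y, alpha=1.0, n_iter=50):
    """Toy local variational (Jaakkola-Jordan) Bayesian logistic regression.

    X: (n, d) design matrix, y: labels in {0, 1},
    alpha: precision of the isotropic Gaussian prior.
    Returns (posterior mean, posterior covariance).
    """
    n, d = X.shape
    t = y - 0.5              # shifted targets
    xi = np.ones(n)          # ONE local variational parameter per observation
    for _ in range(n_iter):
        # Posterior precision couples all n local parameters
        S_inv = alpha * np.eye(d) + 2.0 * (X.T * lam(xi)) @ X
        S = np.linalg.inv(S_inv)
        m = S @ (X.T @ t)
        # Re-optimize every xi_n: xi_n^2 = x_n^T (S + m m^T) x_n
        A = S + np.outer(m, m)
        xi = np.sqrt(np.einsum('ij,jk,ik->i', X, A, X))
    return m, S
```

The `xi` update touches all n rows of X on every iteration, which is exactly the O(n) latent-parameter overhead that Laplace avoids.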
You are still right that VLR is very slow for high-dimensional inputs. I updated the code: instead of using a pseudo-inverse I now use a Cholesky decomposition, which avoids costly dot products and makes the code a bit faster.
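The idea behind that change can be sketched as follows (function names here are illustrative, not the library's API): rather than forming `pinv(S_inv)` and multiplying, factor the symmetric positive-definite precision matrix once and reuse the factor for triangular solves.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def posterior_mean_pinv(S_inv, b):
    # slow path: explicit pseudo-inverse followed by a dense matrix-vector product
    return np.linalg.pinv(S_inv) @ b

def posterior_mean_chol(S_inv, b):
    # faster path for SPD matrices: one Cholesky factorization,
    # then cheap forward/back substitution; no inverse is ever formed
    c, low = cho_factor(S_inv)
    return cho_solve((c, low), b)
```

Both return the same vector for a well-conditioned SPD precision matrix, but the Cholesky route has a much smaller constant and is numerically better behaved.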
If you think there are other places for improvement let me know!