Kullback-Leibler projections for Bayesian model selection in Python
[CVPR 2023] Modeling Inter-Class and Intra-Class Constraints in Novel Class Discovery
Experiments with the three PPO algorithms (PPO, clipped PPO, and PPO with KL penalty) proposed by John Schulman et al., evaluated on the 'CartPole-v1' environment.
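For orientation, a minimal PyTorch sketch of the two surrogate losses being compared (not the repository's code; the tensor names `logp_new`, `logp_old`, and `advantages`, and the default coefficients, are illustrative assumptions):

```python
import torch

def ppo_clip_loss(logp_new, logp_old, advantages, clip_eps=0.2):
    """Clipped surrogate objective (PPO-Clip)."""
    ratio = torch.exp(logp_new - logp_old)  # pi_new(a|s) / pi_old(a|s)
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1 - clip_eps, 1 + clip_eps) * advantages
    return -torch.min(unclipped, clipped).mean()  # negated: optimizers minimize

def ppo_kl_penalty_loss(logp_new, logp_old, advantages, beta=1.0):
    """Surrogate objective with a KL penalty (PPO-Penalty); the sample-based
    estimate E[logp_old - logp_new] approximates KL(pi_old || pi_new)."""
    ratio = torch.exp(logp_new - logp_old)
    approx_kl = (logp_old - logp_new).mean()
    return -(ratio * advantages).mean() + beta * approx_kl
```

In Schulman et al.'s adaptive variant, `beta` is increased when the measured KL exceeds a target and decreased when it falls below it.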
PyTorch implementations of the beta divergence loss.
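For reference, the beta divergence that such a loss computes can be written down in a few lines of PyTorch (a sketch following the usual textbook definition, not the repository's implementation):

```python
import torch

def beta_divergence(x, y, beta=1.5, eps=1e-10):
    """Beta divergence d_beta(x || y), summed over all entries.
    beta -> 1 recovers the generalized KL divergence, beta -> 0 the
    Itakura-Saito divergence, and beta = 2 half the squared Euclidean
    distance."""
    x = x.clamp(min=eps)  # avoid log(0) and division by zero
    y = y.clamp(min=eps)
    if beta == 1:
        return (x * (x / y).log() - x + y).sum()
    if beta == 0:
        return (x / y - (x / y).log() - 1).sum()
    return ((x.pow(beta) + (beta - 1) * y.pow(beta)
             - beta * x * y.pow(beta - 1)) / (beta * (beta - 1))).sum()
```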
Non-Negative Matrix Factorization for Gene Expression Clustering
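This fits the topic because classical NMF for expression data minimizes the generalized Kullback-Leibler divergence between X and WH; a minimal sketch with scikit-learn (the matrix shape, component count, and synthetic data are assumptions for illustration):

```python
import numpy as np
from sklearn.decomposition import NMF

# Stand-in expression matrix: 200 genes (rows) x 30 samples (columns).
rng = np.random.default_rng(0)
X = rng.poisson(5.0, size=(200, 30)).astype(float)

# The multiplicative-update ('mu') solver is required for the KL beta loss.
model = NMF(n_components=3, beta_loss='kullback-leibler', solver='mu',
            max_iter=500, random_state=0)
W = model.fit_transform(X)   # gene loadings per metagene
H = model.components_        # metagene activity per sample
clusters = H.argmax(axis=0)  # assign each sample to its dominant metagene
```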
This project implements in Python several common statistical methods used in data analysis, including entropy, mutual information, the Kolmogorov–Smirnov test, the Kullback-Leibler divergence (KLD), and A/B tests (the Mann-Whitney U and t-tests).
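A minimal sketch of how those quantities are commonly computed with NumPy/SciPy (the synthetic control/treatment samples and the bin count are assumptions for illustration):

```python
import numpy as np
from scipy import stats
from scipy.special import rel_entr

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 1000)  # e.g. control group
b = rng.normal(0.3, 1.0, 1000)  # e.g. treatment group

# Two-sample tests used in A/B analysis.
ks_stat, ks_p = stats.ks_2samp(a, b)
u_stat, u_p = stats.mannwhitneyu(a, b)
t_stat, t_p = stats.ttest_ind(a, b)

# KL divergence between histogram estimates of the two densities.
edges = np.histogram_bin_edges(np.concatenate([a, b]), bins=30)
p, _ = np.histogram(a, bins=edges)
q, _ = np.histogram(b, bins=edges)
p, q = p / p.sum(), q / q.sum()
kld = rel_entr(p, q).sum()  # sum of p*log(p/q); infinite if q = 0 where p > 0

# Shannon entropy of the first sample's histogram.
h = stats.entropy(p)
```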
Particle Filter tracker and square-shape detection
TATTER (Two-sAmple TesT EstimatoR) is a tool for performing two-sample hypothesis tests.
Builds a corpus whose unit distribution approximately matches a given target distribution, using a greedy algorithm driven by the Kullback-Leibler divergence. Can be used for Text-To-Speech synthesis applications.
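A minimal sketch of such a greedy selection loop (the function name, smoothing scheme, and data layout are illustrative assumptions, not the repository's code):

```python
import math
from collections import Counter

def greedy_select(sentences, target_dist, k, smoothing=1e-6):
    """Greedily pick k sentences (each a list of units, e.g. phones) whose
    pooled unit distribution minimizes KL(target || selection)."""
    def kl(counts):
        total = sum(counts.values()) + smoothing * len(target_dist)
        return sum(p * math.log(p * total / (counts.get(u, 0) + smoothing))
                   for u, p in target_dist.items() if p > 0)

    selected, pool, counts = [], list(sentences), Counter()
    for _ in range(k):
        if not pool:
            break
        best = min(pool, key=lambda s: kl(counts + Counter(s)))
        pool.remove(best)
        selected.append(best)
        counts.update(best)
    return selected
```

Rescoring every remaining candidate at each step costs O(n·k) KL evaluations; incremental bookkeeping of the counts is the usual optimization.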
Giant Language Model Test Room, the most up-to-date version.
Mode Selection/Covering of GANs (TensorFlow 2)
A machine learning model that classifies workouts performed in videos and predicts an effectiveness metric for each workout.
Basic study of information-theoretic measures and stochastic processes.