This repository contains the code to reproduce all of the results in our paper: Making Learners (More) Monotone, T. J. Viering, A. Mey, M. Loog, IDA 2020.
A rewrite of "The Lack of A Priori Distinctions Between Learning Algorithms" (http://www.mitpressjournals.org/doi/abs/10.1162/neco.1996.8.7.1341)
A collection of some of my presentations
Code for the paper "Interpolation can hurt robust generalization even when there is no noise" available here: https://papers.nips.cc/paper/2021/hash/c4f2c88e16a579900657c18726641c81-Abstract.html
Code repository for: Nguyen, H., Nguyen, T., Nguyen, K., & Ho, N. (2024). Towards Convergence Rates for Parameter Estimation in Gaussian-gated Mixture of Experts. In Proceedings of the 27th International Conference on Artificial Intelligence and Statistics (AISTATS 2024). Acceptance rate: 27.6% of 1,980 submissions.
CSCI 5622 | CU Boulder | Spring 2018 | Chris Ketelsen
Official code for "k-experts: Online Policies and Fundamental Limits", AISTATS 2022
Rademacher complexity; an ERM algorithm for Boolean conjunction prediction; online learning
Codebase for "A Bias-Variance-Covariance Decomposition of Kernel Scores for Generative Models", published at ICML 2024.
Training ReLU networks to high uniform accuracy is intractable
A proof of concept C++ application for learning regular languages
Course Design Lab Repository
All code from my master's thesis in computational learning theory
Code for "PAC-Bayesian Contrastive Unsupervised Representation Learning" (UAI 2020)
Learning ReLU networks to high uniform accuracy is intractable (ICLR 2023)
Example solutions and code for the assignments of Machine Learning Foundations, Fall 2020, National Taiwan University
Scinis-learn is a package of non-OOP machine learning functions, developed from scratch by young Moroccan AI engineering students.
A Python implementation of the Neural Tangent Kernel (Jacot et al., 2018)
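For context, the empirical neural tangent kernel is the inner product of parameter gradients, Θ(x, x′) = ⟨∂f(x)/∂θ, ∂f(x′)/∂θ⟩. The sketch below (an illustrative assumption, not code from the repository above) computes it for a tiny one-hidden-layer ReLU network with hand-derived gradients:

```python
# Minimal empirical NTK sketch for f(x) = a · relu(W x).
# All names (W, a, grads, ntk) are illustrative, not from the listed repo.
import numpy as np

rng = np.random.default_rng(0)
d, m = 3, 64                                # input dim, hidden width
W = rng.normal(size=(m, d)) / np.sqrt(d)    # hidden-layer weights
a = rng.normal(size=m) / np.sqrt(m)         # output weights

def grads(x):
    """Gradient of f(x) w.r.t. all parameters (W and a), flattened."""
    pre = W @ x
    act = np.maximum(pre, 0.0)
    dW = np.outer(a * (pre > 0), x)         # df/dW_ij = a_i * 1[pre_i > 0] * x_j
    da = act                                # df/da_i = relu(W_i · x)
    return np.concatenate([dW.ravel(), da])

def ntk(x1, x2):
    """Empirical NTK: inner product of the two parameter gradients."""
    return grads(x1) @ grads(x2)

x, y = rng.normal(size=d), rng.normal(size=d)
print(ntk(x, y))
```

By construction the kernel is symmetric and positive semidefinite on the diagonal (`ntk(x, x)` is a sum of squares); a full implementation would vectorize `grads` over a batch to build the Gram matrix.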
Official implementation of "On-Demand Sampling: Learning Optimally from Multiple Distributions" (NeurIPS 2022)