Bandit algorithms
Yahoo! news article recommendation system using LinUCB
Python implementations of the UCB, EXP3, and epsilon-greedy algorithms (a minimal UCB1 sketch appears after this list)
Ad click-through rate optimization using Thompson sampling (see the Thompson sampling sketch after this list)
AI Reinforcement Learning in Python
DPE: code used in "Optimal Algorithms for Multiplayer Multi-Armed Bandits" (AISTATS 2020)
An illustrative project covering several multi-armed bandit and contextual bandit algorithms
Bandit and Evolutionary Algorithms using Python
Personal reimplementation of some ML algorithms for learning purposes
Python utilities to compute a lower bound of the expected sample complexity to identify the best arm in a bandit model
Python implementations of reinforcement learning algorithms: bandit algorithms, MDPs, dynamic programming (value/policy iteration), and model-free control (off-policy Monte Carlo, Q-learning)
An implementation of the reinforcement learning multi-armed bandit experiment using different exploration techniques.
This repository implements the most popular MAB and CMAB algorithms and shows how they behave when run; it is a good starting point for those beginning to learn these topics.
A benchmark for testing decision-making algorithms for contextual bandits. The library implements a variety of algorithms (many based on approximate Bayesian neural networks and Thompson sampling) and a number of real and synthetic data problems exhibiting a diverse set of properties.
Python library of bandits and RL agents in different real-world environments
A short implementation of bandit algorithms: ETC, UCB, MOSS, and KL-UCB
Implementation for NeurIPS 2020 paper "Locally Differentially Private (Contextual) Bandits Learning" (https://arxiv.org/abs/2006.00701)
This repo contains code for multi-armed bandit algorithm testing and local multiplayer competition.
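To give a sense of what the UCB implementations listed above typically look like, here is a minimal UCB1 sketch on simulated Bernoulli arms. It is an illustration only, not code from any repository above; the arm means, horizon, and seed are arbitrary assumptions.

```python
import math
import random

def ucb1(means, horizon=10000, seed=0):
    """Minimal UCB1 on Bernoulli arms (arm means are illustrative assumptions)."""
    rng = random.Random(seed)
    n_arms = len(means)
    counts = [0] * n_arms   # number of pulls per arm
    sums = [0.0] * n_arms   # cumulative reward per arm
    total_reward = 0.0
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1     # play each arm once to initialize its estimate
        else:
            # UCB1 index: empirical mean + sqrt(2 ln t / n_i)
            arm = max(range(n_arms),
                      key=lambda i: sums[i] / counts[i]
                      + math.sqrt(2 * math.log(t) / counts[i]))
        reward = 1.0 if rng.random() < means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
        total_reward += reward
    return total_reward

print(ucb1([0.2, 0.5, 0.7]))  # total reward over the horizon
```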
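Similarly, the ad click-through rate entry above uses Thompson sampling; a common way to set this up is Beta-Bernoulli posterior sampling, sketched below under assumed CTR values and horizon (again an illustration, not the repository's code).

```python
import random

def thompson_sampling(ctrs, horizon=10000, seed=0):
    """Beta-Bernoulli Thompson sampling for ad CTRs (values are illustrative assumptions)."""
    rng = random.Random(seed)
    n_arms = len(ctrs)
    alpha = [1] * n_arms  # Beta posterior: 1 + observed clicks
    beta = [1] * n_arms   # Beta posterior: 1 + observed non-clicks
    clicks = 0
    for _ in range(horizon):
        # Sample a plausible CTR for each ad from its posterior and show the best one
        samples = [rng.betavariate(alpha[i], beta[i]) for i in range(n_arms)]
        arm = samples.index(max(samples))
        if rng.random() < ctrs[arm]:
            alpha[arm] += 1
            clicks += 1
        else:
            beta[arm] += 1
    return clicks

print(thompson_sampling([0.03, 0.05, 0.08]))  # total clicks over the horizon
```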