Randomized Greedy Learning Under Full-bandit Feedback
🐯 A replica of "Auction-based combinatorial multi-armed bandit mechanisms with strategic arms"
PyXAB - A Python Library for X-Armed Bandit and Online Blackbox Optimization Algorithms
Pricing and advertising strategy for the e-commerce of an airline company, based on Multi-Armed Bandits (MABs) algorithms and Gaussian Processes. Simulations include non-stationary environments.
Multi-Objective Multi-Armed Bandit
Today I Learned - Reinforcement Learning
Reinforcement learning starter package for the multi-armed bandit problem
Non-stationary Bandits and Meta-Learning with a Small Set of Optimal Arms
Homework Code for UCLA STATS 115 (Probabilistic Decision Making) Fall 22 Offering
This repo contains code for multi-armed bandit algorithm testing and local multiplayer competition.
Implementation for NeurIPS 2020 paper "Locally Differentially Private (Contextual) Bandits Learning" (https://arxiv.org/abs/2006.00701)
Short implementations of bandit algorithms: ETC, UCB, MOSS, and KL-UCB
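Several entries above implement UCB-style algorithms. As a minimal sketch of the core idea (not taken from any repo listed here), UCB1 pulls each arm once, then repeatedly plays the arm with the highest empirical mean plus a confidence bonus. The reward function, arm means, and horizon below are hypothetical:

```python
import math
import random

def ucb1(reward_fn, n_arms, horizon, seed=0):
    """Minimal UCB1 sketch: after pulling each arm once, play the arm
    maximizing empirical_mean(a) + sqrt(2 * ln(t) / pulls(a))."""
    rng = random.Random(seed)
    counts = [0] * n_arms    # number of pulls per arm
    sums = [0.0] * n_arms    # cumulative reward per arm
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1      # initialization: play each arm once
        else:
            arm = max(
                range(n_arms),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2 * math.log(t) / counts[a]),
            )
        r = reward_fn(arm, rng)
        counts[arm] += 1
        sums[arm] += r
    return counts

# Bernoulli arms with hypothetical means; arm 2 is the best arm.
means = [0.2, 0.5, 0.8]
pulls = ucb1(lambda a, rng: 1.0 if rng.random() < means[a] else 0.0,
             n_arms=3, horizon=2000)
```

Over 2000 rounds the pull counts concentrate on the best arm, while the logarithmic bonus keeps the suboptimal arms from being abandoned too early.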
Python library of bandits and RL agents in different real-world environments
A benchmark for decision-making algorithms in contextual bandits. The library implements a variety of algorithms (many based on approximate Bayesian neural networks and Thompson sampling) and a number of real and synthetic data problems exhibiting a diverse set of properties.
This repository implements the most popular MAB and CMAB algorithms and lets you watch how they run. It is a useful starting point for anyone beginning to learn these topics.
An implementation of the reinforcement learning multi-armed bandit experiment using different exploration techniques.
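The simplest exploration technique compared in experiments like the one above is epsilon-greedy: with probability eps pick a random arm, otherwise exploit the best empirical mean. This sketch assumes Bernoulli rewards with hypothetical arm means; it is not code from the listed repository:

```python
import random

def epsilon_greedy(means, horizon=1000, eps=0.1, seed=0):
    """Epsilon-greedy sketch: with probability eps explore a random arm,
    otherwise exploit the arm with the highest empirical mean."""
    rng = random.Random(seed)
    n = len(means)
    counts = [0] * n
    values = [0.0] * n   # running empirical mean per arm
    total = 0.0
    for _ in range(horizon):
        if rng.random() < eps or 0 in counts:
            arm = rng.randrange(n)           # explore
        else:
            arm = values.index(max(values))  # exploit
        r = 1.0 if rng.random() < means[arm] else 0.0  # Bernoulli reward
        counts[arm] += 1
        values[arm] += (r - values[arm]) / counts[arm]  # incremental mean
        total += r
    return counts, total

counts, total = epsilon_greedy([0.3, 0.6, 0.9])
```

Unlike UCB, the exploration rate here is constant, so epsilon-greedy keeps sampling suboptimal arms at rate eps/n forever; annealing eps over time is the usual fix.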
Python implementation for Reinforcement Learning algorithms -- Bandit algorithms, MDP, Dynamic Programming (value/policy iteration), Model-free Control (off-policy Monte Carlo, Q-learning)
Python utilities to compute a lower bound of the expected sample complexity to identify the best arm in a bandit model