reward simulator for contextual bandits
Updated Dec 20, 2018 - Perl
Programming assignments of CS747 - Reinforcement Learning IIT-B
[Python] Four multi-armed bandit algorithms are implemented and compared to find which one most effectively identifies the website configuration that maximises signups.
This repository contains the code necessary for generating the figures presented in the paper titled "Cooperative Thresholded Lasso for Sparse Linear Bandit".
An introduction to multi-armed bandits
An A/B testing project built to determine whether a new version of a website's sign-up button performs better than the current one.
Multi-Stage-Multi-Armed Bandits (MAB) are a class of reinforcement learning problems where an agent tries to maximize its cumulative reward by sequentially selecting actions from multiple options (arms) and observing the rewards associated with those actions.
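The problem described above can be sketched with a minimal epsilon-greedy simulation on Bernoulli arms (an illustrative sketch only; the function and parameter names are my own, not taken from any repository listed here):

```python
import random

def epsilon_greedy(true_means, epsilon=0.1, steps=1000, seed=0):
    """Simulate epsilon-greedy action selection on Bernoulli arms.

    With probability `epsilon` a random arm is explored; otherwise the arm
    with the highest current reward estimate is exploited. Returns the
    per-arm reward estimates and pull counts after `steps` rounds.
    """
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms       # how many times each arm was pulled
    estimates = [0.0] * n_arms  # running mean reward per arm
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)                            # explore
        else:
            arm = max(range(n_arms), key=lambda a: estimates[a])   # exploit
        reward = 1.0 if rng.random() < true_means[arm] else 0.0    # Bernoulli draw
        counts[arm] += 1
        # incremental update of the sample mean
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates, counts
```

With enough rounds the agent concentrates its pulls on the arm with the highest true mean, which is the cumulative-reward-maximising behaviour the description refers to.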
Thompson Sampling equipped with Goodness of Fit test based active change-point detection in Non-Stationary Bandit environment
Batched Multi-armed Bandits Problem - a critical analysis. Artificial Intelligence course project studying a scientific paper and analysing its experimental results.
MBIT Big Data 2019-2020 Reinforcement Learning (DC-03 TP-01)
Sending personalized marketing offers (called free play in a casino setting) to players by observing data on their gaming behavior and demographic information
Real-time decision tool for A/B-testing based on multi-armed bandit algorithm
Our project for the "Data Intelligence Applications" exam at Politecnico di Milano. The project was about Social Influence and Pricing online learning techniques applied to networks.
Multi-armed bandit problem in Reinforcement learning
A Julia package to compute Gittins indices for multi-armed bandits
Applying anomaly detection methods on Multi-Armed Bandit problems
Analysis of various multi-armed bandit algorithms over normal and heavy-tailed distributions.
We show the performance of various algorithms in the semi-bandit setting and apply them to a real-world problem.