A hyperparameter optimization framework, inspired by Optuna.
🔬 Research framework for single-player and multi-player 🎰 multi-armed bandit (MAB) algorithms, implementing state-of-the-art algorithms for the single-player setting (UCB, KL-UCB, Thompson sampling...) and the multi-player setting (MusicalChair, MEGA, rhoRand, MCTopM/RandTopM, etc.). Available on PyPI: https://pypi.org/project/SMPyBandits/ and documentation on
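Several of the listed repositories implement the UCB family of index policies mentioned above. As a hedged illustration (not code from any of these repositories), a minimal UCB1 loop on a Bernoulli bandit looks like this; the function name and the Bernoulli reward model are assumptions for the sketch:

```python
import math
import random

def ucb1(true_means, steps=2000, seed=0):
    """Minimal UCB1 sketch on a Bernoulli bandit; returns pull counts per arm."""
    rng = random.Random(seed)
    n = len(true_means)
    counts = [0] * n     # number of pulls per arm
    values = [0.0] * n   # sample-average reward estimate per arm
    for t in range(1, steps + 1):
        if t <= n:
            arm = t - 1  # initialization: pull each arm once
        else:
            # choose the arm maximizing estimate + exploration bonus
            arm = max(range(n),
                      key=lambda a: values[a] + math.sqrt(2 * math.log(t) / counts[a]))
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        # incremental sample-average update
        values[arm] += (reward - values[arm]) / counts[arm]
    return counts
```

With a large gap between arms, UCB1 concentrates almost all pulls on the best arm while still sampling the others logarithmically often.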
Repository containing code for the course CS780: Deep Reinforcement Learning.
Some reinforcement learning algorithms implemented by me, following "Reinforcement Learning: An Introduction" by Richard Sutton and Andrew Barto.
Several multi-armed bandit strategies with an additional holding option for smoother exploration.
Randomized Greedy Learning Under Full-bandit Feedback
Repository for the research proposal of Du Junye.
🐯 Replica of "Auction-based combinatorial multi-armed bandit mechanisms with strategic arms".
Official repository for Reinforcement Learning Decoders used for intra-cortical brain machine interfaces - IEEE TNNLS 2023
Repository containing basic algorithms implemented in Python.
Official code for an ICML 2024 paper.
PyXAB - A Python Library for X-Armed Bandit and Online Blackbox Optimization Algorithms
Pricing and advertising strategy for the e-commerce of an airline company, based on Multi-Armed Bandits (MABs) algorithms and Gaussian Processes. Simulations include non-stationary environments.
Research on causality-based reinforcement learning. This repository includes all the needed fundamentals, a summary of past work, and some of the most recent developments.
This is a collection of interesting papers that I have read so far or want to read. Note that the list is not up-to-date. Topics: reinforcement learning, deep learning, mathematics, statistics, bandit algorithms, optimization.
Reinforcement Learning (COMP 579) Project
Multi-Objective Multi-Armed Bandit
Implementation of greedy, ε-greedy and softmax methods for n-armed bandit problem
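For the n-armed bandit methods named in the entry above, a minimal ε-greedy sketch (an illustration, not code from that repository; the Bernoulli reward model and function name are assumptions) looks like this:

```python
import random

def epsilon_greedy_bandit(true_means, epsilon=0.1, steps=5000, seed=0):
    """Minimal epsilon-greedy sketch on a Bernoulli n-armed bandit.

    Returns (estimated values, pull counts) per arm.
    """
    rng = random.Random(seed)
    n = len(true_means)
    counts = [0] * n     # number of pulls per arm
    values = [0.0] * n   # sample-average reward estimate per arm
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n)  # explore: pick a random arm
        else:
            arm = max(range(n), key=values.__getitem__)  # exploit: best estimate
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        # incremental sample-average update
        values[arm] += (reward - values[arm]) / counts[arm]
    return values, counts
```

The greedy method is the ε = 0 special case; the softmax variant replaces the argmax with sampling arms in proportion to exponentiated value estimates.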
A small collection of Bandit algorithms, written in Rust 🦀.