Code for Undergrad Final Year Project “Offline Risk-Averse Actor-Critic with Curriculum Learning”
Framework for offline reinforcement learning, with implementations of SCQL and SCQL+D.
Need 4 Speed, FYP 2023-24 @ Monash.
PyTorch Implementation of MOPO
Implementation of CORL for Fetch and Unitree A1 tasks
Codes accompanying the paper "On the Role of Discount Factor in Offline Reinforcement Learning" (ICML 2022)
Package for recording transitions in OpenAI Gym environments.
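The transition-recording pattern such a package wraps can be sketched as below; `ToyEnv` and `record_episode` are illustrative names standing in for any Gym-style environment and recorder, not the package's actual API:

```python
import random

class ToyEnv:
    """Hypothetical stand-in with the classic Gym reset/step interface."""
    def reset(self):
        self.t = 0
        return 0.0  # initial observation
    def step(self, action):
        self.t += 1
        obs, reward, done = float(self.t), 1.0, self.t >= 5
        return obs, reward, done, {}

def record_episode(env):
    """Roll out one episode, collecting (obs, action, reward, next_obs, done) tuples."""
    transitions = []
    obs, done = env.reset(), False
    while not done:
        action = random.choice([0, 1])  # random policy for illustration
        next_obs, reward, done, _ = env.step(action)
        transitions.append((obs, action, reward, next_obs, done))
        obs = next_obs
    return transitions

episode = record_episode(ToyEnv())
```

Tuples of this shape are the standard unit stored in offline-RL datasets and replay buffers.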
Author's repository for GSM8K-AI-SubQ reasoning dataset
Code to reproduce experiments from "User-Interactive Offline Reinforcement Learning" (ICLR 2023)
Direct port of TD3_BC to JAX using Haiku and optax.
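The objective TD3+BC adds to vanilla TD3 can be sketched in NumPy as follows (the JAX port above differentiates the same quantity); the function name is illustrative, and `alpha=2.5` is the default from the original TD3+BC paper (Fujimoto & Gu, 2021):

```python
import numpy as np

def td3_bc_actor_loss(q_values, pi_actions, data_actions, alpha=2.5):
    """TD3+BC actor loss: maximize lam * Q(s, pi(s)) minus a behavior-cloning MSE term,
    where lam = alpha / mean(|Q|) normalizes away the scale of the Q-values."""
    lam = alpha / np.mean(np.abs(q_values))
    bc_mse = np.mean((pi_actions - data_actions) ** 2)
    return -lam * np.mean(q_values) + bc_mse  # negated: minimized by gradient descent
```

The BC term keeps the learned policy close to the dataset actions, which is the one-line change that makes TD3 viable offline.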
Code for NeurIPS 2023 paper Accountability in Offline Reinforcement Learning: Explaining Decisions with a Corpus of Examples
PyTorch Implementation of Offline Reinforcement Learning algorithms
Code for Continuous Doubly Constrained Batch Reinforcement Learning, NeurIPS 2021.
Neural Laplace Control for Continuous-time Delayed Systems - an offline RL method combining Neural Laplace dynamics model and MPC planner to achieve near-expert policy performance in environments with irregular time intervals and an unknown constant delay.
Codes accompanying the paper "Offline Reinforcement Learning with Value-Based Episodic Memory" (ICLR 2022, https://arxiv.org/abs/2110.09796)
D2C(Data-driven Control Library) is a library for data-driven control based on reinforcement learning.
PyTorch implementation of state-of-the-art offline reinforcement learning algorithms.
The Official Code for Offline Model-based Adaptable Policy Learning (NeurIPS'21 & TPAMI)
The Medkit-Learn(ing) Environment: Medical Decision Modelling through Simulation (NeurIPS 2021) by Alex J. Chan, Ioana Bica, Alihan Huyuk, Daniel Jarrett, and Mihaela van der Schaar.
code for paper Query-Dependent Prompt Evaluation and Optimization with Offline Inverse Reinforcement Learning