Model-Based RL Multi-Tasking with ReLAx
Updated Aug 29, 2022 · Jupyter Notebook
PyTorch version of Dreamer, following the original TensorFlow v2 code.
Various reinforcement learning algorithms implemented on the Frozen Lake grid world.
Code release for "HarmonyDream: Task Harmonization Inside World Models" (ICML 2024), https://arxiv.org/abs/2310.00344
Official implementation of the L4DC 2023 paper "Learning Policy-Aware Models for Model-Based Reinforcement Learning via Transition Occupancy Matching"
Adaptable tools for building reinforcement learning and evolutionary computation algorithms.
Code for "Tackling Long-Horizon Tasks with Model-based Offline Reinforcement Learning"
Numerical Evidence for Sample Efficiency of Model-Based over Model-Free Reinforcement Learning Control of Partial Differential Equations [ECC'24]
Personal Deep Reinforcement Learning class notes
Example CEM implementation with ReLAx
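The cross-entropy method (CEM) named above is a simple derivative-free optimizer often used in model-based RL to search over action sequences. As a rough illustration of the idea (the function name and toy cost below are hypothetical, not part of ReLAx), a minimal NumPy sketch:

```python
import numpy as np

def cem_optimize(cost_fn, dim, iters=20, pop=100, elite_frac=0.1, seed=0):
    """Cross-entropy method: sample from a Gaussian, keep the elite
    (lowest-cost) samples, and refit the Gaussian to them each iteration."""
    rng = np.random.default_rng(seed)
    mean, std = np.zeros(dim), np.ones(dim)
    n_elite = max(1, int(pop * elite_frac))
    for _ in range(iters):
        samples = rng.normal(mean, std, size=(pop, dim))
        costs = np.array([cost_fn(s) for s in samples])
        elite = samples[np.argsort(costs)[:n_elite]]  # best n_elite samples
        mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-6
    return mean

# Toy cost: squared distance to the point (1, -2); CEM should recover it.
best = cem_optimize(lambda x: np.sum((x - np.array([1.0, -2.0])) ** 2), dim=2)
```

In model-based planning, `cost_fn` would instead roll a candidate action sequence through the learned dynamics model and return the predicted trajectory cost.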
This repository offers implementations of classic and deep reinforcement learning algorithms, including dynamic programming, Monte Carlo methods, TD-learning, and both Q-function-based and policy-gradient approaches with deep neural networks.
Simple world models lead to good abstractions; Google Cerebra internship 2020 / master's thesis at EPFL LCN 2021 ⬛◼️▪️🔦
Fun with Reinforcement Learning in my spare time
An "over-optimistic" effort to read and summarize a Deep Reinforcement Learning based paper a day 🤩 👊
Code for "Efficient Offline Policy Optimization with a Learned Model", ICLR 2023
Master's thesis on model-based intrinsically motivated reinforcement learning in robotic control
Example DYNA-Q implementation with ReLAx
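Dyna-Q, referenced above, interleaves direct Q-learning on real transitions with extra "planning" updates replayed from a learned model. As a rough tabular sketch (function names and the toy chain environment are illustrative, not ReLAx APIs):

```python
import random

def dyna_q(env_step, n_states, n_actions, episodes=100, planning_steps=10,
           alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    """Tabular Dyna-Q: update Q from each real transition, store it in a
    deterministic model, then replay random stored transitions for planning."""
    rng = random.Random(seed)
    Q = [[0.0] * n_actions for _ in range(n_states)]
    model = {}  # (s, a) -> (reward, next_state, done)
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            if rng.random() < eps:
                a = rng.randrange(n_actions)
            else:  # greedy with random tie-breaking
                a = max(range(n_actions), key=lambda i: (Q[s][i], rng.random()))
            r, s2, done = env_step(s, a)
            target = r + (0.0 if done else gamma * max(Q[s2]))
            Q[s][a] += alpha * (target - Q[s][a])          # direct RL update
            model[(s, a)] = (r, s2, done)
            for _ in range(planning_steps):                 # planning updates
                (ps, pa), (pr, ps2, pdone) = rng.choice(list(model.items()))
                ptarget = pr + (0.0 if pdone else gamma * max(Q[ps2]))
                Q[ps][pa] += alpha * (ptarget - Q[ps][pa])
            s = s2
    return Q

def chain_step(s, a):
    """Toy 4-state chain: action 1 moves right, 0 moves left; reward 1
    (and episode end) on reaching state 3."""
    s2 = min(s + 1, 3) if a == 1 else max(s - 1, 0)
    return (1.0 if s2 == 3 else 0.0), s2, s2 == 3

Q = dyna_q(chain_step, n_states=4, n_actions=2)
```

The planning loop is what makes this model-based: each real step is amplified into `planning_steps` additional updates from the remembered model.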
Master's thesis project
Example Random Shooting implementation with ReLAx
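Random shooting, named above, is the simplest model-based planner: sample many random action sequences, roll each through the dynamics model, and execute the first action of the cheapest rollout. A minimal sketch with a hypothetical 1-D toy model (the names below are illustrative, not ReLAx APIs):

```python
import numpy as np

def random_shooting(dynamics, cost, state, horizon=10, n_seq=500,
                    action_low=-1.0, action_high=1.0, seed=0):
    """Random-shooting MPC: sample action sequences uniformly, score each by
    rolling it through the (learned) dynamics model, and return the first
    action of the lowest-cost sequence."""
    rng = np.random.default_rng(seed)
    seqs = rng.uniform(action_low, action_high, size=(n_seq, horizon))
    totals = np.zeros(n_seq)
    for i, seq in enumerate(seqs):
        s = state
        for a in seq:
            s = dynamics(s, a)       # predicted next state
            totals[i] += cost(s, a)  # accumulated predicted cost
    return seqs[np.argmin(totals)][0]

# Toy 1-D integrator starting at state 3; cost penalizes distance from 0,
# so the chosen first action should push the state toward 0 (i.e. be negative).
a0 = random_shooting(lambda s, a: s + a, lambda s, a: s * s, state=3.0)
```

In practice this is rerun at every control step (receding-horizon control), and CEM is the usual drop-in upgrade when uniform sampling wastes too many rollouts.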
VQ-VAE-based image tokenizer for model-based RL