hcam-torch

Implementation of the paper Towards mental time travel: a hierarchical memory for reinforcement learning agents using CleanRL.
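The paper's core idea is a hierarchical memory read: store the past as fixed-size chunks, attend coarsely over chunk summaries to find relevant episodes, then attend in detail only within the top-scoring chunks. The following is a minimal NumPy sketch of that two-level attention, not this repo's actual PyTorch module; the names `hcam_read`, `chunk_size`, and `top_k` are illustrative, and mean-pooled summaries stand in for learned summary keys.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def hcam_read(query, memory, chunk_size=4, top_k=2):
    """Two-level (hierarchical) attention read over chunked memory.

    memory: (T, d) array of past memory vectors, grouped into chunks.
    1) coarse: score each chunk via its summary (here: mean of the chunk)
    2) fine: attend within only the top_k highest-scoring chunks,
       weighting each chunk's read by its coarse relevance.
    """
    T, d = memory.shape
    n_chunks = T // chunk_size
    chunks = memory[: n_chunks * chunk_size].reshape(n_chunks, chunk_size, d)
    summaries = chunks.mean(axis=1)                  # (n_chunks, d) chunk keys
    chunk_scores = summaries @ query / np.sqrt(d)    # coarse relevance scores
    top = np.argsort(chunk_scores)[-top_k:]          # most relevant chunks
    chunk_w = softmax(chunk_scores[top])             # normalized chunk weights
    out = np.zeros(d)
    for w, ci in zip(chunk_w, top):
        s = chunks[ci] @ query / np.sqrt(d)          # fine scores within chunk
        out += w * (softmax(s) @ chunks[ci])         # weighted in-chunk read
    return out

# Example: read from 16 memory slots of dimension 8
rng = np.random.default_rng(0)
memory = rng.normal(size=(16, 8))
query = rng.normal(size=8)
read = hcam_read(query, memory, chunk_size=4, top_k=2)  # -> (8,) vector
```

Because the fine attention touches only `top_k` chunks, the detailed read stays cheap even as the memory grows, which is what lets the agent "travel back" to distant events.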

Contributors

Euijin Jeong 💻
Ownfos 💻

This project follows the all-contributors specification.

Progress

Get Started

We tested on Python 3.10.

We recommend using Anaconda (or Miniconda) to run it in a virtual environment:

conda create -n hcam python=3.10 -y
conda activate hcam

Clone the repo:

git clone https://github.com/jinPrelude/hcam-torch.git
cd hcam-torch

Install Poetry and run poetry install, which installs all required dependencies:

pip install poetry && poetry install

Once the installation completes, try ballet_lstm_lang_only.py:

# --track enables logging to wandb.ai
python ballet_lstm_lang_only.py --track

Benchmark

LSTM Agent

Tested on an i9-11900K + RTX 3090:

Playing BalletEnv (2_delay16_easy)
Trained for ≈15M total frames in ≈3 hours
