Clean and flexible implementation of PPO (built on top of stable-baselines3)
An implementation of Proximal Policy Optimization using TensorFlow. Tested on the OpenAI Gym car racing environment.
Training PPO agents in OpenAI Gym and PyBullet environments.
Implement PPO to solve Crawler problem in Unity
An implementation of Proximal Policy Optimization, a state-of-the-art family of reinforcement learning algorithms, using normalized Generalized Advantage Estimation and optional batch-mode training. The loss function incorporates an entropy bonus.
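The two ingredients this description mentions, Generalized Advantage Estimation and batch-level normalization, can be sketched in a few lines. This is a minimal illustrative sketch in NumPy (not code from the listed repository); the function name and signature are hypothetical, and standard default values are assumed for `gamma` and `lam`.

```python
import numpy as np

def gae_advantages(rewards, values, last_value, gamma=0.99, lam=0.95):
    """Generalized Advantage Estimation over one rollout (hypothetical helper).

    rewards: shape (T,) rewards r_t
    values:  shape (T,) value estimates V(s_t)
    last_value: bootstrap estimate V(s_T) for the state after the rollout
    """
    T = len(rewards)
    values_ext = np.append(values, last_value)
    advantages = np.zeros(T)
    gae = 0.0
    # Backward recursion: A_t = delta_t + gamma * lam * A_{t+1}
    for t in reversed(range(T)):
        delta = rewards[t] + gamma * values_ext[t + 1] - values_ext[t]
        gae = delta + gamma * lam * gae
        advantages[t] = gae
    # Normalize across the batch to zero mean and unit variance,
    # as described (small epsilon avoids division by zero).
    return (advantages - advantages.mean()) / (advantages.std() + 1e-8)
```

The backward recursion computes the exponentially weighted sum of TD errors; the final normalization is what "normalized" refers to in the description.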
A pytorch project to easily run experiments on OpenAI's Procgen Benchmark
Modular Deep RL infrastructure in PyTorch
Training a PPO agent to balance a pendulum in a fully observable environment.
AI agent learns to walk, run, hop and crawl without any given data using proximal policy optimisation.
A repo with a MultiProcessing class for Gym Reinforcement Learning Environments
Best agents in a simplified environment called "Naive Bandori". Agents with and without audio inputs are both available.
SimplyPPO replicates Proximal Policy Optimization in roughly 250 lines of clean, readable PyTorch, using as few additional tricks and hyper-parameters as possible (PyBullet benchmarks included).
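The core of any minimal PPO implementation like this is the clipped surrogate objective with an entropy bonus. The following is a small NumPy sketch of that loss, not code from SimplyPPO itself; the function name and the coefficient values are illustrative assumptions.

```python
import numpy as np

def ppo_loss(new_logp, old_logp, advantages, entropy,
             clip_eps=0.2, ent_coef=0.01):
    """Clipped PPO surrogate loss with entropy bonus (illustrative sketch).

    new_logp, old_logp: per-sample log-probabilities of the taken actions
    advantages: per-sample (ideally normalized) advantage estimates
    entropy: per-sample policy entropy
    """
    # Probability ratio pi_new(a|s) / pi_old(a|s), computed in log space.
    ratio = np.exp(new_logp - old_logp)
    unclipped = ratio * advantages
    clipped = np.clip(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    # Pessimistic bound: take the minimum of the two surrogates, negate
    # because optimizers minimize. Entropy bonus encourages exploration.
    policy_loss = -np.minimum(unclipped, clipped).mean()
    return policy_loss - ent_coef * entropy.mean()
```

In a real PyTorch implementation the same expression would be written with tensors so gradients flow through `new_logp` and `entropy`; the arithmetic is identical.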
Implementation of PPO with TF 2.0 and Pyoneer.
The CAT Optimal Hybrid Solver is a tool designed to tackle the cross array task (CAT), an activity for assessing algorithmic thinking skills in K-12 education.
Training a Reinforcement Learning Agent to Play Flappy Bird.
Implementations of deep reinforcement learning algorithms.
JAX Implementation of Proximal Policy Optimisation Algorithm
Nabi Deep Reinforcement Learning with PPO