PyTorch version of Stable Baselines, reliable implementations of reinforcement learning algorithms.
A fork of OpenAI Baselines with implementations of reinforcement learning algorithms
High-quality single file implementation of Deep Reinforcement Learning algorithms with research-friendly features (PPO, DQN, C51, DDPG, TD3, SAC, PPG)
Self hosted FLOSS fitness/workout, nutrition and weight tracker written with Django
A standard API for single-agent reinforcement learning environments, with popular reference environments and related utilities (formerly Gym)
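The standard single-agent API mentioned above centers on `reset()` returning `(observation, info)` and `step(action)` returning `(observation, reward, terminated, truncated, info)`. Below is a minimal, self-contained sketch of that API shape using a hypothetical `CoinFlipEnv` (not from any listed repo), so the loop runs without installing Gymnasium itself:

```python
import random


class CoinFlipEnv:
    """Toy environment following the Gymnasium-style API shape:
    reset() -> (obs, info) and
    step(action) -> (obs, reward, terminated, truncated, info).
    Hypothetical example for illustration only."""

    def __init__(self, max_steps=10):
        self.max_steps = max_steps
        self.steps = 0

    def reset(self, seed=None):
        if seed is not None:
            random.seed(seed)
        self.steps = 0
        return 0, {}  # initial observation, empty info dict

    def step(self, action):
        self.steps += 1
        obs = random.randint(0, 1)              # next coin face
        reward = 1.0 if action == obs else 0.0  # reward for a correct guess
        terminated = False                      # no natural terminal state here
        truncated = self.steps >= self.max_steps  # time-limit cutoff
        return obs, reward, terminated, truncated, {}


# The standard interaction loop: reset once, step until the episode ends.
env = CoinFlipEnv()
obs, info = env.reset(seed=0)
total = 0.0
done = False
while not done:
    obs, reward, terminated, truncated, _ = env.step(1)  # always guess "1"
    total += reward
    done = terminated or truncated
```

Gymnasium-registered environments expose the same loop via `gym.make("CartPole-v1")`; the separate `terminated`/`truncated` flags distinguish a real terminal state from a time-limit cutoff.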
Simple and easily configurable grid world environments for reinforcement learning
A standard API for multi-agent reinforcement learning environments, with popular reference environments and related utilities
A training framework for Stable Baselines3 reinforcement learning agents, with hyperparameter optimization and pre-trained agents included.
A collection of 100+ pre-trained RL agents using Stable Baselines, training and hyperparameter optimization included.
Asynchronous Advantage Actor-Critic (A3C) algorithm for Super Mario Bros
Proximal Policy Optimization (PPO) algorithm for Super Mario Bros
Deepdrive is a simulator that allows anyone with a PC to push the state of the art in self-driving
C++-based high-performance parallel environment execution engine (vectorized env) for general RL environments.
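The idea behind a vectorized env is to batch many environment copies behind a single step call. Below is a naive synchronous Python sketch of that interface (the `CountEnv` and `SyncVectorEnv` names are hypothetical); engines like EnvPool implement the same contract with parallel C++ workers for throughput:

```python
class CountEnv:
    """Hypothetical toy env: observation counts steps, truncates after 3."""

    def __init__(self):
        self.t = 0

    def reset(self, seed=None):
        self.t = 0
        return self.t, {}

    def step(self, action):
        self.t += 1
        return self.t, 1.0, False, self.t >= 3, {}


class SyncVectorEnv:
    """Naive synchronous vectorized wrapper: steps N env copies in a plain
    Python loop. A real engine runs the copies in parallel native threads,
    but the batched step interface looks the same to the agent."""

    def __init__(self, env_fns):
        self.envs = [fn() for fn in env_fns]

    def reset(self, seed=None):
        results = [env.reset(seed=seed) for env in self.envs]
        return [obs for obs, _ in results], {}

    def step(self, actions):
        out = []
        for env, action in zip(self.envs, actions):
            obs, rew, term, trunc, info = env.step(action)
            if term or trunc:
                obs, info = env.reset()  # auto-reset, as vector APIs commonly do
            out.append((obs, rew, term, trunc, info))
        obs, rews, terms, truncs, infos = zip(*out)
        return list(obs), list(rews), list(terms), list(truncs), list(infos)


# One batched call advances all four environments at once.
vec = SyncVectorEnv([CountEnv for _ in range(4)])
obs, _ = vec.reset()
obs, rews, terms, truncs, infos = vec.step([0, 0, 0, 0])
```

Batching like this lets the agent compute actions for all copies in one forward pass, which is where most of the throughput gain comes from.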
Source code for the book "Reinforcement Learning: Theory and Python Implementation"
High-quality single-file implementations of SOTA Offline RL algorithms: AWAC, BC, CQL, DT, EDAC, IQL, SAC-N, TD3+BC, LB-SAC
S-RL Toolbox: Reinforcement Learning (RL) and State Representation Learning (SRL) for Robotics
Simple A3C implementation with pytorch + multiprocessing