Dopamine is a research framework for fast prototyping of reinforcement learning algorithms.
An elegant PyTorch deep reinforcement learning library.
ELF: a platform for game research, with an AlphaGoZero/AlphaZero reimplementation
An implementation of the AlphaZero algorithm for Gomoku (also called Gobang or Five in a Row)
A modular, primitive-first, Python-first PyTorch library for Reinforcement Learning.
Reinforcement Learning Coach by Intel AI Lab enables easy experimentation with state-of-the-art Reinforcement Learning algorithms.
A training framework for Stable Baselines3 reinforcement learning agents, with hyperparameter optimization and pre-trained agents included.
[ICML 2017] TensorFlow code for Curiosity-driven Exploration for Deep Reinforcement Learning
A collection of 100+ pre-trained RL agents using Stable Baselines, training and hyperparameter optimization included.
Refer to https://github.com/AcutronicRobotics/gym-gazebo2 for the new version.
Python library for Reinforcement Learning.
SEED RL: Scalable and Efficient Deep-RL with Accelerated Central Inference. Implements IMPALA and R2D2 algorithms in TF2 with SEED's architecture.
Implementation of papers in 100 lines of code.
[NeurIPS'21 Outstanding Paper] Library for reliable evaluation on RL and ML benchmarks, even with only a handful of seeds.
Hearthstone simulator using C++ with some reinforcement learning
A curated list of Monte Carlo tree search papers with implementations.
Stable-Baselines tutorial for Journées Nationales de la Recherche en Robotique 2019
A curated list of applied machine learning and data science notebooks and libraries across different industries.
Implementation of Inverse Reinforcement Learning (IRL) algorithms in Python/TensorFlow: Deep MaxEnt, MaxEnt, LPIRL.