
ray

Here are 106 public repositories matching this topic...

edoakes
edoakes commented Sep 10, 2020

I often run into issues by accidentally starting a new cluster when one is already running. Then, this later causes problems when I try to connect and there are two running clusters. I'm then forced to ray stop both clusters and ray start my new one again.

My workflow would be improved if I just got an error when trying to start the second cluster and knew to immediately tear down the existing one first.
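The guard described above can be sketched in Python. This is a hypothetical helper, not part of Ray's CLI: it probes for a running cluster with the real `ray status` command (which exits non-zero when no cluster is reachable) and refuses to proceed if one is found. The function name and injectable `run` parameter are illustrative assumptions.

```python
import subprocess
import sys


def ensure_no_running_cluster(run=subprocess.run):
    """Exit early if `ray status` reports a cluster already running.

    `run` is injectable purely so the guard can be exercised without a
    live Ray installation; by default it shells out to the real CLI.
    """
    probe = run(["ray", "status"], capture_output=True)
    if probe.returncode == 0:  # `ray status` succeeds only when a cluster is up
        sys.exit("A Ray cluster is already running; run `ray stop` first.")


# Call this before invoking `ray start --head` in a launch script.
```

Calling the guard at the top of a launch script turns the silent second-cluster start into an immediate, explicit error.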

A custom MARL (multi-agent reinforcement learning) environment where multiple agents trade against one another (self-play) in a zero-sum continuous double auction. Ray RLlib is used for training.

  • Updated Jul 22, 2020
  • Jupyter Notebook
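The zero-sum structure of the environment above can be sketched in a few lines. This is a toy illustration, not the repository's actual code: the two-agent setup, the midpoint execution price, and the fixed fair value are all illustrative assumptions. It shows why a double auction between a buyer and a seller is zero-sum: any trade transfers value from one side to the other.

```python
class ToyDoubleAuction:
    """Toy two-agent double auction: one buyer, one seller, one unit per step.

    One agent's gain is exactly the other's loss, so rewards sum to zero.
    """

    def __init__(self, fair_value=100.0):
        self.fair_value = fair_value  # assumed reference price of the asset

    def step(self, bid, ask):
        """bid: buyer's limit price; ask: seller's limit price.

        Returns (buyer_reward, seller_reward), which always sum to zero.
        """
        if bid >= ask:                 # orders cross, so a trade executes
            price = (bid + ask) / 2.0  # midpoint execution price (assumption)
            buyer_reward = self.fair_value - price
            seller_reward = price - self.fair_value
        else:                          # no cross, no trade, no payoff
            buyer_reward = seller_reward = 0.0
        return buyer_reward, seller_reward
```

In self-play training, both sides of this payoff would be driven by policies learning against each other; the zero-sum property guarantees the rewards always cancel.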
