
SMPyBandits

Open-Source Python package for Single- and Multi-Players multi-armed Bandits algorithms

Popular repositories

  1. 🔬 Research Framework for Single- and Multi-Player 🎰 Multi-Armed Bandits (MAB) Algorithms, implementing all the state-of-the-art algorithms for single-player (UCB, KL-UCB, Thompson...) and multi-play…

    Jupyter Notebook · 286 stars · 48 forks

  2. Using the Airspeed Velocity tool (https://asv.readthedocs.io/) to benchmark SMPyBandits (https://github.com/SMPyBandits/SMPyBandits/)

    Python · 3 stars

  3. Write-only repository that hosts the documentation for "Open-Source Python package for Single- and Multi-Players multi-armed Bandits algorithms" (SMPyBandits).

    HTML · 1 star
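The UCB algorithm named in the first repository's description can be illustrated with a short sketch. This is not SMPyBandits' own implementation or API; the function `ucb1` and the `pull` callback are hypothetical names, assuming Bernoulli rewards, written as a minimal standalone example of the UCB1 index policy (empirical mean plus an exploration bonus).

```python
import numpy as np

def ucb1(pull, n_arms, horizon):
    """Minimal UCB1 sketch (not the SMPyBandits API): play each arm
    once, then repeatedly pick the arm maximizing
    empirical mean + sqrt(2 * log(t) / pulls)."""
    counts = np.zeros(n_arms)  # number of pulls per arm
    sums = np.zeros(n_arms)    # cumulative reward per arm
    for t in range(horizon):
        if t < n_arms:
            arm = t  # initialization: try every arm once
        else:
            indices = sums / counts + np.sqrt(2.0 * np.log(t) / counts)
            arm = int(np.argmax(indices))
        reward = pull(arm)
        counts[arm] += 1
        sums[arm] += reward
    return counts, sums

# Toy Bernoulli bandit with three arms of means 0.2, 0.5, 0.8.
means = [0.2, 0.5, 0.8]
rng = np.random.default_rng(0)
counts, sums = ucb1(lambda a: float(rng.random() < means[a]), 3, 2000)
```

On this toy instance the policy concentrates its pulls on the best arm (mean 0.8) while still occasionally exploring the others, which is the behavior the single-player algorithms listed above (UCB, KL-UCB, Thompson sampling) all share in different forms.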
