ManooshSamiei / multi-armed-bandit

Implementation of the Boltzmann exploration, UCB, and Thompson sampling algorithms for a 10-armed bandit.
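
The notebook itself is not reproduced here. As a rough illustration of the three methods named above, a minimal sketch for a 10-armed Gaussian bandit follows. The reward distributions, the temperature `tau`, the exploration constant `c`, and the Gaussian prior used for Thompson sampling are assumptions for illustration and may differ from the repository's actual choices.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 10                                        # number of arms
T = 10_000                                    # number of pulls per run
true_means = rng.normal(0.0, 1.0, size=K)     # hidden arm means; rewards are unit-variance Gaussian


def pull(arm):
    """Sample a reward for the chosen arm."""
    return rng.normal(true_means[arm], 1.0)


def boltzmann(tau=0.1):
    """Boltzmann (softmax) exploration: choose arms with probability proportional to exp(Q / tau)."""
    Q = np.zeros(K)          # estimated mean reward per arm
    N = np.zeros(K)          # pull counts
    total = 0.0
    for _ in range(T):
        prefs = Q / tau
        probs = np.exp(prefs - prefs.max())    # subtract max for numerical stability
        probs /= probs.sum()
        arm = rng.choice(K, p=probs)
        r = pull(arm)
        N[arm] += 1
        Q[arm] += (r - Q[arm]) / N[arm]        # incremental sample-mean update
        total += r
    return total / T


def ucb1(c=2.0):
    """UCB1: play the arm maximizing Q + sqrt(c * ln(t) / N)."""
    Q = np.zeros(K)
    N = np.zeros(K)
    total = 0.0
    for t in range(1, T + 1):
        if t <= K:                             # play every arm once before using the bonus
            arm = t - 1
        else:
            bonus = np.sqrt(c * np.log(t) / N)
            arm = int(np.argmax(Q + bonus))
        r = pull(arm)
        N[arm] += 1
        Q[arm] += (r - Q[arm]) / N[arm]
        total += r
    return total / T


def thompson():
    """Thompson sampling with a standard-normal prior over each arm's mean (unit-variance rewards)."""
    reward_sum = np.zeros(K)
    n = np.zeros(K)
    total = 0.0
    for _ in range(T):
        post_mean = reward_sum / (n + 1.0)     # conjugate Gaussian posterior mean under N(0, 1) prior
        post_std = 1.0 / np.sqrt(n + 1.0)      # conjugate Gaussian posterior std
        samples = rng.normal(post_mean, post_std)
        arm = int(np.argmax(samples))          # act greedily with respect to the posterior sample
        r = pull(arm)
        n[arm] += 1
        reward_sum[arm] += r
        total += r
    return total / T


if __name__ == "__main__":
    print(f"best possible mean reward : {true_means.max():.3f}")
    print(f"Boltzmann exploration     : {boltzmann():.3f}")
    print(f"UCB1                      : {ucb1():.3f}")
    print(f"Thompson sampling         : {thompson():.3f}")
```

All three sketches share the same 10-armed testbed and report average reward per pull, so their exploration strategies can be compared directly; only the arm-selection rule differs between them.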




Languages

Language: Jupyter Notebook 100.0%