merritts / mab

Multi-armed bandit and Bayesian probability matching
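
Below is a minimal sketch of Bayesian probability matching (Thompson sampling) for a Bernoulli multi-armed bandit, the technique named in the description. The class name, method names, and arm success rates are illustrative assumptions, not this repository's actual API.

```python
# Illustrative sketch of Bayesian probability matching (Thompson sampling)
# for a Bernoulli multi-armed bandit. Names and structure are assumptions
# for demonstration only, not the repository's API.
import numpy as np


class ThompsonSamplingBandit:
    """Beta-Bernoulli bandit that selects arms by probability matching."""

    def __init__(self, n_arms, seed=None):
        self.rng = np.random.default_rng(seed)
        # Beta(1, 1) priors: alpha counts successes, beta counts failures.
        self.alpha = np.ones(n_arms)
        self.beta = np.ones(n_arms)

    def select_arm(self):
        # Draw one sample from each arm's posterior and play the argmax,
        # so each arm is chosen with the probability that it is the best.
        samples = self.rng.beta(self.alpha, self.beta)
        return int(np.argmax(samples))

    def update(self, arm, reward):
        # A binary reward updates the chosen arm's Beta posterior.
        self.alpha[arm] += reward
        self.beta[arm] += 1 - reward


if __name__ == "__main__":
    true_rates = [0.2, 0.5, 0.7]  # hypothetical arm success rates
    bandit = ThompsonSamplingBandit(n_arms=len(true_rates), seed=0)
    env_rng = np.random.default_rng(1)
    for _ in range(2000):
        arm = bandit.select_arm()
        reward = int(env_rng.random() < true_rates[arm])
        bandit.update(arm, reward)
    print("posterior means:", bandit.alpha / (bandit.alpha + bandit.beta))
```

After enough rounds the posterior mean of the best arm concentrates near its true success rate, and play shifts almost entirely to that arm.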
