This repository provides a flexible framework for experimenting with multi-armed bandit algorithms. It includes implementations of classic algorithms such as ε-greedy, UCB1, and ...
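To make the idea concrete, here is a minimal sketch of the ε-greedy strategy mentioned above: with probability ε the agent picks a random arm (exploration), otherwise it picks the arm with the highest empirical mean reward (exploitation). The class and parameter names (`EpsilonGreedyAgent`, `n_arms`, `epsilon`) are illustrative assumptions, not this repository's actual API.

```python
import random


class EpsilonGreedyAgent:
    """Illustrative epsilon-greedy agent (not the repository's actual API).

    With probability epsilon, explore a uniformly random arm; otherwise
    exploit the arm with the highest empirical mean reward so far.
    """

    def __init__(self, n_arms: int, epsilon: float = 0.1):
        self.epsilon = epsilon
        self.counts = [0] * n_arms      # number of pulls per arm
        self.values = [0.0] * n_arms    # empirical mean reward per arm

    def select_arm(self) -> int:
        if random.random() < self.epsilon:
            return random.randrange(len(self.counts))  # explore
        # Exploit: arm with the highest current estimate
        return max(range(len(self.values)), key=self.values.__getitem__)

    def update(self, arm: int, reward: float) -> None:
        # Incremental mean update: v_n = v_{n-1} + (r - v_{n-1}) / n
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]


if __name__ == "__main__":
    # Toy simulation on Bernoulli arms with hidden success probabilities.
    true_means = [0.2, 0.5, 0.8]
    agent = EpsilonGreedyAgent(n_arms=len(true_means), epsilon=0.1)
    for _ in range(10_000):
        arm = agent.select_arm()
        reward = 1.0 if random.random() < true_means[arm] else 0.0
        agent.update(arm, reward)
    print("estimated means:", [round(v, 3) for v in agent.values])
```

After enough pulls, the empirical estimates converge toward the hidden means and the agent pulls the best arm most of the time, while the ε fraction of random pulls keeps refining the estimates of the other arms.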