ε-first policies for budget-limited multi-armed bandits

Title: ε-first policies for budget-limited multi-armed bandits
Author(s): Tran-Thanh, L
Chapman, A
De Cote, EM
Rogers, A
Jennings, NR
Item Type: Conference Paper
Abstract: We introduce the budget-limited multi-armed bandit (MAB), which captures situations where a learner's actions are costly and constrained by a fixed budget that is incommensurable with the rewards earned from the bandit machine, and then describe a first algorithm for solving it. Since the learner has a budget, the problem's duration is finite. Consequently, an optimal exploitation policy is not to pull the optimal arm repeatedly, but to pull the combination of arms that maximises the agent's total reward within the budget. As such, the rewards for all arms must be estimated, because any of them may appear in the optimal combination. This difference from existing MABs means that new approaches to maximising the total reward are required. To this end, we propose an ε-first algorithm, in which the first ε of the budget is used solely to learn the arms' rewards (exploration), while the remaining 1 - ε is used to maximise the received reward based on those estimates (exploitation). We derive bounds on the algorithm's loss for generic and uniform exploration methods, and compare its performance with traditional MAB algorithms under various distributions of rewards and costs, showing that it outperforms the others by up to 50%. Copyright © 2010, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
Publication Date: 1-Nov-2010
URI: http://hdl.handle.net/10044/1/36591
ISBN: 9781577354659
Start Page: 1211
End Page: 1216
Journal / Book Title: Proceedings of the National Conference on Artificial Intelligence
Volume: 2
Publication Status: Published
Open Access location: http://eprints.soton.ac.uk/id/eprint/270806
Appears in Collections: Faculty of Engineering
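
For orientation, the following is a minimal Python sketch of the ε-first policy the abstract describes: the first ε share of the budget is spent pulling arms uniformly to estimate their mean rewards, and the remaining 1 - ε share is spent on the estimated-best combination of arms. The arm interface, the function names, and the reward-per-cost greedy used for the exploitation step are illustrative assumptions, not the paper's own code; the paper bounds the loss of this family of policies rather than prescribing this exact implementation.

    import random


    def epsilon_first(arms, budget, epsilon):
        """Sketch of an epsilon-first policy for a budget-limited MAB.

        `arms` is a list of (cost, pull) pairs, where `cost` is the fixed,
        positive price of one pull and `pull()` draws one stochastic reward.
        This interface is an illustrative assumption, not the paper's code.
        """
        n = len(arms)
        explore_budget = epsilon * budget
        spent = 0.0
        total_reward = 0.0
        counts = [0] * n
        means = [0.0] * n

        # Exploration: sweep the arms uniformly (round-robin) until the next
        # pull would exceed the epsilon share of the budget, keeping a
        # running mean-reward estimate for every arm.
        k = 0
        while spent + arms[k][0] <= explore_budget:
            cost, pull = arms[k]
            r = pull()
            spent += cost
            total_reward += r
            counts[k] += 1
            means[k] += (r - means[k]) / counts[k]
            k = (k + 1) % n

        # Exploitation: spend the remaining budget greedily on the arms with
        # the best estimated reward-per-cost ratio, a density-ordered
        # approximation to the unbounded knapsack implied by "the combination
        # of arms that maximises the total reward within the budget".
        remaining = budget - spent
        for k in sorted(range(n), key=lambda i: means[i] / arms[i][0],
                        reverse=True):
            cost, pull = arms[k]
            while remaining >= cost:
                total_reward += pull()
                remaining -= cost
        return total_reward


    # Toy usage: two arms with different costs and Beta-distributed rewards.
    arms = [(1.0, lambda: random.betavariate(2, 5)),
            (2.5, lambda: random.betavariate(5, 2))]
    print(epsilon_first(arms, budget=100.0, epsilon=0.1))

The greedy exploitation step is one common heuristic for the knapsack structure that costly arms induce; the abstract's loss bounds are stated for generic and uniform exploration methods rather than for this particular exploitation rule.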


