Econometrica

Journal Of The Econometric Society

An International Society for the Advancement of Economic
Theory in its Relation to Statistics and Mathematics

Edited by: Marina Halac • Print ISSN: 0012-9682 • Online ISSN: 1468-0262

Econometrica: May, 2025, Volume 93, Issue 3

Risk and Optimal Policies in Bandit Experiments

https://doi.org/10.3982/ECTA21075
pp. 1003–1029

Karun Adusumilli

We provide a decision‐theoretic analysis of bandit experiments under local asymptotics. Working within the framework of diffusion processes, we define suitable notions of asymptotic Bayes and minimax risk for these experiments. For normally distributed rewards, the minimal Bayes risk can be characterized as the solution to a second‐order partial differential equation (PDE). Using a limit of experiments approach, we show that this PDE characterization also holds asymptotically under both parametric and non‐parametric distributions of the rewards. The approach further identifies the state variables to which it is asymptotically sufficient to restrict attention, thereby suggesting a practical strategy for dimension reduction. The PDEs characterizing minimal Bayes risk can be solved efficiently using sparse matrix routines or Monte Carlo methods. We derive the optimal Bayes and minimax policies from their numerical solutions. These optimal policies substantially dominate existing methods such as Thompson sampling; the risk of the latter is often twice as high.
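For context, the Thompson sampling baseline against which the abstract compares the optimal policies can be sketched as follows. This is not the paper's PDE-based method, just a minimal, self-contained illustration of Gaussian Thompson sampling for a multi-armed bandit with known reward variance and a standard normal prior on each arm's mean; all function and variable names here are illustrative assumptions, not from the paper.

```python
import numpy as np

def thompson_sampling(means, sigma=1.0, horizon=1000, seed=0):
    """Gaussian Thompson sampling with a N(0, 1) prior on each arm's mean.

    `means` are the true arm means (unknown to the algorithm); each pull of
    arm a yields a N(means[a], sigma**2) reward. Returns the cumulative
    regret relative to always pulling the best arm.
    """
    rng = np.random.default_rng(seed)
    k = len(means)
    post_mean = np.zeros(k)        # posterior means, starting at the prior
    post_prec = np.ones(k)         # posterior precisions (1 / variance)
    obs_prec = 1.0 / sigma**2      # precision of a single reward draw
    best = max(means)
    regret = 0.0
    for _ in range(horizon):
        # Sample once from each arm's posterior and pull the argmax arm.
        draws = rng.normal(post_mean, 1.0 / np.sqrt(post_prec))
        arm = int(np.argmax(draws))
        reward = rng.normal(means[arm], sigma)
        # Conjugate normal-normal update of the chosen arm's posterior.
        post_prec[arm] += obs_prec
        post_mean[arm] += (reward - post_mean[arm]) * obs_prec / post_prec[arm]
        regret += best - means[arm]
    return regret
```

Because the posterior is sampled rather than maximized, the policy keeps exploring arms whose posteriors still overlap, which is what gives Thompson sampling its logarithmic-in-horizon regret in this Gaussian setting.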



Supplemental Material

Supplement to "Risk and Optimal Policies in Bandit Experiments"

Karun Adusumilli

This supplement contains material not found within the manuscript.

The replication package for this paper is available at https://doi.org/10.5281/zenodo.13776876. The Journal checked the data and codes included in the package for their ability to reproduce the results in the paper, and approved the online appendices.
