Dan Roy: Admissibility is Bayes Optimality with Infinitesimals

Abstract

We give an exact characterization of admissibility in statistical decision problems in terms of Bayes optimality in a so-called nonstandard extension of the original decision problem, as introduced by Duanmu and Roy. Unlike the consideration of improper priors or other generalized notions of Bayes optimality, the nonstandard extension is distinguished, in part, by having priors that can assign "infinitesimal" mass in a sense that can be made rigorous using results from nonstandard analysis. With these additional priors, we find that, informally speaking, a decision procedure δ0 is admissible in the original statistical decision problem if and only if, in the nonstandard extension of the problem, the nonstandard extension of δ0 is Bayes optimal among the (extensions of) standard decision procedures with respect to a nonstandard prior that assigns at least infinitesimal mass to every standard parameter value. We use this theorem to give three further characterizations of admissibility: one related to Blyth's method; one related to a condition due to Stein, which characterizes admissibility under some regularity assumptions; and one using finitely additive priors in decision problems meeting certain regularity requirements. Our results imply that Blyth's method is a sound and complete method for establishing admissibility. Buoyed by this result, we revisit the univariate two-sample common-mean problem and show that the Graybill--Deal estimator is admissible among a certain class of unbiased decision procedures. Joint work with Haosui Duanmu (HIT) and David Schrittesser (Toronto).
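For readers less familiar with the decision-theoretic vocabulary, the sketch below recalls the standard definitions the abstract relies on and restates the main characterization informally. The notation (parameter space Θ, risk R, Bayes risk r) is the conventional one and is an assumption of this sketch, not taken from the talk itself.

\[
  R(\theta,\delta) \;=\; \mathbb{E}_{\theta}\bigl[L(\theta,\delta(X))\bigr],
  \qquad \theta \in \Theta,
\]
% \delta_0 is admissible if no procedure \delta dominates it, i.e. there is no \delta with
\[
  R(\theta,\delta) \le R(\theta,\delta_0)\ \text{for all }\theta\in\Theta
  \quad\text{and}\quad
  R(\theta,\delta) < R(\theta,\delta_0)\ \text{for some }\theta.
\]
% \delta_0 is Bayes optimal with respect to a prior \pi if it minimizes the Bayes risk
\[
  r(\pi,\delta_0) \;=\; \int_{\Theta} R(\theta,\delta_0)\,\pi(\mathrm{d}\theta)
  \;=\; \inf_{\delta}\, r(\pi,\delta).
\]
% Informal statement of the characterization described in the abstract: \delta_0 is admissible
% in the original problem if and only if, in the nonstandard extension, {}^{*}\delta_0 is Bayes
% optimal among the extensions of standard procedures with respect to a nonstandard prior \pi
% satisfying \pi(\{\theta\}) > 0 (possibly infinitesimal) for every standard \theta \in \Theta.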

Bio

Marc Deisenroth
Google DeepMind Chair of Machine Learning and Artificial Intelligence