Article
Resampling-Based Information Criteria for Best-Subset Regression
Annals of the Institute of Statistical Mathematics (2012)
  • Philip T. Reiss, New York University
  • Lei Huang, Columbia University
  • Joseph E. Cavanaugh, University of Iowa
  • Amy Krain Roy, Fordham University
Abstract

When a linear model is chosen by searching for the best subset among a set of candidate predictors, a fixed penalty such as that imposed by the Akaike information criterion may penalize model complexity inadequately, leading to biased model selection. We study resampling-based information criteria that aim to overcome this problem through improved estimation of the effective model dimension. The first proposed approach builds upon previous work on bootstrap-based model selection; the second, more novel approach is based on cross-validation. Simulations and analyses of a functional neuroimaging data set illustrate the strong performance of our resampling-based methods, which are implemented in a new R package.
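The abstract's central point, that the nominal subset size underpenalizes complexity once a best-subset search has been carried out, and that resampling can supply a better estimate of the effective model dimension, can be sketched concretely. The R code below is a minimal illustration of that general idea only: a parametric-bootstrap covariance penalty in the spirit of generalized degrees of freedom, not the paper's actual criteria or its R package. The function name best_subset_mu, the number of bootstrap draws B, and the simulated data are illustrative assumptions.

    ## Sketch (not the authors' exact method): estimate the effective degrees of
    ## freedom of best-subset selection by a parametric-bootstrap covariance
    ## penalty, then use it in place of the nominal subset size in an AIC-type rule.
    set.seed(1)
    n <- 100; p <- 6
    X <- matrix(rnorm(n * p), n, p)
    y <- drop(X %*% c(1, -1, rep(0, p - 2))) + rnorm(n)

    ## Fitted values of the model chosen by exhaustive best-subset search,
    ## with the winner picked by the naive AIC penalty 2 * (k + 1).
    best_subset_mu <- function(X, y) {
      n <- length(y); p <- ncol(X)
      rss0 <- sum((y - mean(y))^2)                 # intercept-only model
      best_aic <- n * log(rss0 / n) + 2
      best_mu <- rep(mean(y), n)
      for (k in 1:p) {
        for (vars in combn(p, k, simplify = FALSE)) {
          fit <- lm.fit(cbind(1, X[, vars, drop = FALSE]), y)
          rss <- sum(fit$residuals^2)
          aic <- n * log(rss / n) + 2 * (k + 1)
          if (aic < best_aic) { best_aic <- aic; best_mu <- y - fit$residuals }
        }
      }
      best_mu
    }

    ## Parametric-bootstrap estimate of the effective dimension:
    ## df = sum_i cov(muhat_i(y*), y*_i) / sigma^2, with the whole selection
    ## step repeated inside every bootstrap replication.
    full_fit <- lm.fit(cbind(1, X), y)
    sigma2 <- sum(full_fit$residuals^2) / (n - p - 1)
    mu0 <- y - full_fit$residuals
    B <- 200
    ystar <- replicate(B, mu0 + rnorm(n, sd = sqrt(sigma2)))       # n x B
    mustar <- apply(ystar, 2, function(yb) best_subset_mu(X, yb))  # n x B
    df_eff <- sum(sapply(1:n, function(i) cov(mustar[i, ], ystar[i, ]))) / sigma2

    ## AIC-style criterion with the resampling-based penalty.
    mu_sel <- best_subset_mu(X, y)
    crit <- n * log(sum((y - mu_sel)^2) / n) + 2 * df_eff
    c(effective_df = df_eff, criterion = crit)

Because the selection step is rerun on every bootstrap sample, df_eff typically exceeds the selected subset size plus one, which is exactly the underpenalization the abstract refers to.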

Keywords
  • Adaptive model selection
  • Covariance inflation criterion
  • Extended information criterion
  • Functional connectivity
  • Pluralistic model selection
  • Subsampling
Publication Date
2012
Publisher Statement
Official journal site: http://www.ism.ac.jp/editsec/aism-e.html
Citation Information
Philip T. Reiss, Lei Huang, Joseph E. Cavanaugh, and Amy Krain Roy. "Resampling-Based Information Criteria for Best-Subset Regression." Annals of the Institute of Statistical Mathematics, Vol. 64, Iss. 6 (2012).
Available at: http://works.bepress.com/phil_reiss/17/