Article
Penalized nonparametric scalar-on-function regression via principal coordinates
Journal of Computational and Graphical Statistics (2017)
  • Philip T. Reiss
  • David L. Miller
  • Pei-Shien Wu
  • Wen-Yu Hua
Abstract
A number of classical approaches to nonparametric regression have recently been extended to the case of functional predictors. This paper introduces a new method of this type, which extends intermediate-rank penalized smoothing to scalar-on-function regression. The core idea is to regress the response on leading principal coordinates defined by a relevant distance among the functional predictors, while applying a ridge penalty. Our publicly available implementation, based on generalized additive modeling software, allows for fast optimal tuning parameter selection and for extensions to multiple functional predictors, exponential family-valued responses, and mixed-effects models. In an application to signature verification data, the proposed principal coordinate ridge regression is shown to outperform a functional generalized linear model.
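To make the core idea concrete, here is a minimal sketch (not the authors' implementation, which the abstract says is built on generalized additive modeling software): compute a distance matrix among the functional predictors, extract leading principal coordinates by classical multidimensional scaling, and ridge-regress the scalar response on those coordinates. The helper `principal_coordinates`, the toy data, the L2-type distance, and the use of scikit-learn's `RidgeCV` for tuning are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import RidgeCV

def principal_coordinates(D, k):
    """Classical MDS: leading k principal coordinates from a distance matrix D (assumed helper)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # doubly centered squared distances
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1][:k]        # keep the k largest eigenvalues
    lam = np.clip(eigvals[order], 0.0, None)     # guard against small negative eigenvalues
    return eigvecs[:, order] * np.sqrt(lam)      # n x k matrix of principal coordinates

# Toy example: functional predictors observed on a common grid, with an L2-type distance.
rng = np.random.default_rng(0)
n, grid = 100, np.linspace(0, 1, 50)
X = np.array([np.sin(2 * np.pi * (grid + rng.uniform()))
              + 0.1 * rng.standard_normal(grid.size) for _ in range(n)])
y = X.mean(axis=1) + 0.05 * rng.standard_normal(n)

D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).mean(axis=2))  # pairwise distances among curves
Z = principal_coordinates(D, k=10)                                # leading principal coordinates

model = RidgeCV(alphas=np.logspace(-4, 4, 25)).fit(Z, y)          # ridge regression on the coordinates
print("in-sample R^2:", round(model.score(Z, y), 3))
```

In the paper's setting the distance can be tailored to the predictors (e.g., dynamic time warping for signatures), tuning is handled through the generalized additive model machinery rather than cross-validated ridge, and predicting for new curves requires extending the principal coordinates to out-of-sample observations; the sketch above omits those details.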
Keywords
  • dynamic time warping
  • functional regression
  • generalized additive model
  • kernel ridge regression
  • multidimensional scaling
Publication Date
2017
Citation Information
Philip T. Reiss, David L. Miller, Pei-Shien Wu and Wen-Yu Hua. "Penalized nonparametric scalar-on-function regression via principal coordinates" Journal of Computational and Graphical Statistics (2017)
Available at: http://works.bepress.com/phil_reiss/42/