Variance estimation after Kernel Ridge Regression Imputation
Statistics Conference Proceedings, Presentations and Posters
  • Hengfang Wang, Iowa State University
  • Jae Kwang Kim, Iowa State University
Document Type
Presentation
Conference
37th International Conference on Machine Learning
Publication Date
1-1-2020
Conference Title
The Art of Learning with Missing Values
Conference Date
July 17, 2020
Abstract

Imputation is a popular technique for handling missing data, and variance estimation after imputation is an important practical problem in statistics. In this paper, we consider variance estimation of the imputed mean estimator under kernel ridge regression imputation. We propose a linearization approach that employs the covariate balancing idea to estimate the inverse of the propensity scores. The statistical guarantee of the proposed variance estimator is studied when a Sobolev space is used for the imputation, in which case √n-consistency can be obtained. Synthetic data experiments are presented to confirm our theory.
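
As a rough illustration of the setup described in the abstract, the sketch below imputes missing outcomes with kernel ridge regression and forms the imputed mean estimator together with a linearization-type variance estimate. The Gaussian kernel, the penalty value, and the logistic-regression plug-in for the inverse propensity scores are assumptions made here for clarity only; the paper instead estimates the inverse propensity scores via a covariate-balancing condition in a Sobolev (RKHS) setting.

```python
# Minimal sketch of kernel ridge regression (KRR) imputation for a mean.
# Not the authors' implementation: kernel, penalty, and propensity step are
# simplifying assumptions chosen for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# --- Synthetic data with outcomes missing at random given x ----------------
n = 500
x = rng.uniform(-2, 2, size=(n, 1))
y_full = np.sin(np.pi * x[:, 0]) + rng.normal(scale=0.3, size=n)
p = 1.0 / (1.0 + np.exp(-(0.5 + 0.5 * x[:, 0])))   # response probability
delta = rng.binomial(1, p)                          # 1 = observed, 0 = missing

# --- Kernel ridge regression fitted on the respondents ---------------------
def gaussian_kernel(a, b, bandwidth=0.5):
    d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-d2 / (2 * bandwidth**2))

lam = 1e-2                                          # ridge penalty (assumed)
x_obs, y_obs = x[delta == 1], y_full[delta == 1]
K = gaussian_kernel(x_obs, x_obs)
alpha = np.linalg.solve(K + len(y_obs) * lam * np.eye(len(y_obs)), y_obs)
m_hat = gaussian_kernel(x, x_obs) @ alpha           # fitted regression for all units

# --- Imputed mean estimator -------------------------------------------------
y_imp = np.where(delta == 1, y_full, m_hat)
theta_hat = y_imp.mean()

# --- Linearization-type variance estimate (simplified) ----------------------
# The paper obtains the inverse propensity scores from a covariate-balancing
# condition; a plain logistic-regression plug-in is used here as a stand-in.
pi_hat = LogisticRegression().fit(x, delta).predict_proba(x)[:, 1]
eta = m_hat + delta * (y_imp - m_hat) / pi_hat - theta_hat
var_hat = np.sum(eta**2) / n**2

print(f"imputed mean = {theta_hat:.3f}, variance estimate = {var_hat:.3g}")
```
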

Comments

This paper was presented at the first Workshop on the Art of Learning with Missing Values (Artemiss) hosted by the 37th International Conference on Machine Learning (ICML), July 17, 2020. Posted with permission.

Copyright Owner
The Authors
Language
en
File Format
application/pdf
Citation Information
Hengfang Wang and Jae Kwang Kim. "Variance estimation after Kernel Ridge Regression Imputation" (2020)
Available at: http://works.bepress.com/jae-kwang-kim/70/