Big data features not only large volumes of data but also data with complicated structures. Complexity imposes unique challenges on big data analytics. Meeker and Hong (2014; Quality Engineering, pp. 102–116) provided an extensive discussion of the opportunities and challenges of big data and reliability, and they described engineering systems that generate big data usable in reliability analysis. Meeker and Hong (2014) focused on large-scale system operating and environmental data (i.e., high-frequency multivariate time series data) and provided examples of how to link such data as covariates to traditional reliability responses such as time to failure, time to recurrence of events, and degradation measurements. This article extends that discussion by focusing on how data with complicated structures can be used in reliability analysis. Such data types include high-dimensional sensor data, functional curve data, and image streams. We first review recent developments in these directions and then discuss how analytical methods can be developed to tackle the challenges arising from the complex features of big data in reliability applications. The use of modern statistical methods such as variable selection, functional data analysis, scalar-on-image regression, spatio-temporal data models, and machine-learning techniques is also discussed.
Available at: http://works.bepress.com/wqmeeker/157/