Fusing Self-Reported and Sensor Data from Mixed-Reality Training
Industrial and Manufacturing Systems Engineering Conference Proceedings and Posters
  • Trevor Richardson, Iowa State University
  • Stephen B. Gilbert, Iowa State University
  • Joseph Holub, Iowa State University
  • Frederick Thompson, Iowa State University
  • Anastacia MacAllister, Iowa State University
  • Rafael Radkowski, Iowa State University
  • Eliot Winer, Iowa State University
  • Paul Davies, The Boeing Company
  • Scott Terry, The Boeing Company
Document Type
Conference Proceeding
Publication Version
Published Version
Publication Date
1-1-2014
Conference Title
Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) 2014
Conference Date
December 1-4, 2014
Geolocation
(28.5383355, -81.3792365)
Abstract

Military and industrial use of smaller, more accurate sensors is allowing increasing amounts of data to be acquired at diminishing cost during training. Traditional human-subject testing often collects qualitative data from participants through self-reported questionnaires. This qualitative information is valuable but often insufficient for assessing training outcomes. Quantitative information such as motion-tracking data, communication frequency, and heart rate can supply the missing pieces in training outcome assessment. Successful fusion and analysis of qualitative and quantitative information sources is necessary for collaborative, mixed-reality, and augmented-reality training to reach its full potential. The challenge is establishing a reliable framework for combining these multiple types of data. Methods were developed to analyze data acquired during a formal user study assessing the use of augmented reality as a delivery mechanism for digital work instructions. A between-subjects experiment compared a desktop computer, a mobile tablet, and a mobile tablet with augmented reality as delivery methods for these instructions. Study participants were asked to complete a multi-step technical assembly. Participants’ head position and orientation were tracked with an infrared tracking system. User interaction, in the form of interface button presses, was recorded and time-stamped at each step of the assembly. A trained observer took notes on task performance during the study through a set of camera views recording the work area. Finally, each participant completed pre- and post-surveys involving self-reported evaluation. The combination of quantitative and qualitative data revealed trends, such as the most difficult tasks on each device, that would have been impossible to identify from self-reporting alone. This paper describes the methods developed to fuse the qualitative data with the quantitative measurements recorded during the study.
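
As an illustration of the kind of fusion the abstract describes, the sketch below (not the authors' implementation) aligns hypothetical time-stamped head-tracking samples with a hypothetical button-press log and joins per-step self-reported difficulty ratings using the pandas library; all column names and values are invented for demonstration.

    import pandas as pd

    # Hypothetical button-press log: each press marks the start of a step.
    steps = pd.DataFrame({
        "t": pd.to_datetime(["2014-12-01 10:00:00", "2014-12-01 10:01:00"]),
        "step": [1, 2],
    }).sort_values("t")

    # Hypothetical head-tracking samples from the infrared system.
    tracking = pd.DataFrame({
        "t": pd.to_datetime(["2014-12-01 10:00:05", "2014-12-01 10:00:30",
                             "2014-12-01 10:01:10"]),
        "head_x": [0.12, 0.15, 0.31],
    }).sort_values("t")

    # Assign each sensor sample to the step in progress at that moment
    # (the most recent button press at or before the sample time).
    fused = pd.merge_asof(tracking, steps, on="t", direction="backward")

    # Aggregate sensor data per step, then join per-step self-reported
    # difficulty (a hypothetical 1-5 survey item) for side-by-side analysis.
    per_step = fused.groupby("step")["head_x"].agg(["mean", "count"])
    survey = pd.Series({1: 2, 2: 5}, name="difficulty_rating")
    print(per_step.join(survey))

Time-stamp alignment of this sort is one straightforward way to put per-step sensor aggregates next to per-step survey responses, which is the kind of comparison (e.g., most difficult tasks per device) the study reports.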

Comments

This proceeding is published as Richardson, T., Gilbert, S., Holub, J., Thompson, F., MacAllister, A., Radkowski, R., Winer, E., Davies, P., & Terry, S. (2014). "Fusing Self-Reported and Sensor Data from Mixed-Reality Training," The Interservice/Industry Training, Simulation & Education Conference (I/ITSEC). Paper No. 14158. Posted with permission.

Copyright Owner
The Authors
Language
en
File Format
application/pdf
Citation Information
Trevor Richardson, Stephen B. Gilbert, Joseph Holub, Frederick Thompson, et al. "Fusing Self-Reported and Sensor Data from Mixed-Reality Training," Orlando, FL (2014).
Available at: http://works.bepress.com/stephen_b_gilbert/51/