Hand posture analysis for visual-based human-machine interface
Faculty of Engineering - Papers (Archive)
  • Abdoleh Chalechale, University of Wollongong
  • Farzad Safaei, University of Wollongong
  • Golshah Naghdy, University of Wollongong
  • Prashan Premaratne, University of Wollongong
Publication Details

Chalechale, A., Safaei, F., Naghdy, G. & Premaratne, P. (2005). Hand posture analysis for visual-based human-machine interface. In B. Lovell & A. Maeder (Eds.), WDIC 2005 APRS Workshop on Digital Image Computing (pp. 91-96). Queensland: The Australian Pattern Recognition Society.

This paper presents a new scheme for hand posture selection and recognition based on statistical classification. It has applications in telemedicine, virtual reality, computer games, and sign language studies. The focus is placed on (1) how to select an appropriate set of postures with a satisfactory level of discrimination power, and (2) a comparison of geometric and moment invariant properties for recognizing hand postures. We introduce cluster-property and cluster-feature matrices to ease posture selection and to evaluate different posture characteristics. Simple and fast decision functions are derived for classification, which expedite the on-line decision-making process. Experimental results confirm the efficacy of the proposed scheme, with a compact set of geometric features yielding a recognition rate of 98.8%.
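The two feature families the abstract compares can be illustrated concretely. The sketch below (not the paper's actual feature set or classifier, which are not reproduced here) computes a few simple geometric descriptors and the first two Hu moment invariants from a binary hand silhouette, then classifies with a minimal nearest-centroid rule, one plain form of statistical classification with a fast linear-time decision function. All function names and the specific features chosen are illustrative assumptions.

```python
import numpy as np

def geometric_features(mask):
    """Simple geometric descriptors of a binary silhouette (illustrative;
    not the feature set used in the paper)."""
    ys, xs = np.nonzero(mask)
    area = mask.sum()
    h = ys.max() - ys.min() + 1          # bounding-box height
    w = xs.max() - xs.min() + 1          # bounding-box width
    aspect = w / h                        # bounding-box aspect ratio
    extent = area / (w * h)               # fill ratio of the bounding box
    return np.array([area, aspect, extent], dtype=float)

def hu_moments(mask):
    """First two Hu moment invariants, built from normalized central moments."""
    ys, xs = np.nonzero(mask)
    m00 = len(xs)                         # zeroth moment = area in pixels
    x, y = xs - xs.mean(), ys - ys.mean() # centre the coordinates
    def eta(p, q):                        # normalized central moment
        return ((x ** p) * (y ** q)).sum() / m00 ** (1 + (p + q) / 2)
    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    h1 = n20 + n02
    h2 = (n20 - n02) ** 2 + 4 * n11 ** 2
    return np.array([h1, h2])

def nearest_centroid(train, labels, sample):
    """Minimal statistical classifier: assign the sample to the nearest class mean."""
    classes = sorted(set(labels))
    cents = {c: np.mean([f for f, l in zip(train, labels) if l == c], axis=0)
             for c in classes}
    return min(classes, key=lambda c: np.linalg.norm(sample - cents[c]))

# Toy usage: distinguish a square-ish blob from a tall one by geometric features.
sq = np.zeros((10, 10)); sq[2:8, 2:8] = 1     # square silhouette
tall = np.zeros((10, 10)); tall[1:9, 4:6] = 1 # tall, narrow silhouette
train = [geometric_features(sq), geometric_features(tall)]
probe = np.zeros((10, 10)); probe[1:7, 1:7] = 1
print(nearest_centroid(train, ["square", "tall"], geometric_features(probe)))
```

A centroid-based decision function of this kind needs only one distance computation per class, which is the sort of cheap on-line decision rule the abstract alludes to.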
Link to publisher version (URL)
APRS Workshop on Digital Image Computing
Citation Information
Abdoleh Chalechale, Farzad Safaei, Golshah Naghdy and Prashan Premaratne. "Hand posture analysis for visual-based human-machine interface" (2005)
Available at: