Presentation
Emotional states control for on-line game avatars
Faculty of Informatics - Papers (Archive)
  • Ce Zhan, University of Wollongong
  • Wanqing Li, University of Wollongong
  • Farzad Safaei, University of Wollongong
  • Philip Ogunbona, University of Wollongong
RIS ID
22296
Publication Date
1-1-2007
Publication Details

Zhan, C., Li, W., Safaei, F. & Ogunbona, P. (2007). Emotional states control for on-line game avatars. NetGames '07: Proceedings of the 6th ACM SIGCOMM Workshop on Network and System Support for Games (pp. 31-36). Melbourne, Australia: ACM.

Abstract
Although detailed animation has already been achieved in a number of Multi-player On-line Games (MOGs), players still have to use text commands to control the emotional states of their avatars. Several systems have been proposed that automatically recognize players' facial expressions in real time; such systems can then control avatars' emotional states by driving the MOG's "animation engine" directly, instead of relying on text commands. One of the challenges for such systems is detecting and recognizing facial components in low spatial resolution face images. In this paper, a system based on an improved version of the Viola-Jones face detection method is proposed to better serve MOGs. In addition, a robust coarse-to-fine facial landmark localization method is proposed. The proposed system was evaluated on a database different from the training database and achieved an 83% recognition rate for four emotional state expressions. The system is able to operate over a wider range of user-to-camera distances.
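The abstract does not spell out the paper's coarse-to-fine localization algorithm, but the general idea behind any coarse-to-fine search can be sketched: locate a target cheaply on a downsampled image, then refine the estimate in a small window at full resolution. The sketch below illustrates this with a simple sum-of-squared-differences template search in NumPy; the function names and parameters (`ssd_match`, `coarse_to_fine`, `factor`, `radius`) are hypothetical and not taken from the paper.

```python
import numpy as np

def ssd_match(image, template):
    """Exhaustive sum-of-squared-differences search.
    Returns the (row, col) of the best-matching top-left corner."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            d = np.sum((image[r:r + th, c:c + tw] - template) ** 2)
            if d < best:
                best, best_pos = d, (r, c)
    return best_pos

def coarse_to_fine(image, template, factor=2, radius=2):
    """Coarse pass on a subsampled image, then local refinement
    at full resolution around the mapped-back coarse estimate."""
    # Coarse stage: search a factor-subsampled image with a
    # factor-subsampled template (far fewer candidate positions).
    cr, cc = ssd_match(image[::factor, ::factor],
                       template[::factor, ::factor])
    cr, cc = cr * factor, cc * factor  # map back to full resolution
    # Fine stage: exhaustive search only in a small window
    # of +/- radius pixels around the coarse estimate.
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = np.inf, (cr, cc)
    for r in range(max(0, cr - radius), min(ih - th, cr + radius) + 1):
        for c in range(max(0, cc - radius), min(iw - tw, cc + radius) + 1):
            d = np.sum((image[r:r + th, c:c + tw] - template) ** 2)
            if d < best:
                best, best_pos = d, (r, c)
    return best_pos
```

The coarse stage cuts the number of candidate positions by roughly `factor**2` while the fine stage restores full-resolution accuracy, which is the trade-off that makes such schemes attractive for real-time use on low-resolution video.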
Citation Information
Ce Zhan, Wanqing Li, Farzad Safaei and Philip Ogunbona. "Emotional states control for on-line game avatars" (2007) pp. 31-36
Available at: http://works.bepress.com/p_ogunbona/82/