Emotional states control for on-line game avatars
Faculty of Informatics - Papers (Archive)
Abstract
Although detailed animation has already been achieved in a number of Multi-player On-line Games (MOGs), players must use text commands to control the emotional states of their avatars. Systems have been proposed that perform real-time automatic facial expression recognition of players. Such systems can then be used to control avatars' emotional states by driving the MOG's "animation engine" in place of text commands. One of the challenges for such systems is detecting and recognizing facial components in low-spatial-resolution face images. In this paper, a system based on an improved version of the Viola-Jones face detection method is proposed to serve MOGs better. In addition, a robust coarse-to-fine facial landmark localization method is proposed. The proposed system was evaluated on a database different from the training database and achieved an 83% recognition rate for 4 emotional state expressions. The system is able to operate over a wider range of user-to-camera distances.
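The Viola-Jones detector mentioned above owes its speed to the integral image (summed-area table), which lets any rectangular pixel sum, and therefore any Haar-like feature, be evaluated in constant time. As a minimal illustrative sketch (not the authors' implementation), this can be computed as:

```python
def integral_image(img):
    """Build a summed-area table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, top, left, bottom, right):
    """Sum over the inclusive rectangle using at most four table lookups."""
    total = ii[bottom][right]
    if top > 0:
        total -= ii[top - 1][right]
    if left > 0:
        total -= ii[bottom][left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1][left - 1]
    return total

if __name__ == "__main__":
    img = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
    ii = integral_image(img)
    # A two-rectangle Haar-like feature: left column sum minus right column sum.
    print(rect_sum(ii, 0, 0, 2, 0) - rect_sum(ii, 0, 2, 2, 2))
```

Because each feature costs only a handful of lookups regardless of window size, a cascade of such features can scan many candidate windows quickly, which is what makes real-time operation at varying user-to-camera distances plausible.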
Citation Information
Ce Zhan, Wanqing Li, Farzad Safaei and Philip Ogunbona. "Emotional states control for on-line game avatars" (2007), pp. 31-36.
Available at: http://works.bepress.com/p_ogunbona/82/