Australian sign language recognition
Holden, E-J, Lee, G. and Owens, R. (2005) Australian sign language recognition. Machine Vision and Applications, 16 (5). pp. 312-320.
This paper presents an automatic Australian sign language (Auslan) recognition system, which tracks multiple target objects (the face and hands) throughout an image sequence and extracts features for the recognition of sign phrases. Tracking is performed using correspondences of simple geometrical features between the target objects in the current and previous frames. In signing, the face and a hand of the signer often overlap, so the system must segment these regions for feature extraction. Our system handles occlusion of the face by a hand by detecting the contour of the foreground moving object using a combination of motion cues and the snake algorithm. To represent signs, features that are invariant to scaling, 2D rotation and signing speed are used for recognition. The features capture the relative geometrical positioning and shapes of the target objects, as well as their directions of motion. These are used to recognise Auslan phrases with Hidden Markov Models. Experiments were conducted on 163 test sign phrases with varying grammatical formations. Using a known grammar, the system achieved a recognition rate of over 97% at the sentence level and 99% at the word level.
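The recognition step described above scores a sequence of extracted features against per-sign Hidden Markov Models and picks the best-scoring model. A minimal sketch of that idea is shown below using the scaled forward algorithm for discrete-output HMMs; the two-state toy models, the 3-symbol feature alphabet, and the sign names `hello`/`thanks` are illustrative assumptions, not the paper's actual models or vocabulary.

```python
import math

def forward_log_likelihood(pi, A, B, obs):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete-output HMM.

    pi: initial state probabilities, A: state-transition matrix,
    B: per-state emission probabilities over the symbol alphabet.
    """
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    s = sum(alpha)
    log_like = math.log(s)
    alpha = [a / s for a in alpha]
    for o in obs[1:]:
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][o]
                 for i in range(n)]
        s = sum(alpha)  # rescale each step to avoid numerical underflow
        log_like += math.log(s)
        alpha = [a / s for a in alpha]
    return log_like

def recognise(models, obs):
    """Return the sign whose HMM assigns the feature sequence
    the highest likelihood."""
    return max(models, key=lambda sign: forward_log_likelihood(*models[sign], obs))

# Toy two-state models over a 3-symbol feature alphabet (illustrative only).
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.3, 0.7]]
models = {
    "hello":  (pi, A, [[0.8, 0.1, 0.1], [0.6, 0.2, 0.2]]),  # favours symbol 0
    "thanks": (pi, A, [[0.1, 0.1, 0.8], [0.2, 0.2, 0.6]]),  # favours symbol 2
}
```

In the paper's setting, the observation symbols would come from quantised invariant features (relative positions, hand shapes, motion directions), and a grammar would constrain which sign sequences are considered at the sentence level.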
Publication Type: Journal Article
Murdoch Affiliation: School of Engineering Science