Abstract
A sign language recognition system must use information from both global features, such as hand movement and location, and local features, such as hand shape and orientation. In this paper, we present a local feature recognizer suitable for a sign language recognition system. Our basic approach is to represent the hand images extracted from sign-language images as symbols, each corresponding to a cluster produced by a clustering technique. The clusters are created from a training set of extracted hand images so that hand images of similar appearance are classified into the same cluster in an eigenspace. The experimental results indicate that our system can recognize a sign language word even in cases involving two hands or hand-to-hand contact.
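The following is a minimal sketch of the eigenspace-clustering idea described above, assuming extracted hand images of a fixed size, PCA as the eigenspace projection, and k-means as the clustering technique; the specific algorithms, function names, and parameters are illustrative assumptions, not the paper's stated implementation.

```python
# Sketch: map extracted hand images to discrete cluster symbols on an eigenspace.
# Assumptions: PCA for the eigenspace, k-means for clustering, fixed-size grayscale inputs.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def build_symbolizer(train_hand_images, n_components=20, n_clusters=32, seed=0):
    """Fit an eigenspace (PCA) on training hand images and cluster their projections.

    train_hand_images: array of shape (n_samples, height, width), grayscale.
    Returns the fitted PCA, the fitted KMeans, and the symbol (cluster) labels
    assigned to the training images.
    """
    X = train_hand_images.reshape(len(train_hand_images), -1).astype(np.float64)
    pca = PCA(n_components=n_components).fit(X)   # eigenspace from the training set
    Z = pca.transform(X)                          # project images onto the eigenspace
    km = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10).fit(Z)
    return pca, km, km.labels_

def hand_image_to_symbol(hand_image, pca, km):
    """Map one newly extracted hand image to its cluster index (a discrete symbol)."""
    z = pca.transform(hand_image.reshape(1, -1).astype(np.float64))
    return int(km.predict(z)[0])
```

In this sketch, the sequence of symbols produced for successive frames would serve as the local-feature input to the word-level recognizer; the number of eigenvectors and clusters shown here are placeholders.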
| Original language | English |
|---|---|
| Pages (from-to) | 849-853 |
| Number of pages | 5 |
| Journal | Proceedings - International Conference on Pattern Recognition |
| Volume | 15 |
| Issue number | 4 |
| Publication status | Published - 2000 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- Computer Vision and Pattern Recognition