Character recognition through LIP reading
By: Altejar, Rachelle M
Publisher: Cebu City : CIT-U, 2011
DDC classification: T Al792 2011
Summary: Every human being with vision uses lip reading to improve the perception of speech. Character recognition has been studied and implemented by different researchers through different approaches. This paper describes a real-time approach to recognizing characters based on lip-reading, to provide an efficient way of communicating for hearing-impaired people and to contribute another technique of character recognition. In this study, we first extract the speaker's lip contours by HSLFiltering from AForge together with median filtering to eliminate noise. A curve is then drawn through the contour points using cubic spline interpolation. The lips are extracted from the human face captured by a web camera and are represented by shape parameters that describe the lip boundary. A neural network, the back-propagation model, is implemented for letter recognition. This study is limited to recognizing the vowel letters (A, E, I, O, U) through lip-reading.
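The extraction pipeline in the summary (colour-based lip segmentation, median filtering, then a cubic spline through the contour points) can be sketched outside AForge.NET. The snippet below is a minimal Python/OpenCV analogue, not the author's C# implementation; the hue/saturation thresholds, the median kernel size, and the 40-point resampling are illustrative assumptions.

```python
# Analogous sketch of the lip-extraction step (the thesis uses AForge.NET's
# HSLFiltering and Median filters in C#). Thresholds and sample counts are assumed.
import cv2
import numpy as np
from scipy.interpolate import CubicSpline

def extract_lip_contour(frame_bgr, n_samples=40):
    # 1. Colour filtering: keep reddish, fairly saturated pixels (lip-like colours).
    #    OpenCV's HLS space carries the same components as AForge's HSL.
    hls = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HLS)
    h, l, s = cv2.split(hls)
    mask = (((h < 15) | (h > 165)) & (s > 60)).astype(np.uint8) * 255  # assumed thresholds

    # 2. Median filtering to remove salt-and-pepper noise left by the colour filter.
    mask = cv2.medianBlur(mask, 5)

    # 3. Take the largest connected contour as the lip boundary.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    lip = max(contours, key=cv2.contourArea).squeeze(1)  # N x 2 array of (x, y)
    if len(lip) < 8:
        return None

    # 4. Close the contour and fit periodic cubic splines through its points,
    #    then resample the smooth curve into a fixed set of shape parameters.
    pts = np.vstack([lip, lip[:1]]).astype(float)        # repeat first point to close the loop
    t = np.linspace(0.0, 1.0, len(pts))
    sx = CubicSpline(t, pts[:, 0], bc_type="periodic")
    sy = CubicSpline(t, pts[:, 1], bc_type="periodic")
    ts = np.linspace(0.0, 1.0, n_samples, endpoint=False)
    return np.stack([sx(ts), sy(ts)], axis=1)

# Usage on a single webcam frame (assumed device index 0):
# ok, frame = cv2.VideoCapture(0).read()
# curve = extract_lip_contour(frame)
```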
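Likewise, the back-propagation model mentioned for vowel recognition can be sketched as a small one-hidden-layer network in NumPy. The 80-dimensional shape-parameter input (40 spline samples x 2 coordinates), the 20 hidden units, and the learning rate are assumptions; the summary does not state the thesis's actual network configuration.

```python
# Minimal back-propagation network sketch for classifying the five vowels
# from lip shape parameters. Layer sizes and hyper-parameters are assumed.
import numpy as np

VOWELS = ["A", "E", "I", "O", "U"]

class BackPropNet:
    def __init__(self, n_in=80, n_hidden=20, n_out=5, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(self, x):
        self.h = self._sigmoid(x @ self.w1 + self.b1)        # hidden activations
        self.o = self._sigmoid(self.h @ self.w2 + self.b2)   # output activations
        return self.o

    def train_step(self, x, target):
        # target is a one-hot vector over the five vowels
        o = self.forward(x)
        d_out = (o - target) * o * (1 - o)                   # output delta (squared error)
        d_hid = (d_out @ self.w2.T) * self.h * (1 - self.h)  # delta propagated back
        self.w2 -= self.lr * np.outer(self.h, d_out)
        self.b2 -= self.lr * d_out
        self.w1 -= self.lr * np.outer(x, d_hid)
        self.b1 -= self.lr * d_hid

    def predict(self, x):
        return VOWELS[int(np.argmax(self.forward(x)))]
```

A training loop would repeatedly call train_step on labelled shape-parameter vectors for the five vowels, then predict on parameters extracted from live webcam frames.
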
| Item type | Current location | Home library | Call number | Status | Date due | Barcode | Item holds |
|---|---|---|---|---|---|---|---|
| | COLLEGE LIBRARY | COLLEGE LIBRARY | T Al792 2011 | Available | | T1657 | |