Hazri, Muhammad Hazmirul Haqim (2024) Analysis of fingerspelling recognition prototype accuracy using sensor-based signal. Project Report. Universiti Teknikal Malaysia Melaka, Melaka, Malaysia. (Submitted)
Full text: Analysis of fingerspelling recognition prototype accuracy using sensor-based signal.pdf (3 MB)
Abstract
Individuals with hearing and speech impairments often rely on sign language, which uses specific hand gestures combined with facial expressions to communicate. These gestures are precise movements of the hands, forming the basis of the sign languages used by the deaf community. Current research aims to translate these hand gestures into spoken language using electronic devices, bridging the communication gap between deaf and hearing individuals. This study presents a prototype glove equipped with one accelerometer and five flex sensors, designed to recognize fingerspelling and convert it into letters of the alphabet. The glove is fitted with a flex sensor along the length of each finger, capturing the movement and bend of that finger. These sensors measure changes in resistance caused by finger bends, while the accelerometer tracks the hand's movement for the letters "J" and "Z". The collected data is processed by a microcontroller, which then transmits the result to an OLED display. This system allows deaf individuals to make various hand gestures while wearing the glove, which are then translated into the corresponding letters for easy recognition by hearing individuals. The flex sensors play a crucial role in accurately detecting finger movements, and integrating their data with the accelerometer readings provides a comprehensive picture of each hand gesture, enabling precise conversion into letters. This project demonstrates the potential of sensor-based gloves to facilitate communication for the deaf and hard-of-hearing communities, enhancing their ability to interact with the hearing world. In the project's first characterization, accuracy was 62% user-based and 63% alphabet-based; after improvements, the final characterization reached 76% for both user-based and alphabet-based accuracy.
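The matching step the abstract describes (flex-sensor bend values mapped to a letter, with the accelerometer reserved for the moving letters "J" and "Z") can be illustrated with a minimal nearest-template sketch. All names and template values below are hypothetical illustrations, not the thesis's actual calibration data or firmware.

```python
# Hypothetical sketch of the glove's letter-matching step: five flex-sensor
# readings (normalized 0.0 = finger straight, 1.0 = fully bent) are compared
# against per-letter templates, and the closest template wins. Static letters
# only; "J" and "Z" would additionally need the accelerometer trace.

# Illustrative templates: (thumb, index, middle, ring, pinky) bend fractions.
TEMPLATES = {
    "A": (0.2, 1.0, 1.0, 1.0, 1.0),  # fist with thumb alongside
    "B": (0.9, 0.0, 0.0, 0.0, 0.0),  # fingers straight, thumb folded in
    "L": (0.0, 0.0, 1.0, 1.0, 1.0),  # thumb and index extended
}

def classify(reading):
    """Return the letter whose template is nearest (squared Euclidean
    distance) to the normalized five-sensor reading."""
    def dist(template):
        return sum((r - t) ** 2 for r, t in zip(reading, template))
    return min(TEMPLATES, key=lambda letter: dist(TEMPLATES[letter]))

print(classify((0.1, 0.05, 0.9, 0.95, 1.0)))  # → L
```

On the actual hardware, the microcontroller would derive the normalized readings from each flex sensor's voltage-divider ADC value before this matching step, then send the winning letter to the OLED display.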
Item Type: Final Year Project (Project Report)
Uncontrolled Keywords: Hand gesture, Flex sensor, OLED display
Subjects: T Technology > T Technology (General); T Technology > TK Electrical engineering. Electronics. Nuclear engineering
Divisions: Library > Final Year Project > FTKEK
Depositing User: Norfaradilla Idayu Ab. Ghafar
Date Deposited: 21 Oct 2024 06:52
Last Modified: 20 Nov 2024 00:30
URI: http://digitalcollection.utem.edu.my/id/eprint/33799