Abstract
People with speech disorders often communicate through sign language and other special gestures, but the people around them may not understand what those gestures mean. The research described in this article aims to provide an assistive device that helps such people communicate by translating their gestures into spoken words that others can understand. The proposed device is an electronic glove worn on the hand. It employs an MPU6050 six-degree-of-freedom accelerometer/gyroscope to continuously monitor hand orientation and movement, plus a potentiometer on each finger to monitor changes in finger posture. The signals from the MPU6050 and the potentiometers are routed to an Arduino board, where they are processed to determine the meaning of each gesture; the gesture is then voiced using audio streams stored on an SD memory card. The audio output drives a speaker, allowing the listener to understand each gesture. We built a database with the help of 10 deaf people who cannot speak: we asked them to wear the glove while performing a set of 40 Arabic sign language words and recorded the resulting data stream from the glove. That data was then used to train seven different learning algorithms. The Decision Tree algorithm achieved the highest accuracy, 98%. A usability study was then conducted to determine the usefulness of the assistive device in real time.
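The pipeline described above — an 11-channel feature vector (five finger potentiometers plus the MPU6050's three accelerometer and three gyroscope axes) classified by a decision tree, with the predicted gesture mapped to an audio file on the SD card — can be sketched as follows. This is a minimal illustrative stand-in, not the paper's trained model: the thresholds, gesture names, and file names are hypothetical, and the real device learns its tree from the recorded training data rather than using hand-written rules.

```python
# Illustrative sketch of the glove's classification step.
# All thresholds, gesture labels, and file names below are hypothetical.

# A glove sample: 5 finger potentiometer readings (0-1023),
# followed by 3 accelerometer axes (g) and 3 gyroscope axes (deg/s)
# from the MPU6050 -- 11 channels in total.
FINGERS = slice(0, 5)

def classify(sample):
    """Toy decision tree over the 11-channel feature vector."""
    thumb, index, middle, ring, pinky = sample[FINGERS]
    accel_x = sample[5]
    if index > 700 and middle > 700:   # index and middle strongly bent
        return "yes"
    if thumb > 700 and index < 300:    # thumb bent, index extended
        return "no"
    if accel_x > 0.5:                  # palm tilted forward
        return "hello"
    return "unknown"

# Map each gesture label to an audio clip stored on the SD card.
AUDIO = {g: f"{g}.wav" for g in ("yes", "no", "hello", "unknown")}

def speak(sample):
    """Return the audio file to play for one glove sample."""
    return AUDIO[classify(sample)]
```

In the actual device, the tree's split thresholds would come from training on the recorded dataset of 40 Arabic sign language words, and `speak` would trigger playback through the speaker instead of returning a file name.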
A Novel Assistive Glove to Convert Arabic Sign Language into Speech