Abstract
Many deaf people worldwide face problems integrating into society and interacting with people who do not understand sign language. This can lead to isolation and difficulty in expressing feelings. In this research, our primary goal is to help deaf people communicate, express their feelings, and socialize with others. Toward that end, 40 Arabic words commonly used in social interactions were used to build a dataset of the hand movements deaf people make to express these words. These movements were recorded using a Leap Motion Controller (LMC). The resulting dataset consists of 1,579 instances and 112 features, recorded with the help of five deaf participants. Feature reduction and oversampling techniques were applied to prepare the dataset. Machine learning algorithms were then used to build a model able to classify any given hand posture or gesture as one of those 40 words. This work compared the performance of nine classification algorithms: Random Forest, Decision Table, Classification via Regression, K-Nearest Neighbor (KNN), Simple Logistic, Input Mapped Classifier, Random Tree, J48, and Bayesian Network. Results show that the Random Forest model achieved the highest accuracy, over 90%, outperforming the other eight models. Subsequently, a usability study was conducted with 10 deaf people to test the effectiveness of the proposed assistive device. The results suggest that the proposed device is useful for facilitating social communication with deaf people, and that participants preferred it over other relevant devices.
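The pipeline described above (oversampling to balance the 40 classes, feature reduction, then classification with Random Forest) can be sketched as follows. This is a minimal illustration on synthetic data standing in for the LMC recordings: the paper's actual feature-reduction and oversampling methods are not named in the abstract, so naive random oversampling and PCA are used here as placeholder choices, and all parameter values are assumptions.

```python
# Hypothetical sketch of the abstract's pipeline: oversampling + feature
# reduction + Random Forest. Synthetic data stands in for the real
# 1,579-instance, 112-feature LMC dataset over 40 Arabic words.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.utils import resample

rng = np.random.default_rng(0)
X = rng.normal(size=(1579, 112))          # stand-in LMC feature vectors
y = rng.integers(0, 40, size=1579)        # stand-in word labels (40 classes)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# Naive random oversampling: grow every class to the majority-class count.
counts = np.bincount(y_train, minlength=40)
target = counts.max()
parts_X, parts_y = [], []
for c in range(40):
    Xc = X_train[y_train == c]
    parts_X.append(resample(Xc, replace=True, n_samples=target, random_state=0))
    parts_y.append(np.full(target, c))
X_bal = np.vstack(parts_X)
y_bal = np.concatenate(parts_y)

# Feature reduction (PCA as one common choice), fit on training data only.
pca = PCA(n_components=50, random_state=0)
X_bal_red = pca.fit_transform(X_bal)
X_test_red = pca.transform(X_test)

# Random Forest, the best-performing of the nine classifiers compared.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_bal_red, y_bal)
print("test accuracy: %.3f" % clf.score(X_test_red, y_test))
```

On the real gesture features this setup would be evaluated against the other eight classifiers (e.g. via cross-validation); on the random data above, accuracy is near chance by construction.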
A Novel Social Interaction Assistive Device for Arab Deaf People