HAND GESTURE RECOGNITION USING AI/ML
Keywords:
Human-computer interaction, hand gesture detection, artificial intelligence, machine learning, convolutional and recurrent neural networks
Abstract
Hand gesture recognition is an essential component of human-computer interaction, enabling natural and intuitive non-verbal communication between users and computers. This paper offers a thorough review of recent developments in hand gesture recognition using artificial intelligence (AI) and machine learning (ML) techniques. We discuss the challenges, approaches, and applications associated with hand gesture recognition systems. The effectiveness of various AI and ML techniques, including deep learning models such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs), in identifying hand motions from camera or sensor data is investigated. Since sign language consists of continuous movements, a recognition system must capture motion information across several successive frames of video. This study describes two-way communication between deaf or mute users and hearing users: the proposed system can translate sign language into text and speech.
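The abstract describes combining a CNN (per-frame appearance) with an RNN (motion across successive frames). The sketch below is a minimal illustration of that general idea, not the authors' implementation: a small CNN extracts a feature vector from each video frame and an LSTM aggregates the sequence before classification. The framework (PyTorch), frame resolution, layer sizes, clip length, and number of gesture classes are all assumptions chosen only to make the example self-contained and runnable.

```python
# Minimal CNN-LSTM sketch for gesture recognition from short video clips.
# All hyperparameters (64x64 frames, 16-frame clips, 10 classes) are assumptions.
import torch
import torch.nn as nn

class GestureCNNLSTM(nn.Module):
    def __init__(self, num_classes=10, feature_dim=128, hidden_dim=256):
        super().__init__()
        # Per-frame feature extractor: 3-channel 64x64 frame -> feature vector
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(64, feature_dim), nn.ReLU(),
        )
        # Temporal model over the sequence of per-frame features
        self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, clips):
        # clips: (batch, time, channels, height, width)
        b, t, c, h, w = clips.shape
        feats = self.cnn(clips.reshape(b * t, c, h, w)).reshape(b, t, -1)
        _, (h_n, _) = self.lstm(feats)      # h_n: (num_layers, batch, hidden_dim)
        return self.classifier(h_n[-1])     # logits over gesture classes

if __name__ == "__main__":
    model = GestureCNNLSTM(num_classes=10)
    dummy_clips = torch.randn(2, 16, 3, 64, 64)  # 2 clips of 16 RGB frames each
    print(model(dummy_clips).shape)              # torch.Size([2, 10])
```

In practice the predicted class label would then be mapped to its corresponding word and passed to a text-to-speech engine to produce the voice output mentioned in the abstract.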