Hand Gesture Recognition for Physical Impairment Peoples

International Journal of Computer Science and Engineering
© 2017 by SSRG - IJCSE Journal
Volume 4 Issue 10
Year of Publication : 2017
Authors : P.Dhivya Bharathy, P.Preethi, K.Karthick, S.Sangeetha

How to Cite?

P.Dhivya Bharathy, P.Preethi, K.Karthick, S.Sangeetha, "Hand Gesture Recognition for Physical Impairment Peoples," SSRG International Journal of Computer Science and Engineering, vol. 4, no. 10, pp. 6-10, 2017. Crossref, https://doi.org/10.14445/23488387/IJCSE-V4I10P102


Physically disabled and mentally challenged people are an important part of our society who have not yet received the same opportunities as others for inclusion in the Information Society. It is therefore necessary to develop easily accessible computer systems to achieve their inclusion in new technologies. This paper presents a project whose objective is to bring disabled people closer to new technologies. It presents a vision-based user interface designed to achieve computer accessibility for users with motor impairments. The interface automatically finds the user's face and tracks it over time to recognize gestures within the face region in real time, and it also implements a vision-based hand gesture recognition system for a natural human-computer interface. Hand tracking and segmentation are the primary steps of any hand gesture recognition system. The aim of this project is to develop a robust and efficient hand segmentation algorithm; three segmentation algorithms using different color spaces with suitable thresholds were utilized. The hand tracking and segmentation algorithm is found to be the most effective at handling the challenges of vision-based systems, such as skin color detection, complex background removal, and variable lighting conditions. The segmented image may sometimes contain noise due to a dynamic background, so a tracking algorithm was developed and applied to the segmented hand contour to remove unwanted background noise.
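The color-space thresholding step described above can be illustrated with a minimal sketch. The abstract does not give the paper's exact color spaces or threshold values, so the YCrCb conversion (standard ITU-R BT.601 coefficients) and the Cr/Cb ranges below are assumptions, chosen as values commonly used for skin detection, not the authors' actual parameters.

```python
import numpy as np

def rgb_to_ycrcb(img):
    """Convert an H x W x 3 RGB image (0-255) to YCrCb (BT.601)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = (r - y) * 0.713 + 128.0   # red-difference chroma, offset to 0-255
    cb = (b - y) * 0.564 + 128.0   # blue-difference chroma, offset to 0-255
    return np.stack([y, cr, cb], axis=-1)

def skin_mask(img, cr_range=(133, 173), cb_range=(77, 127)):
    """Return a boolean mask of pixels whose chroma falls in a skin-like
    range. The threshold ranges are illustrative assumptions, not the
    paper's tuned values."""
    ycrcb = rgb_to_ycrcb(img.astype(np.float64))
    cr, cb = ycrcb[..., 1], ycrcb[..., 2]
    return ((cr >= cr_range[0]) & (cr <= cr_range[1]) &
            (cb >= cb_range[0]) & (cb <= cb_range[1]))
```

Working in a chroma plane (Cr/Cb here; HSV or normalized RGB are the other common choices) makes the mask less sensitive to brightness changes, which is why segmenting in such color spaces helps under the variable lighting conditions the abstract mentions.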


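One common way to realize the background-noise removal described above is to keep only the largest connected region of the segmented mask, on the assumption that the hand is the dominant blob and smaller blobs are dynamic-background noise. The abstract does not specify the authors' tracking algorithm, so this connected-component filter is an illustrative stand-in, not their method.

```python
import numpy as np

def largest_component(mask):
    """Keep only the largest 4-connected True region of a binary mask.

    Sketch of the noise-removal idea: after segmentation, small
    spurious blobs are discarded and only the dominant (hand-sized)
    region is retained.
    """
    h, w = mask.shape
    labels = np.zeros((h, w), dtype=np.int32)
    sizes = {}
    current = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                stack = [(i, j)]        # iterative flood fill
                labels[i, j] = current
                count = 0
                while stack:
                    y, x = stack.pop()
                    count += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            stack.append((ny, nx))
                sizes[current] = count
    if not sizes:
        return np.zeros_like(mask)
    best = max(sizes, key=sizes.get)
    return labels == best
```

In a real pipeline one would pair this with temporal tracking (e.g. predicting the hand region from the previous frame) so that a large background blob entering the scene is not mistaken for the hand.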

