     Research Journal of Applied Sciences, Engineering and Technology


Arabic Sign Language Recognition Using Kinect Sensor

1Abdel-Gawad Abdel-Rabouh Abdel-Samie, 1F.A. Elmisery, 1Ayman M. Brisha and 2Ahmed H. Khalil
1Department of Electronics Technology, Faculty of Industrial Education, Beni-Suef University, Beni-Suef, Egypt
2Department of Electronics and Communications, Faculty of Engineering, Cairo University, Giza, Egypt
Research Journal of Applied Sciences, Engineering and Technology, 2018, 15(2): 57-67
http://dx.doi.org/10.19026/rjaset.15.5292  |  © The Author(s) 2018
Received: August 8, 2017  |  Accepted: September 14, 2017  |  Published: February 15, 2018

Abstract

This study introduces a real-time system for automatic Arabic sign language recognition based on the Dynamic Time Warping (DTW) matching algorithm. Communication between humans and machines, or between people, can be carried out through gestures known as sign language. The aim of sign language recognition is to provide an accurate and convenient mechanism for transcribing sign gestures into meaningful text or speech, so that communication between the deaf and the hearing community can take place easily. In this study we introduce a translator based on DTW, in which each signed word is matched against a database and the text and corresponding pronunciation of the incoming sign are then displayed. We use Microsoft's Kinect sensor to capture the signs. We built our dataset from a large set of samples covering a dictionary of 30 isolated-word signs, recorded in-house from the standard Arabic sign language. The system operates in several modes, including online, signer-dependent and signer-independent modes, and allows the signer to perform signs freely and naturally. Experimental results on the collected real Arabic sign language data show that the presented system achieves a higher recognition rate than comparable systems in all modes: 97.58% for the signer-dependent online case and 95.25% for the signer-independent online case.
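The paper does not publish its implementation, but the core matching step it describes — comparing an incoming sign against a database of templates and picking the closest one — can be sketched with classic DTW. The sketch below assumes each sign has already been reduced to a 1-D feature sequence (the actual system works on Kinect skeleton data, so the per-frame distance would be computed over joint coordinates instead of scalars); the function and label names are illustrative only.

```python
def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = minimum accumulated cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # per-frame distance
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]

def classify(sign, templates):
    """Return the label of the database template nearest to the incoming sign."""
    return min(templates, key=lambda label: dtw_distance(sign, templates[label]))
```

For example, `classify(captured_sequence, {"hello": seq1, "thanks": seq2, ...})` would return the dictionary word whose stored template warps onto the captured sequence at the lowest cost, which is the decision rule a DTW-based translator of this kind relies on.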

Keywords:

Arabic sign language recognition (ArSL), DTW, Kinect, Microsoft Visual Studio, real-time



Competing interests

The authors have no competing interests.

Open Access Policy

This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Copyright

© The Author(s) 2018.

ISSN (Online):  2040-7467
ISSN (Print):   2040-7459