Real Time Multiple Hand Gesture Recognition System for Human Computer Interaction
Authors: Siddharth S. Rautaray, Anupam Agrawal
Journal: International Journal of Intelligent Systems and Applications (IJISA)
Issue: Vol. 4, No. 5, 2012.
Free access
With the increasing use of computing devices in day-to-day life, the need for user-friendly interfaces has led to the evolution of different types of interfaces for human-computer interaction. Real-time vision-based hand gesture recognition affords users the ability to interact with computers in more natural and intuitive ways. The direct use of hands as an input device is attractive because hands can convey far more information than mice, joysticks and similar devices, enabling recognition systems that serve a variety of human-computer interaction applications. The gesture recognition system consists of three main modules: hand segmentation, hand tracking, and gesture recognition from hand features. The designed system is further integrated with applications such as an image browser and a virtual game, demonstrating its possibilities for human-computer interaction. Computer-vision-based systems have the potential to provide more natural, non-contact solutions. The present research work focuses on designing and developing a practical framework for real-time hand gesture recognition.
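The abstract outlines a three-module pipeline (hand segmentation, hand tracking, gesture recognition from hand features). The sketch below is only an illustration of such a pipeline, not the authors' implementation: it assumes OpenCV 4.x in Python, a fixed HSV skin-colour range, largest-contour tracking, and a convexity-defect finger-count heuristic, all of which are choices made for this example.

# Illustrative three-module hand gesture pipeline (assumed design, not the paper's code).
import cv2
import numpy as np

# Assumed skin-colour range in HSV; a real system would calibrate per user/lighting.
LOWER_SKIN = np.array([0, 30, 60], dtype=np.uint8)
UPPER_SKIN = np.array([20, 150, 255], dtype=np.uint8)

def segment_hand(frame):
    """Module 1: hand segmentation by skin-colour thresholding plus morphology."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask

def track_hand(mask):
    """Module 2: track the hand as the largest skin-coloured contour (OpenCV 4.x API)."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    return hand if cv2.contourArea(hand) > 3000 else None

def recognise_gesture(contour):
    """Module 3: crude gesture label from convexity defects (finger-count heuristic)."""
    hull = cv2.convexHull(contour, returnPoints=False)
    if hull is None or len(hull) < 4:
        return "unknown"
    try:
        defects = cv2.convexityDefects(contour, hull)
    except cv2.error:
        return "unknown"
    if defects is None:
        return "fist"
    fingers = 0
    for s, e, f, depth in defects[:, 0]:
        start, end, far = contour[s][0], contour[e][0], contour[f][0]
        a = np.linalg.norm(end - start)
        b = np.linalg.norm(far - start)
        c = np.linalg.norm(far - end)
        cos_angle = (b ** 2 + c ** 2 - a ** 2) / (2 * b * c + 1e-6)
        angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
        # Deep, narrow defects correspond to gaps between extended fingers
        # (depth is fixed-point: actual pixels = depth / 256).
        if angle < 90 and depth > 10000:
            fingers += 1
    return {0: "fist", 4: "open palm"}.get(fingers, f"{fingers + 1} fingers")

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = segment_hand(frame)
        hand = track_hand(mask)
        label = recognise_gesture(hand) if hand is not None else "no hand"
        if hand is not None:
            cv2.drawContours(frame, [hand], -1, (0, 255, 0), 2)
        cv2.putText(frame, label, (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
        cv2.imshow("gesture demo", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()

The modular split mirrors the abstract's description: each stage can be swapped independently, for example replacing the colour-based segmentation with Haar-like-feature detection or the contour tracker with CamShift, without touching the recognition logic.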
Keywords: Real time, gesture recognition, human computer interaction, tracking
Short address: https://sciup.org/15010256
IDR: 15010256