International Journal of Engineering Trends and Technology

Research Article | Open Access

Volume 45 | Number 4 | Year 2017 | Article Id. IJETT-V45P277 | DOI : https://doi.org/10.14445/22315381/IJETT-V45P277

Digital Art using Hand Gesture Control with IOT


Dr. P. Gnanasundari, K. Jawahar, J. Elango, M. Anburaj, K. Harish

Citation :

Dr. P. Gnanasundari, K. Jawahar, J. Elango, M. Anburaj, K. Harish, "Digital Art using Hand Gesture Control with IOT," International Journal of Engineering Trends and Technology (IJETT), vol. 45, no. 4, pp. 412-415, 2017. Crossref, https://doi.org/10.14445/22315381/IJETT-V45P277

Abstract

This paper addresses the needs of people who are physically challenged as well as those who are deaf and mute. Its major concern is to connect them to the real world with dignity. The system is based on Human Computer Interaction, in which the user's sign language is translated into normal communication so that they can interact with the people around them. The hand is one of the most important parts of the body and the one most frequently used for interaction in the digital world. Glove-based hand-tracking systems were initially used in virtual reality and gaming; here, the same hand glove approach allows an individual to connect with other people and communicate their needs effectively.
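
The sketch below is a minimal illustration (not the authors' implementation) of the idea described in the abstract: raw readings from the glove's flex sensors are matched against a small gesture table, and the recognized phrase is forwarded to a listener that stands in for the IoT uplink. The gesture table, the ADC threshold, and the publish step are all assumptions introduced here for illustration.

# Minimal illustrative sketch, assuming a five-finger flex-sensor glove:
# map sensor readings to sign-language phrases, then forward the result
# over an assumed IoT link.

from typing import Dict, Tuple

# Hypothetical gesture table: each entry maps a pattern of bent fingers
# (1 = finger flexed, 0 = finger straight) to a spoken-language phrase.
GESTURE_TABLE: Dict[Tuple[int, ...], str] = {
    (1, 1, 1, 1, 1): "Hello",
    (0, 1, 1, 1, 1): "I need water",
    (1, 0, 0, 0, 0): "Thank you",
}

FLEX_THRESHOLD = 512  # assumed 10-bit ADC midpoint separating straight from flexed


def classify(adc_readings: Tuple[int, ...]) -> str:
    """Convert five raw flex-sensor ADC values into a phrase, if known."""
    pattern = tuple(1 if value > FLEX_THRESHOLD else 0 for value in adc_readings)
    return GESTURE_TABLE.get(pattern, "<unrecognized gesture>")


def publish(phrase: str) -> None:
    """Stand-in for the IoT uplink (e.g., an MQTT publish in a real glove)."""
    print(f"sending to caregiver device: {phrase}")


if __name__ == "__main__":
    # Simulated readings; on real hardware these would be sampled from the glove's ADC.
    sample = (700, 650, 720, 690, 810)
    publish(classify(sample))

On real hardware the simulated readings would be replaced by values sampled from the glove's sensors, and publish() would hand the phrase to an IoT transport such as MQTT so that a nearby device can display or speak it.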

Keywords

Human Machine Interaction, Virtual Reality, Sign Language, Hand Glove.
