Sixth Sense Device

August 28, 2017 | Author: Sayyad Muzffarali | Category: Image Processing

Current Trends in Signal Processing, Volume 3, Issue 1, ISSN: 2277-6176

M. A. Sayyad*, Shitalkumar Pawar, Pritish M. Vibhute
SRES College of Engineering, Kopergaon, Dist. Ahmadnagar-423601, India

Abstract
Development of user interfaces influences changes in human-computer interaction (HCI). Despite tremendous development in input devices, many people still find interaction with computers an uncomfortable experience. Efforts should be made to adapt computers to our natural means of communication: speech and body language. Human hand gestures are a widely used mode of nonverbal interaction. The naturalness and intuitiveness of hand gestures have been great motivating factors for researchers in the area of HCI to develop more promising means of interaction between humans and computers. The aim of this paper is to propose a real-time vision system for visual interaction with computer environments through hand gesture recognition, using general-purpose hardware and low-cost sensors, like a simple computer and a USB web camera, so that any user can make use of it in the office or at home. The gesture recognition system uses image processing techniques for detection, segmentation, tracking and recognition of hand gestures, converting them to meaningful commands. The interface proposed here can be applied to different applications, like opening the system calculator, watch, image browser, games, etc.

Keywords: Human computer interaction, segmentation, gesture recognition

*Author for Correspondence
E-mail: [email protected], [email protected]

INTRODUCTION
At present, computing provides advanced active environments in which humans want different types of interfaces for interacting with media and information without any physical restrictions [1]. The evolution of the computer, from supercomputers to desktops to palmtops, has made the presence of human computer interaction felt in every walk of life. The keyboard, mouse, etc., lack the sensitivity desired in many applications. Eventually, researchers working in the area of human computer interaction placed a common emphasis on designing and developing user interfaces capable of fulfilling the performance criteria desired in the active environment. To achieve immersion within an active environment, the user must be capable of interacting effectively with the dynamic world. The term effectively here explicitly indicates the capacity to minimize the cognitive load and maximize goal success [2]. To accomplish this, a successful 3D user interface should be natural, intuitive, powerful and easy to learn, allowing users to accomplish the necessary tasks. The present contact-based devices, like

accelerometers, data gloves [3], sensors/actuators and other input devices, are used to capture user movement and to control the selection, manipulation and movement of objects in virtual scenes. The use of these devices is hindered by a number of factors: they are slow, unintuitive, rigid, and prone to distortion from the physical environment. Their cost is also prohibitive for frequent use by the general user. Hence, they are mostly used by highly trained professionals, like surgeons and ace pilots, to train and carry out their operations in a virtual environment. The primary objective of the present system is to provide a natural, device-free interface that recognizes hand gestures as commands. The paper is organized as follows: Section "RELATED WORK" reviews the related work in the area of gesture recognition for human computer interaction. Section "SYSTEM DESIGN" presents the design of the gesture recognition system. Section "RESULTS AND ANALYSIS" provides the results and analysis

CTSP (2013) 1-10 © STM Journals 2013. All Rights Reserved


of the gesture recognition system. The paper ends with the Conclusions section.

RELATED WORK
To improve interaction qualitatively in an active environment, the means of interaction should be as natural as possible. Human hand gestures can be defined as a set of permutations generated by movements of the hand and arm [4]. Gesture input can be categorized into different categories depending on various characteristics [5]. One category is the deictic gesture, which refers to reaching for or pointing at an object. Accepting or refusing an action for an event is termed a mimetic gesture; it is useful for language representation of gestures. An iconic gesture is a way of describing an object or its features. Reference [6] presents a hand gesture application for gallery browsing based on a 3D depth-data analysis method; it combines global structure information and local texture variation in the designed gesture framework. Reference [7] concluded that

the gestures performed by users must be logically explainable in order to design a good human computer interface. The current technologies for gesture recognition do not yet provide acceptable solutions to the above-mentioned problems. One of the major challenges that has evolved over time is the complexity and robustness associated with the analysis and evaluation of gesture recognition. Different researchers have proposed and implemented different pragmatic techniques for using gestures as input to human computer interfaces. Reference [8] presents a free-hand gesture user interface based on tracking fiducial color markers attached to the user's fingers; the model used for the video presentation is grounded in its decomposition into a sequence of frames, or filmstrip. Reference [9] proposed an interesting technique for real-time tracking of hand gestures captured through a web camera and an Intel Pentium-based personal computer.

Fig. 1: System Block Diagram.

The proposed technique is implemented without any sophisticated image processing algorithms or hardware. Reference [10] designed a tilting interface for remote and quick interactions for controlling directions in an application in a ubiquitous environment. It uses a coin-sized 3D

accelerometer sensor for manipulating the application. Controlling the VLC media player using hand gesture recognition has been done in a real-time environment using vision-based techniques [11]. Reference [12] used contact-based devices, like accelerometer and EMG sensors, for controlling virtual games. There


are several studies on hand movements, especially gestures, that model the human body and create a consolidated body of knowledge. On the basis of that body of knowledge, it is now possible to approach the problem from a mathematical viewpoint [13]. The major drawback of such techniques is that they are very complex and highly sophisticated, making it hard to develop an actionable procedure and the necessary jigs and tools for typical application scenarios. This problem can be overcome by pattern recognition methods with lower hardware and computational overhead [15–17]. The sixth-sense technology was developed by Pranav Mistry, a PhD student in the Fluid Interfaces Group at the MIT Media Lab. The sixth-sense technology has a Web 4.0 view of human and machine interactions. Sixth sense integrates digital information into the physical world and its objects, making the entire world your computer. It can turn any surface into a touch-screen for computing, controlled by simple hand gestures [18, 19].

SYSTEM DESIGN
Presently available sixth-sense-technology-based systems are designed with heavy and costly projectors. The projector needs a bright, flat surface, like a wall, which makes it unsuitable for outdoor use. These systems use low-computational-capacity devices for processing images, which limits the scope of real-time processing. The present approaches concentrate on outsourcing the processing job to separate computing systems, but this requires high-speed connectivity between the mobile device and the computing system. Communication also adds a significant amount of delay to the processing, which makes it unsuitable for real-time applications.

Hand gesture image → Web/USB camera → Processing unit (computer) → Output to open application

Fig. 2: System Algorithm.

The procedure starts with the acquisition phase. The system therefore has to yield good performance even under poor illumination, with cheap cameras, and under varying environmental light. The system also has to work within real-time constraints; hence, the procedure limits the parameters to the minimum possible by removing unnecessary information at the first instance. Once the initialization phase is over, the image is used for further processing. As the hand is tracked, a contour is mapped to the corresponding hand, from which a convex hull (and its area) is extracted.
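The contour-to-hull step above can be sketched with a standard algorithm. This is not the authors' MATLAB code: the snippet below uses Andrew's monotone chain to build the convex hull of a point set and the shoelace formula for its area, and the "contour" is a hypothetical toy point set standing in for a real tracked hand contour.

```python
def convex_hull(points):
    """Andrew's monotone chain: return hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Drop the last point of each half: it repeats the start of the other half.
    return lower[:-1] + upper[:-1]

def hull_area(hull):
    """Shoelace formula for the area of a simple polygon."""
    a = 0.0
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        a += x1 * y2 - x2 * y1
    return abs(a) / 2.0

# Toy "contour": corners of a unit square plus an interior point.
contour = [(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)]
hull = convex_hull(contour)
print(len(hull), hull_area(hull))  # 4 1.0 (interior point is discarded)
```

In the real system, the hull area (and the convexity defects between hull and contour, mentioned later) would be computed from the segmented hand region rather than a hand-written point list.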

The gesture recognition system is designed using an integrated approach for hand gesture recognition; it recognizes both static and dynamic hand gestures. Figure 1 shows the system block diagram used for designing the active user interface. Once the image of the hand is captured from the camera, it is processed through the following phases/algorithms.


Fig. 3: Hand Gesture Vocabulary.


Figure 3 shows some hand gestures along with their assigned commands (functions) that are used in the complex environment. Though the gestures have to be mapped to commands in specific applications, the same gesture vocabulary can be reused to map different sets of commands for different ranges of applications, like controlling the system calculator, watch, etc. This makes the gesture recognition system more generalized and adaptive towards human computer interaction. Figure 4 shows the intermediate result of the hand obtained while designing the gesture recognition system.
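The vocabulary-reuse idea can be sketched as a simple lookup layer between the recognizer and the active application. The gesture names and command sets below are hypothetical stand-ins for the Fig. 3 vocabulary, not taken from the paper:

```python
# Hypothetical gesture names standing in for the Fig. 3 vocabulary;
# the same gestures map to different commands per application.
COMMAND_MAPS = {
    "calculator":    {"open_palm": "open", "fist": "clear", "two_fingers": "equals"},
    "image_browser": {"open_palm": "open", "fist": "close", "two_fingers": "next"},
}

def to_command(application, gesture):
    """Map a recognized gesture to the active application's command."""
    return COMMAND_MAPS[application].get(gesture, "unknown")

print(to_command("calculator", "fist"))     # clear
print(to_command("image_browser", "fist"))  # close
```

Keeping this mapping outside the recognizer is what lets one trained vocabulary drive the calculator, watch, image browser, and so on.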

Fig. 4: Intermediate Results of Hand Gesture.

The recognition is done by modeling the hand and matching it against the vocabulary database for different functionalities, like opening the calculator, watch, etc. Afterwards, the system tracks the number of defects generated by the hand and maps them to a meaningful command. In this technique, we use the correlation between the input image and the database images; the maximum correlation value is used to map the input to the proper command.

The calculator hand gesture opens the system calculator, as shown in Figure 6. The user may then operate the different commands using the corresponding hand gestures.

The correlation coefficient r between A and B, where A and B are matrices or vectors of the same size, is computed as the 2-D correlation coefficient:

r = \frac{\sum_{m}\sum_{n}\left(A_{mn}-\bar{A}\right)\left(B_{mn}-\bar{B}\right)}{\sqrt{\left(\sum_{m}\sum_{n}\left(A_{mn}-\bar{A}\right)^{2}\right)\left(\sum_{m}\sum_{n}\left(B_{mn}-\bar{B}\right)^{2}\right)}}

where \bar{A} = mean2(A) and \bar{B} = mean2(B).
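The formula above is the 2-D correlation coefficient that MATLAB exposes as corr2. A minimal NumPy port, term by term with the equation (this translation is an illustration, not the authors' MATLAB code):

```python
import numpy as np

def corr2(A, B):
    """2-D correlation coefficient of two same-size matrices."""
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    a = A - A.mean()   # A_mn - mean2(A)
    b = B - B.mean()   # B_mn - mean2(B)
    return (a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum())

A = np.array([[1.0, 2.0], [3.0, 4.0]])
print(corr2(A, A))   # 1.0: identical images correlate perfectly
print(corr2(A, -A))  # -1.0: inverted image
```

Values near +1 indicate a close match between the input gesture and a database template, which is what the command-mapping step relies on.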

mean2(A) and mean2(B) denote the average (mean) of matrices A and B, respectively. The steps to execute the algorithm are as follows:
1. Capture the hand image from the camera.
2. Find the hand region in the captured image.
3. Remove the regions whose area is below the threshold.
4. Convert the image to black and white.
5. Correlate the result with the vocabulary database.
6. Map the maximum correlation value to a meaningful command.
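Steps 3-6 above can be sketched end to end. This is a hedged illustration rather than the authors' MATLAB implementation: the camera frame is an in-memory grayscale array, the vocabulary is a hypothetical dict of binary templates, and the threshold values are arbitrary.

```python
import numpy as np

def remove_small_regions(bw, min_area):
    """Step 3: zero out 4-connected white regions smaller than min_area."""
    bw = bw.copy()
    seen = np.zeros(bw.shape, dtype=bool)
    h, w = bw.shape
    for i in range(h):
        for j in range(w):
            if bw[i, j] and not seen[i, j]:
                seen[i, j] = True
                stack, region = [(i, j)], [(i, j)]
                while stack:                      # flood fill one region
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and bw[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                            region.append((ny, nx))
                if len(region) < min_area:        # too small: treat as noise
                    for y, x in region:
                        bw[y, x] = 0.0
    return bw

def classify(gray, vocabulary, level=0.5, min_area=4):
    bw = (gray > level).astype(float)         # step 4: black and white
    bw = remove_small_regions(bw, min_area)   # step 3: suppress specks
    # Steps 5-6: Pearson r of the flattened images equals the
    # 2-D correlation coefficient; keep the best-scoring template.
    scores = {cmd: float(np.corrcoef(bw.ravel(), t.ravel())[0, 1])
              for cmd, t in vocabulary.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

# Hypothetical 4x4 binary templates for two commands (stand-ins for Fig. 3).
vocab = {
    "calculator": np.array([[1, 1, 0, 0], [1, 1, 0, 0],
                            [0, 0, 0, 0], [0, 0, 0, 0]], dtype=float),
    "watch":      np.array([[0, 0, 1, 1], [0, 0, 1, 1],
                            [0, 0, 0, 0], [0, 0, 0, 0]], dtype=float),
}
# Synthetic "camera frame": bright hand region plus one noise speck.
frame = np.array([[0.9, 0.8, 0.1, 0.1], [0.7, 0.9, 0.1, 0.0],
                  [0.1, 0.0, 0.9, 0.0], [0.0, 0.1, 0.0, 0.0]])
cmd, score = classify(frame, vocab)
print(cmd, round(score, 4))  # calculator 1.0
```

Steps 1-2 (capture and hand-region localization) are omitted here because they depend on the camera and segmentation method; the speck at position (2, 2) shows the small-region removal doing its job.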

Fig. 5: Effect-Added Image Result.

RESULTS AND ANALYSIS
The active user interface is designed using image processing techniques implemented in MATLAB. Practical experiments show that the system performs well in environments with little noise (i.e., few objects whose color is similar to human skin) and with balanced lighting conditions.


Fig. 6: Application of Hand Gestures to Open the System Calculator.


In order to find out the performance and viability of the gesture recognition system, it has been tested in real-life application scenarios such as opening the system calculator and watch. The results obtained during the experiments are encouraging and motivate experimental implementations in a varied domain of applications. From the test results (see Table 1), the calculator gesture is classified as known in most cases. Once the Gaussian noise level rises above 5.5, the system fails to classify the image correctly. The same holds for sharpening: even after repeated sharpening of up to 3 pixels, the system is still able to classify the gesture.

For a different database image, classification also remains robust, as the system classifies all gestures correctly. This happens because the closely held fingers produce a large region, which the system is able to classify correctly. Under motion blur, with linear camera motion of 20 pixels at an angle of 45° in the counter-clockwise direction, the system is still able to classify correctly; the blur filter becomes a vector for horizontal and vertical motions. Similarly, all the hand gesture vocabulary images were tested. Figure 7 shows hand gesture images along with their correlation with the database gesture image vocabulary. Though the maximum correlation value is mapped to commands in a specific application, the gesture vocabulary of six different gesture images can be used to map different sets of commands for different ranges of applications, like controlling the system calculator, watch, etc.

Fig. 7: Input Image and Database Image Correlation Result.

Table 1: Test Results for the Calculator Image.

Sr. No. | Effect added              | Same database | Classified | Correlation factor
1       | Original input image      | Yes           | Correctly  | 0.9187
2       | Sharpen more, 1 pixel     | Yes           | Correctly  | 0.9054
        | Sharpen more, 3 pixel     | Yes           | Correctly  | 0.7422
3       | Gaussian noise, level 5   | Yes           | Correctly  | 0.6138
        | Gaussian noise, level 5.5 | Yes           | Unknown    | 0.5457
4       | Motion blur, 20 pixel     | Yes           | Correctly  | 0.9156
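The noise sensitivity reported in the test results can be probed with a similar experiment: corrupt a reference image with increasing Gaussian noise and report the match as unknown once the correlation drops below a cutoff. The 0.6 cutoff, the noise scales, and the random template here are illustrative assumptions, not the paper's data:

```python
import numpy as np

def corr2(a, b):
    """2-D correlation coefficient of two same-size arrays."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))

rng = np.random.default_rng(0)
template = rng.random((32, 32))   # stand-in for a database gesture image

CUTOFF = 0.6                      # below this correlation, report "unknown"
for sigma in (0.0, 0.2, 0.5, 1.0):
    noisy = template + rng.normal(0.0, sigma, template.shape)
    r = corr2(noisy, template)
    label = "correct" if r >= CUTOFF else "unknown"
    print(f"sigma={sigma:.1f}  r={r:+.4f}  -> {label}")
```

As the noise grows, the correlation decays toward zero and the match eventually falls below the cutoff, mirroring the drop from correct classification at noise level 5 to "Unknown" at level 5.5 in Table 1.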

CONCLUSIONS
Gesture-based interfaces allow humans to interact with computers in a natural and intuitive manner. The most important advantage of hand gesture-based input is that the user can interact with the application from a distance, without any physical contact with the keyboard or mouse. This paper develops a hand gesture recognition system for interacting with different applications, like opening the system calculator, watch, etc., and provides a fruitful solution for a user-friendly interface

between human and computer. The gesture vocabulary designed can be further extended for controlling different applications, such as game control. The system also provides flexibility to users, especially physically challenged users, to define gestures according to their own convenience and ease of use.

ACKNOWLEDGMENTS The authors are thankful to Prof. A. M. Deshpande, lecturer, E & TC Department, for her valuable guidance and motivation for this


work, and also to the other department staff members and their parents for their contribution to the completion of this paper.

REFERENCES
1. Choi Yoo-Joo, Lee Je-Sung, Cho We-Duke. A Robust Hand Recognition in Varying Illumination. In: Pinder S (Ed.). Advances in Human Computer Interaction. 2006.
2. O'Hagan RG, Zelinsky A, Rougeaux S. Visual Gesture Interfaces for Virtual Environments. Interacting with Computers 2002; 14: 231–250p.
3. Karlsson N, Karlsson B, Wide P. A Glove Equipped with Finger Flexion Sensors as a Command Generator Used in a Fuzzy Control System. IEEE Trans. on Instrumentation and Measurement 1998; 1330–1334p.
4. Van Krevelen DWF, Poelman R. A Survey of Augmented Reality Technologies, Applications and Limitations. The International Journal of Virtual Reality 2010; 9(2): 1–20p.
5. Mitra S, Acharya T. Gesture Recognition: A Survey. IEEE Transactions on Systems, Man and Cybernetics (SMC) – Part C: Applications and Reviews 2007; 37(3): 311–324p.
6. Chai Xiujuan, Fang Yikai, Wang Kongqiao. Robust Hand Gesture Analysis and Application in Gallery Browsing. In: Proceedings of ICME. New York: 2009; 938–994p.
7. Pavlovic V, Sharma R, Huang TS. Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review. IEEE Trans. on Pattern Analysis and Machine Intelligence (PAMI) 1997; 19(7): 677–695p.
8. Dias José Miguel Salles, Nande Pedro. Image Manipulation through Gestures. In: Proceedings of AICG'04 2004; 1–8p.
9. Liu N, Lovell B. MMX-Accelerated Real-Time Hand Tracking System. In: Proceedings of IVCNZ 2001.

10. Atia Ayman, Tanaka Jiro. Interaction with Tilting Gestures in Ubiquitous Environments. International Journal of UbiComp (IJU) 2010; 1(3).
11. Rautaray SS, Agrawal A. A Novel Human Computer Interface Based on Hand Gesture Recognition Using Computer Vision Techniques. In: Proceedings of ACM IITM'10 2010; 292–296p.
12. Xu Z, Xiang C, Wen-hui W, et al. Hand Gesture Recognition and Virtual Game Control Based on 3D Accelerometer and EMG Sensors. In: Proceedings of IUI'09 2009; 401–406p.
13. Lee CS, Ghyme SW, Park CJ. The Control of Avatar Motion Using Hand Gesture. In: Proceedings of Virtual Reality Software and Technology (VRST) 1998; 59–65p.
14. Lienhart R, Maydt J. An Extended Set of Haar-like Features for Rapid Object Detection. In: Proceedings of ICIP'02 2002; 900–903p.
15. Messom CH, Barczak ALC. Fast and Efficient Rotated Haar-like Features Using Rotated Integral Images. In: Proceedings of the Australian Conference on Robotics and Automation (ACRA 2006) 2006; 1–6p.
16. Bradski GR. Computer Vision Face Tracking for Use in a Perceptual User Interface. Intel Technology Journal 1998; Q2: 1–15p.
17. http://www.ted.com/talks/pranav_mistry_the_thrilling_potential_of_sixthsense_technology.htm
18. Mistry P, Maes P, Chang L. WUW – Wear Ur World – A Wearable Gestural Interface. MIT Media Lab; ACM 978-1-60558-246-7/09/04.
19. Mistry P, Maes P. SixthSense – A Wearable Gestural Interface. In: Proceedings of SIGGRAPH Asia 2009 Sketches. Yokohama, Japan: 2009.

