Surveyreport 1

survey report of virtual trial room

Uploaded by

Sudharshan Dinnu

HAND GESTURE RECOGNITION SYSTEM FOR DUMB AND DEAF PEOPLE

Abstract - Hand gesture recognition is one of the most successful techniques developed in the last few years; the reason behind this success is its ability to let humans cooperate with machines. Body gestures are considered the most common way of communicating between a human and a personal computer in a virtual world. The goal is to develop an effective algorithm that can recognize, interpret, process, and simulate human affect. Advanced computing has the ability to humanize digital interactions. This paper introduces an application that uses Computer Vision (CV) to recognize hand gestures: a camera takes live footage, the system analyses the gestures, and on the basis of that analysis it decides which gesture was shown.

Key Words: Hand Gesture, OpenCV, MediaPipe, Hand Landmarks.

1. INTRODUCTION

In a country like India, about 40 million people are visually impaired and about 2.42 million are deaf and unable to speak. Vision and voice are the main problems for these people: they have only one medium to connect with society, which is sign language. Society, in turn, has to understand their sign language, and sometimes that is hard. We have therefore designed a system which may help society understand sign language. The system consists of a computer and a camera, and its algorithm is able to recognize sign language. Image processing is used to recognize the signs, and the complete algorithm has been developed in the Python programming language, one of the best languages for developing complex algorithms.

1.1 Problem Statement

The language of communication for deaf and dumb


people is sign language. Most of these physically
impaired people are dependent on sign language
translators to express their thoughts to rest of the
world which may make them feel uncomfortable.
This causes these people to isolate from society.
Hence, Sign language recognition is very
important. A sign language is made of various
actions formed by physical movement of body parts
i.e. hand, arms and facial expressions.

1.2. SOLUTIONS

The purpose is the use of hand gestures for the recognition of Indian sign language. A hand gesture recognition system is a user-friendly way of interacting with a computer or any computerized machine, and it is familiar and easy to use for human beings. Communication with deaf and visually impaired persons happens only through their expressions and their hand gestures, so we may design a system which is able to communicate, or act as a medium, between them and society.

2. BLOCK DIAGRAM

Fig 1: Block Diagram

The figure above shows the block diagram of the proposed system. In this system we have used a computer to recognize hand gestures. The input to the computer is given through a camera, which can be the system's built-in camera or an external USB camera. Human hand gestures are the input given to the camera; these gestures are recognized by applying image processing to the input, for which we have used Python machine learning algorithms.
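As a rough, illustrative sketch of this flow (camera frames in, image processing, a recognized gesture out), the stages can be wired together in plain Python. The contrast-stretch `preprocess` step and the brightness-based `recognize_gesture` stand-in are assumptions made for the example, not the paper's actual algorithm.

```python
# Minimal sketch of the block diagram's flow: frames from a camera
# are preprocessed and then passed to a recognizer that returns a
# gesture label. The recognizer here is a deliberately naive stand-in.
from typing import Callable, Iterable, List

Frame = List[List[int]]  # a grayscale image as rows of pixel values


def preprocess(frame: Frame) -> Frame:
    """Stretch pixel values to the full 0-255 range (simple contrast fix)."""
    flat = [p for row in frame for p in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1
    return [[(p - lo) * 255 // span for p in row] for row in frame]


def recognize_gesture(frame: Frame) -> str:
    """Stand-in classifier: decides by mean brightness only."""
    flat = [p for row in frame for p in row]
    return "open_hand" if sum(flat) / len(flat) > 127 else "fist"


def run_pipeline(frames: Iterable[Frame],
                 recognize: Callable[[Frame], str]) -> List[str]:
    """Camera -> preprocessing -> recognition, one label per frame."""
    return [recognize(preprocess(f)) for f in frames]
```

In the real system the frames would come from the webcam (for example via OpenCV's `cv2.VideoCapture`) and `recognize_gesture` would be replaced by the actual model.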
3. FLOW CHART

Here in the system, the whole algorithm has been developed using the Python programming language. Python is a high-level, interpreted language which is widely used to develop complex machine learning and deep learning algorithms through its various packages. In this system we have used the Python packages below to develop the algorithm.

4. USED PROGRAMMING LANGUAGE

4.1 OPENCV

OpenCV (the Open Source Computer Vision Library) is a computer vision library that is dominant for image processing. It is mainly used for real-time computer vision and image processing, and its core is written in C++. OpenCV was first released as an open-source library in 2000.

4.2 MEDIAPIPE

MediaPipe is a Python library designed as a machine learning package with models pre-trained in TensorFlow. It can form the basis for sign language understanding and hand gesture control, and can also enable the overlay of digital content and information on top of the physical world in augmented reality. MediaPipe is a good fit for hand and finger tracking: it employs deep learning to infer 21 3D landmarks on a hand. MediaPipe Hands utilizes an ML pipeline consisting of multiple models working together, starting with a palm detection model that operates on the full image and returns an oriented hand bounding box.

4.3 PyQt5

PyQt5 is a Python package used to build GUI applications; its QtWidgets module contains the classes used for GUI design.

5. RELATED WORK

Several research works have been done in the area of word prediction using hand gesture recognition. For instance, Guo and Wang (2017) proposed a novel method of using hand gesture recognition to predict words for individuals. Their study utilized a convolutional neural network (CNN) to classify hand gestures and a Hidden Markov Model (HMM) to predict words, and the proposed method achieved high accuracy and efficiency in predicting words.

Another research work in this field was carried out by Pei et al. (2018), who proposed a real-time hand gesture recognition system that can predict words. Their system employed a deep learning-based framework trained on a large dataset of hand gestures, and achieved an accuracy of 97.23% with a response time of 0.12 seconds.

Similarly, Jiang and Li (2018) proposed a hand gesture recognition system that uses machine learning algorithms to predict words for individuals. Their system used a combination of hand feature extraction and gesture classification techniques to recognize hand gestures, and a support vector machine (SVM) to predict words; their proposed system achieved an accuracy of 95.2%.
Furthermore, Zhang et al. (2019) proposed a system that can recognize hand gestures and translate them into spoken words for the deaf. Their system utilized a combination of machine learning algorithms, including HMM and SVM, to recognize hand gestures and translate them into spoken words, and achieved an accuracy of 92.7%.

In addition, Wang and Chen (2020) proposed a system that can recognize hand gestures and translate them into written words for the deaf. Their system utilized a convolutional neural network (CNN) to recognize hand gestures and a deep learning-based model to translate them into written words, achieving an accuracy of 96.78%.
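Building on the MediaPipe hand landmarks described in Section 4.2, even a simple heuristic over the 21 points can separate coarse gestures. The sketch below counts extended fingers by comparing each fingertip with its PIP joint; the landmark indices follow MediaPipe's convention, but the heuristic itself is an illustrative assumption, not the classifier used in any of the systems surveyed above.

```python
# Count extended fingers from 21 hand landmarks, using MediaPipe's
# index convention: 8/12/16/20 are the fingertips and 6/10/14/18 the
# corresponding PIP joints. Coordinates are (x, y) in image space,
# so y grows downward.
FINGER_TIPS = (8, 12, 16, 20)
FINGER_PIPS = (6, 10, 14, 18)


def count_extended_fingers(landmarks):
    """landmarks: a sequence of 21 (x, y) tuples for one detected hand."""
    if len(landmarks) != 21:
        raise ValueError("expected 21 hand landmarks")
    count = 0
    for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
        # A finger is treated as extended when its tip lies above
        # its PIP joint in the image.
        if landmarks[tip][1] < landmarks[pip][1]:
            count += 1
    return count
```

In a live pipeline the tuples would come from something like MediaPipe's `mp.solutions.hands.Hands().process(frame)`, whose `multi_hand_landmarks` entries expose normalized `x` and `y` values per landmark; the thumb is ignored here because it bends sideways and needs a different test.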

6. DATA FLOW DIAGRAM

7. IMAGE CAPTURING

Image acquisition is the basic process in the project. An integrated or external web camera is used to capture the hand gestures, and these images are used in the further processes of the system. The image is captured using the Image Processing Toolbox in MATLAB. Before starting the programming, we should get the information about the camera that is connected to the computer; for that, the 'imaqhwinfo' (image acquisition hardware information) command is used. Each adapter may have several devices connected to it, so to correctly detect the camera we should get the id of the device by typing 'imaqhwinfo('winvideo')' in the MATLAB command window. The frames per trigger and the frame grab interval are specified in the program; when the specified frame is acquired, the image is captured with the webcam. The software code and supporting tools used are based on the leading software in the field, MATLAB and the Image Processing Toolbox from The MathWorks. MATLAB is a high-performance language for technical computing which is very popular nowadays; it integrates computation, visualization and programming in an easy-to-use environment where problems and solutions can be expressed in mathematical notation. The basic commands used are given in Table 1.

TABLE 1: The Basic Commands Used

COMMANDS    FUNCTIONS
Imread      To read an image
Imshow      To display an image
Imaqreset   To reset the camera
rgb2gray    To convert a colour image to a grayscale image
gray2rgb    To convert a grayscale image to a colour image

8. TESTING AND TRAINING

The matrices of the captured gestures (the captured gestures are converted into matrix format) are compared with the matrices in the database by a correlation operation. The image processing consists of two main steps: training and testing. The training step deals with database creation. The image of the captured gesture is pre-processed by changing the brightness, contrast, sharpness, etc. After that, a feature is extracted from the image; here the extracted feature is the RGB colour space values of the glove worn in the image. The image in matrix format is then loaded into the database.
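The comparison by correlation described above can be sketched as follows: a captured gesture matrix is scored against every template in the database with Pearson correlation, and the best-scoring label wins. The 2x2 templates used as examples are made-up placeholders for real gesture matrices.

```python
# Match a captured gesture matrix against a database of templates
# using Pearson correlation, as in the testing step described above.
import math


def correlation(a, b):
    """Pearson correlation between two equally sized matrices."""
    xs = [p for row in a for p in row]
    ys = [p for row in b for p in row]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0 or sy == 0:
        return 0.0  # a flat matrix carries no pattern to correlate
    return cov / (sx * sy)


def best_match(captured, database):
    """database: dict mapping a gesture label to its template matrix."""
    return max(database, key=lambda label: correlation(captured, database[label]))
```

For example, a captured matrix of `[[10, 240], [250, 5]]` correlates far more strongly with the template `[[0, 255], [255, 0]]` than with its inverse, so that template's label is returned.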
Likewise, all the gestures are loaded into the database. The testing step also includes image acquisition, pre-processing and feature extraction. After the feature is extracted, the obtained matrix is compared with those in the database using the correlation operation. The technology used here is more complicated than the existing ones, but it can ensure more accuracy than the others.

9. RESULT

10. CONCLUSION

In this paper we have implemented hand gesture recognition using Python. The language of communication for deaf and dumb people is sign language, and most of these physically impaired people depend on sign language translators to express their thoughts to the rest of the world, which may make them feel uncomfortable. So we have developed an algorithm which can recognize the body language and gestures of deaf and dumb people.

REFERENCES:

[1] https://google.github.io/mediapipe/solutions/hands
