
Anjuman-I-Islam’s

M.H.Saboo Siddik Polytechnic


8, M.H.Saboo Siddik Polytechnic Road, Mumbai 400008

FINAL YEAR DIPLOMA IN COMPUTER ENGINEERING

(2020-2021)

PROJECT REPORT ON

VIRTUAL MOUSE USING HAND GESTURE

BY

220443-SAKSHI RALE

220446-AAFIYA SAYYED

230483-MAHEK FODKAR

UNDER THE GUIDANCE OF

MS. ZAIBUNNISA MALIK

Maharashtra State Board of Technical Education (MSBTE)

Mumbai (Autonomous) (ISO 9001:2008) (ISO/IEC 27001:2005)
Anjuman-I-Islam’s
M.H.Saboo Siddik Polytechnic
8, M.H.Saboo Siddik Polytechnic Road, Mumbai 400008

Certificate

This is to certify that Mr./Ms. SAKSHI NAVANATH RALE from Computer

Engineering Department of M. H. Saboo Siddik Polytechnic, Mumbai having

Enrollment No. 2200020350 has completed Final Project Report having Title

VIRTUAL MOUSE USING HAND GESTURE during the academic year 2024 –

2025 in a group consisting of 3 persons under the guidance of Faculty Guide

Ms. Zaibunnisa Malik & Co-Guide Ms. Munira Ansari.

Place: Mumbai              Sign of Guide: _____________

Date: ____________         Sign of HOD: _____________

INDEX

Sr. No.   Title                                   Marks   Obtained Marks   Faculty Sign with Date
1.        Problem Identification                  02
2.        Industrial Survey & Literature Review   02
3.        Project Proposal                        03
4.        Execution of Plan                       02
5.        Final Project Report                    06
6.        Project Log Book                        02
7.        Project Portfolio                       04
8.        Presentation & Defence                  04
          Total                                   25
Project Report
Acknowledgement

It is our esteemed pleasure to present the project report on

“VIRTUAL MOUSE USING HAND GESTURE”

We would first like to thank our Principal (I/c), Head of the Department and Guide, Ms. Zaibunnisa Malik, for encouraging and motivating us with her guidance and total support for our work. We would also like to thank Ms. Munira Ansari for serving as our co-guide and making our path much simpler.

We also thank all the teachers who constantly motivated us and shared the precious knowledge, both procedural and technical, needed to carry out a project.

We would also like to thank our Principal, Mr. A. K. Qureshi, for providing us this opportunity to build our own project and for constantly supporting us throughout the process.

Finally, it is a pleasure to thank all the staff, teaching and non-teaching, who always stood by us and made sure no problem blocked our way.
Abstract

This project explores the development of a virtual mouse controlled by hand gestures, aiming to
provide an innovative way for users to interact with their computers. Traditional mice can be
challenging for some users, particularly those with physical disabilities. By utilizing advanced
sensors and cameras, our system captures hand movements and translates them into
corresponding mouse actions, such as clicking, dragging, and scrolling.
The project involves creating an intuitive interface that recognizes specific gestures, ensuring
smooth and accurate control. Users can navigate applications, websites, and other digital content
without needing a physical mouse, promoting accessibility and ease of use.
This technology not only enhances user experience but also encourages hands-free interaction,
making it suitable for various environments, including homes, offices, and educational settings.
Ultimately, our goal is to make computing more inclusive and efficient, enabling everyone to
engage with technology seamlessly.

Table of Contents

Sr. No.   Chapter

1.   Introduction and Background
     1.1. Introduction
     1.2. Background
     1.3. Motivation
     1.4. Problem Statement
     1.5. Objective and Scope
     1.6. Advantages
     1.7. Disadvantages
     1.8. Limitations
     1.9. Conclusion

2.   Literature Survey
     2.1. Introduction
     2.2. Research Papers
     2.3. References
     2.4. Conclusion

3.   Proposed Methodology
     3.1. System Design
          3.1.1. Introduction
          3.1.2. Block Diagram
          3.1.3. System Architecture Diagram
          3.1.4. Data Flow Diagram
          3.1.5. Software Design Approach
     3.2. Timeline Chart
     3.3. Gantt Chart
     3.4. Conclusion
Chapter 1: Introduction and Background

Content:

1.1. Introduction
1.2. Background
1.3. Motivation
1.4. Problem Statement
1.5. Objective and Scope
1.6. Advantages
1.7. Disadvantages
1.8. Limitations
1.9. Conclusion

1.1. Introduction

In today’s tech-driven world, it’s important to create easy and hands-free ways to control digital
devices. The project "Virtual Mouse Using Hand Gesture" aims to let users control a computer’s
cursor just by using hand gestures. This method removes the need for traditional devices and
makes it easier for users to interact with technology in different settings.

Instead of using physical mice or touchpads, the Virtual Mouse project uses image recognition to
detect and understand hand movements in real time. By recognizing specific gestures, the system
can perform common tasks like moving the cursor, clicking, and scrolling, allowing for a
smooth, contactless experience.
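To make the pipeline concrete, the sketch below shows one way the recognition step can map detected hand landmarks to mouse actions. It assumes MediaPipe-style 21-point landmarks given as (x, y) pairs; the index numbering follows the MediaPipe Hands convention, but the gesture-to-action mapping itself is an illustrative assumption, not this project's exact scheme.

```python
# Sketch of the gesture-classification step, assuming MediaPipe-style
# 21-point hand landmarks given as (x, y) pairs with y increasing downward.
# Indices follow the MediaPipe Hands convention: fingertips at 8 (index),
# 12 (middle), 16 (ring), 20 (pinky); each PIP joint sits two indices lower.

FINGER_TIPS = [8, 12, 16, 20]

def fingers_up(landmarks):
    """Return one boolean per finger (index..pinky): True when the tip
    is above its PIP joint (smaller y = higher in the image)."""
    return [landmarks[tip][1] < landmarks[tip - 2][1] for tip in FINGER_TIPS]

def classify_gesture(landmarks):
    """Map a raised-finger pattern to a hypothetical mouse action."""
    up = fingers_up(landmarks)
    if up == [True, False, False, False]:
        return "move"    # index only: move the cursor
    if up == [True, True, False, False]:
        return "click"   # index + middle: left click
    if all(up):
        return "scroll"  # open hand: scroll
    return "idle"        # anything else: do nothing
```

In a full system the landmarks would come from a webcam loop and a hand-tracking library, and the returned action would be forwarded to a cursor-control API.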

1.2. Background

The "Virtual Mouse Using Hand Gesture" project responds to the need for new and accessible
ways to interact with technology. Traditional devices, like mice and keyboards, can be hard for
some users to use, especially in places that require touch-free operation, like hospitals or smart
homes.This project uses gesture recognition to create a hands-free alternative to traditional
devices. By translating hand movements into cursor actions, users can navigate their computers
without touching anything. As technology moves towards more user-friendly designs, this
project offers a modern solution for more inclusive tech experiences.

1.3. Motivation

The motivation behind the "Virtual Mouse Using Hand Gesture" project is to provide more
accessible and intuitive ways for people to interact with computers. As demand for touchless
technology grows, there’s a need for alternatives to standard input devices, especially for users
with mobility issues or in situations where contact should be minimized.

This project was inspired by observing how users struggle in certain settings, like healthcare,
where cleanliness is crucial, or for those who need special technology. By creating a hands-free
interface that uses simple gestures, the project aims to improve user experience and promote
accessibility.

1.4. Problem Statement

Traditional devices like mice and keyboards can be hard to use without hands. In places where
touch-free interaction is necessary—like sterile environments, for people with mobility
challenges, or in smart homes—traditional methods may not work well. Also, standard devices
often lack the adaptive interaction that users expect today.

The "Virtual Mouse Using Hand Gesture" project aims to solve these problems by offering a
gesture-based interface that allows users to control a computer's cursor easily. By using gesture
recognition technology, this project provides a touch-free experience that meets the needs of
various users and settings, improving accessibility and usability.

1.5. Objective & Scope

Objectives:

• To study existing methods for recognizing hand gestures.

• To gather and analyze data on common hand gestures for cursor actions (such as moving, clicking, and scrolling).

• To use machine learning to interpret hand gestures accurately.

• To create a virtual mouse system that converts hand gestures into real-time cursor movements.

Scope:

This project will explore new ways to interact through gesture recognition and real-time processing. It aims to create a hands-free, user-friendly tool for controlling digital devices, overcoming the limitations of traditional input methods. The virtual mouse system is designed for various uses, from helping people with limited mobility to providing touch-free control in clean or specialized environments, leading to a more adaptable and inclusive tech experience.
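To illustrate the last objective, converting gestures to real-time cursor movement usually reduces to two small pieces of geometry: mapping the fingertip from camera coordinates to screen coordinates, and smoothing the result to suppress jitter. A minimal sketch follows; the frame, screen, and margin sizes are example values, not the project's actual parameters.

```python
# Illustrative geometry for gesture-driven cursor movement (example sizes,
# not the project's actual parameters).

def map_to_screen(x, y, frame_w=640, frame_h=480,
                  screen_w=1920, screen_h=1080, margin=100):
    """Linearly map a fingertip position from an inner camera region to the
    full screen. The margin shrinks the active region so the fingertip can
    reach the screen edges without leaving the camera's field of view."""
    sx = (x - margin) * screen_w / (frame_w - 2 * margin)
    sy = (y - margin) * screen_h / (frame_h - 2 * margin)
    # Clamp to the screen bounds.
    return (min(max(sx, 0), screen_w), min(max(sy, 0), screen_h))

def smooth(prev, target, factor=0.2):
    """Exponential smoothing: move a fraction of the way toward the
    target each frame so hand jitter does not shake the cursor."""
    return (prev[0] + (target[0] - prev[0]) * factor,
            prev[1] + (target[1] - prev[1]) * factor)
```

A per-frame loop would call `map_to_screen` on the detected fingertip and then `smooth` against the previous cursor position before moving the cursor.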
1.6. Advantages

• Enhanced Accessibility: Makes it easier for users with limited mobility to use a touch-free interface.

• Engaging User Experience: Offers an interactive tool that appeals to tech-savvy users.

• Contactless Interaction: Promotes hygiene by eliminating physical contact, especially useful in public spaces.

• Faster Navigation: Allows quick, intuitive gestures for smoother navigation and improved efficiency.

• Inclusive Design: Supports a wide range of users, including younger students and those with different accessibility needs.

• Innovative Appeal: Provides a modern approach that stands out from traditional input devices.
1.7. Disadvantages

• Lack of Adaptation for Unique Circumstances: Some users may have specific needs that the gesture system may not effectively meet.

• User Device and Connectivity Constraints: Not all users have access to the necessary devices or reliable internet, limiting the technology's reach.

• Data Quality and Bias: The system's performance depends on the quality of training data; biased or insufficient data can lead to inaccuracies.

• Lack of Human Element: A virtual mouse may miss the personal interaction that can enhance user experience.

• Inflexibility: The system may not cater to unique user preferences, reducing its usefulness for some.

• Dependency on Technology: Users might rely too much on gesture technology and struggle with traditional input methods later on.

1.8. Limitations

• Not for Everyone: Some users, particularly those with disabilities, may find gesture systems difficult to use.

• Device and Internet Issues: Users may lack the necessary devices or stable internet access, hindering their ability to use the technology.

• Data Problems: The system functions best with high-quality data. If the data is flawed or biased, gesture recognition may suffer.

• Missing Personal Touch: Technology may lack the personal connection that comes from talking to a real person, which is important for understanding user needs.

• Rigidity: Some users may have preferences that the gesture system cannot adapt to easily.

• Too Much Dependence: Users might become overly reliant on the technology, making traditional methods harder to use later.

1.9. Conclusion

The virtual mouse project using hand gestures is an important step forward in how we interact
with computers. However, it’s important to recognize its limitations, including accuracy issues
with gesture recognition, privacy concerns about data collection, and the absence of personal
interaction.

To make the most of this project, it should be seen as one tool among many for controlling a
computer. The human aspect remains crucial, especially in understanding individual user needs.
Balancing tech-driven interactions with personalized support is vital for a good user experience.

Continuous improvements to address these limitations will be essential for the project’s success
and its ability to meet the varied needs of users in our fast-evolving tech landscape.

Chapter 2: Literature Survey

Content:

2.1. Introduction
2.2. Research Papers
2.3. References
2.4. Conclusion

2.1. Introduction

This chapter surveys existing research on hand-gesture recognition and gesture-controlled virtual mouse systems. For each paper, the approach, key results, and reported limitations are summarized, followed by the references and a short conclusion.

2.2. Research Papers

Paper Title 1: Volume Control feature for gesture recognition in Augmented and Virtual reality
applications
Author: Shruti Kansal
Published in: IEEE 2023
Abstract: This paper presents a gesture-based volume control system designed specifically for
Virtual and Augmented Reality applications. By utilizing custom gestures for interaction, the
system demonstrates effective recognition and control of volume settings in real-time. While the
results show promise, the study identifies limitations in gesture vocabulary and contextual
challenges during varied user scenarios. Future work aims to broaden the range of gestures and
explore additional applications within VR/AR environments.

Paper Title 2: Hand Gesture Recognition using Shape-based Image Features for Music
Controller
Author: Chitra R., Allam Prathyusha Reddy, Anusha Bamini A.M, Naru Kezia
Published in: IEEE (ICOSEC) 2022
Abstract: This research develops a real-time hand gesture recognition system tailored for
controlling music playback functions. By detecting key shape-based features such as finger
orientation and posture, the system enables users to interact with music players seamlessly.
Despite its effectiveness in recognizing specific gestures, the system encounters challenges in
various environments, particularly with background noise and lighting conditions. Future
directions include expanding the gesture set to accommodate diverse user interactions and
improving robustness across different settings.

Paper Title 3: Volume Control feature for gesture recognition in Augmented and Virtual reality
applications
Author: Shruti Kansal, H Venkateswarareddy, Shanmugasundaram Hariharan, Sasi Kala Rani
K, Andraju Bhanu Prasad, Ponmalar A
Published in: IEEE (InC4) 2023
Abstract: This paper explores advanced gesture recognition techniques focused on volume
control using palm detection and hand tracking within VR/AR contexts. The implementation
utilizes OpenCV, Pycaw, and MediaPipe libraries, showcasing promising real-time performance.
However, the study reveals challenges related to noise sensitivity in video streams and the lack

of standardized gestures. Future applications could extend to various devices, including TVs and
projectors, while enhancing gesture recognition algorithms for improved accuracy.
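The geometric core of such a volume controller is the mapping from the thumb-index fingertip distance to a volume scalar. A minimal sketch of that step is shown below; the pixel range used is an illustrative assumption, not taken from the paper, and in the cited implementation the resulting scalar would be handed to Pycaw's audio-endpoint interface on Windows.

```python
import math

# Geometry of a pinch-to-volume control: the thumb-index fingertip distance
# is mapped linearly onto a 0..1 volume scalar. The pixel range 20-200 is
# an illustrative assumption, not taken from the paper.

def volume_from_pinch(thumb, index, d_min=20.0, d_max=200.0):
    """Map the thumb-index fingertip distance (pixels) to a volume in [0, 1]."""
    d = math.hypot(index[0] - thumb[0], index[1] - thumb[1])
    t = (d - d_min) / (d_max - d_min)   # linear interpolation
    return min(max(t, 0.0), 1.0)        # clamp to the valid range
```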

Paper Title 4: An Approach to Control the PC with Hand Gesture Recognition using Computer
Vision Technique
Author: Kirti Aggarwal, Anuja Arora
Published in: IEEE (INDIAcom) 2022
Abstract: This research introduces a novel hand gesture recognition system that facilitates the
control of virtual mouse and keyboard functions with an impressive accuracy of 95%. Utilizing a
webcam for real-time gesture recognition, the study enhances human-computer interaction (HCI)
by allowing intuitive control without physical devices. Despite its success, the system's
performance is heavily dependent on webcam quality and environmental conditions. Future work
aims to incorporate AI-driven classifiers to boost accuracy and reduce reliance on hardware
quality.

Paper Title 5: Hand Gesture Recognition using Shape-based Image Features for Music
Controller
Author: Chitra R., Allam Prathyusha Reddy, Anusha Bamini A.M., Naru Kezia, Brindha D,
Gyara Beulah
Published in: 2022
Abstract: This paper details the development of a gesture recognition system designed to control
music playback through hand movements. By focusing on shape-based image features, the
system captures gestures in real-time using a webcam. Although it demonstrates effective gesture
recognition, the study highlights limitations regarding the dataset and the impact of real-world
variables such as lighting and gesture diversity. Future enhancements are proposed to include
more complex gestures and improve performance under diverse conditions.

Paper Title 6: Design of Human-Computer Interaction Control System Based on Hand-Gesture Recognition
Author: Wang Zhi-heng, Cao Jiang-tao, Liu Jin-guo, Zhao Zi-qi
Published in: IEEE 2017
Abstract: This paper presents an innovative Human-Computer Interaction (HCI) control system
that employs an improved PSO-SVM algorithm for gesture recognition. By optimizing SVM
kernel parameters using Particle Swarm Optimization, the system achieves recognition rates
between 85% and 100%. While the research addresses some limitations of traditional PSO
algorithms, it suggests further exploration of gesture recognition applications in real-world
scenarios. Future research directions include broader implementations in robotics and interactive
systems.

Paper Title 7: Gesture Recognition Based Virtual Mouse and Keyboard


Author: Sugnik Roy Chowdhury, Sumit Pathak, M.D. Anto Praveena
Published in: 2020 IEEE (ICOEI)
Abstract: This study explores the development of a virtual mouse and keyboard utilizing hand
gesture recognition through image processing techniques. The system effectively maps hand
gestures to mouse clicks and keyboard inputs, facilitating intuitive user interaction without
additional hardware. Although the system shows promise, it faces limitations related to noise
interference and the complexity of gestures. Future directions include the potential for
integration with augmented reality systems for enhanced interaction with 3D objects.

Paper Title 8: Embedded Virtual Mouse System by Using Hand Gesture Recognition
Author: Tsung-Han Tsai, Chih-Chi Huang, Kung-Long Zhang
Published in: IEEE 2015
Abstract: This paper describes an embedded virtual mouse system that utilizes hand gesture
recognition to achieve high accuracy even in challenging environments. Through a series of
tests, the system demonstrates an accuracy range between 82% and 95%. However, it does not
address scalability or performance with complex gestures in varying lighting conditions. Future
research is encouraged to expand the range of recognized gestures and improve the system’s
robustness in diverse real-world applications.

Paper Title 9: Virtual Mouse using Hand Gesture and Color Detection
Author: Amardip Ghodichor, Binitha Chirakattu
Published in: IEEE 2015
Abstract: This research presents a virtual mouse system controlled by hand gestures detected
through color markers. The system effectively achieves precise cursor movement but is heavily
dependent on camera resolution and lighting conditions. While promising, it does not explore
how to handle varying environmental factors. Future developments aim to enhance robustness
against these variables and integrate multimedia functionalities using gesture controls.
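The colour-marker approach reduces to thresholding each frame in HSV space for the marker's colour and taking the centroid of the matching pixels as the cursor position. A minimal NumPy sketch, with illustrative hue/saturation/value bounds rather than the paper's calibrated values:

```python
import numpy as np

# Sketch of colour-marker tracking: threshold an HSV frame for the marker's
# colour, then use the centroid of the matching pixels as the cursor
# position. The bounds below are illustrative, not the paper's values.

def marker_centroid(hsv, h_lo=0, h_hi=10, s_min=100, v_min=100):
    """Return the (row, col) centroid of pixels inside the HSV range,
    or None when no pixel matches."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    mask = (h >= h_lo) & (h <= h_hi) & (s >= s_min) & (v >= v_min)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return (float(ys.mean()), float(xs.mean()))
```

A real pipeline would first convert the BGR webcam frame to HSV (e.g. with OpenCV) before calling this function.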

Paper Title 10: Design and Development of Hand Gesture Based Virtual Mouse
Author: Kabid Hassan Shibly, Samrat Kumar Dey, Md. Aminul Islam, Shahriar Iftekhar
Showrav
Published in: IEEE 2019
Abstract: This paper discusses the design of a virtual mouse system that enables users to control
mouse functions through hand gestures detected via color detection techniques. The system
achieves an accuracy range of 78-91% on simple backgrounds but significantly drops to 40-42%
in complex environments. The research highlights the need for improvements in performance
across varied lighting and background conditions. Future enhancements may include additional
functionalities such as window manipulation using new gesture inputs.
Paper Title 11: Real-time Virtual Mouse System using RGB-D Images and Fingertip Detection
Author: Dinh-Son Tran, Ngoc-Huynh Ho, Hyung-Jeong Yang, Soo-Hyung Kim, Guee Sang Lee
Published in: SPRINGER 2020
Abstract: This study presents a virtual mouse system that utilizes RGB-D images for real-time
fingertip detection and tracking, allowing users to control a virtual cursor on the screen without
the need for gloves or markers. The system was tested under various conditions, including
different screen resolutions and complex backgrounds. Utilizing hand detection and
segmentation techniques, the system performed well across different lighting levels and
distances, although performance dropped significantly when tracking multiple users, with
accuracy decreasing to 53.35% for six people. Future work aims to expand the system’s
applications to smarter environments and incorporate a wider range of gestures for enhanced user
interaction.

Paper Title 12: A Virtual Mouse for Controlling Laptops in a Smart Home
Author: Clerc-Manne Taing, Pei-Luen Patrick Rau, Hanjing Huang
Published in: Journal Article BY SPRINGER 2017
Abstract: This research introduces Handpad, a virtual mouse system designed specifically for
smart homes, which integrates touchscreen technology with hand movements to control devices.
User testing with 31 participants in typical home settings showed a high acceptance rate, with
users praising the ease of use and practicality. However, performance expectations were
moderate, potentially due to the hedonic nature of tasks involved. The study suggests further
development for seamless integration into smart home ecosystems and additional research to
refine interaction techniques.

Paper Title 13: A Virtual Mouse System Using Finger-Gestures of Twisting-in


Author: Takashi Kihara, Makio Ishihara
Published in: SPRINGER 2011
Abstract: This paper develops a virtual mouse system that uses a unique twisting finger gesture
to simulate traditional mouse interactions. Through experiments involving five participants
performing tasks with this system compared to a regular mouse and touchpad, the study
highlights the intuitive nature of twisting and pushing gestures. However, it also notes that task
completion times were longer due to processing delays. Future work will focus on improving
image processing speed through CUDA technology to enhance usability for more complex tasks.

Paper Title 14: A Real-time Hand Gesture Recognition and Human-Computer Interaction
System
Author: Pei Xu
Published in: 2017
Abstract: This research introduces a real-time gesture-based human-computer interaction (HCI)
system that employs a convolutional neural network (CNN) for gesture recognition. A dataset of
19,852 images representing 16 different gesture types was collected for training. The system
achieved an impressive gesture recognition accuracy of 99.8% and facilitated smooth mouse
control via a Kalman filter for tracking. Despite its strengths, the system only supports static
gestures and faces challenges in background removal and transient gesture detection. Future
improvements aim to include dynamic gesture recognition and enhance complex HCI and
human-robot interaction capabilities.
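A minimal version of the Kalman smoothing step the paper applies to cursor motion is sketched below: a 1-D constant-velocity filter, one instance per screen axis. The process and measurement noise values are illustrative assumptions.

```python
import numpy as np

# Minimal 1-D constant-velocity Kalman filter of the kind used to smooth
# cursor motion (run one instance per screen axis). Noise magnitudes are
# illustrative, not taken from the paper.

class Kalman1D:
    def __init__(self, q=1e-3, r=1e-1):
        self.x = np.zeros(2)               # state: [position, velocity]
        self.P = np.eye(2)                 # state covariance
        self.F = np.array([[1.0, 1.0],     # constant-velocity transition
                           [0.0, 1.0]])
        self.H = np.array([[1.0, 0.0]])    # we observe position only
        self.Q = q * np.eye(2)             # process noise
        self.R = np.array([[r]])           # measurement noise

    def update(self, z):
        # Predict one step ahead.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct with the noisy position measurement z.
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]                   # filtered position
```

Fed with the raw per-frame fingertip coordinate, the filter's output converges to the true position while damping measurement jitter.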

Paper Title 15: Gesture Recognition Based Mouse Events


Author: Rachit Puri
Published in: 2017
Abstract: This paper presents a gesture-based mouse control system that utilizes color caps for
finger tracking. The system employs RGB to YCbCr color space conversion, implemented in
MATLAB, to facilitate smooth mouse events. While the approach allows for effective control,
the accuracy is influenced by distance from the camera, and the system is limited to static
gestures. Future developments aim to reduce lag and add features such as zoom functionality and
system shutdown options for enhanced usability.
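The colour-space step the paper relies on can be written directly from the BT.601 full-range equations: luma (Y) carries brightness while Cb/Cr carry chroma, so a coloured finger cap can be thresholded on Cb/Cr largely independently of lighting. A per-pixel sketch, shown here in Python rather than the paper's MATLAB:

```python
# Full-range BT.601 RGB -> YCbCr conversion for one 8-bit pixel.
# Splitting luma from chroma lets a colour cap be thresholded on Cb/Cr
# largely independently of scene brightness.

def rgb_to_ycbcr(r, g, b):
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr
```

Note that MATLAB's built-in `rgb2ycbcr` uses the studio-range (16-235) variant, so its numbers differ slightly from this full-range form.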

Paper Title 16: Hand Gesture Recognition System to Control Soft Front Panels
Author: H. Renuka, B. Goutam
Published in: 2014
Abstract: This research develops a dynamic hand gesture recognition system that enables users
to control media player operations through intuitive hand movements. A dataset comprising five
hand gestures was created, with 40 images per gesture. The system effectively recognizes
gestures and executes corresponding operations, such as play and pause, based on the number of
matched patterns. While successful for static gestures, it lacks real-time recognition capabilities
and is sensitive to environmental factors like lighting conditions. Future work will focus on
implementing real-time gesture recognition capabilities.

Paper Title 17: Hand Gesture Recognition System Using Camera


Author: Viraj Shinde, Tushar Bacchav, Jitendra Pawar, Mangesh Sanap
Published in: 2014
Abstract: This study introduces a hand gesture recognition system designed for human-
computer interaction (HCI). Using camera-based image acquisition, preprocessing, and feature
extraction techniques based on an adaptive color HSV model and motion history images, the
system shows promise. However, it currently lacks real-time capability and is sensitive to
lighting variations. Future enhancements will aim to implement real-time recognition and
improve robustness against environmental factors, while exploring advanced machine learning
techniques for better accuracy.

Paper Title 18: Hand Gesture Recognition Based Presentation System


Author: Akshaya Ramachandran, Raksha Aruloli, Taran Akshay
Published in: Not specified
Abstract: This paper describes a system that successfully utilizes hand gestures to control
PowerPoint presentations, providing a more natural and engaging way to navigate slides
compared to traditional input devices. The system uses OpenCV for video capture and a hand-
tracking module for gesture detection. Although accuracy metrics are not explicitly provided, the
system is described as functioning effectively for basic navigation and annotation tasks. Future
improvements may include incorporating more gestures, adding a voice interface, and enhancing
cross-platform compatibility.

Paper Title 19: Hand Control Using Gesture Recognition


Author: Sakshi Sharma, Shubhank Sharma, Piyush Yadav
Published in: Not stated
Abstract: This research presents a gesture-controlled robotic hand capable of picking and
placing objects through intuitive hand gestures. Using components like Arduino, flex sensors,
accelerometers, and Bluetooth technology, the system demonstrates accurate and precise
movements. However, there is a noted need for more intuitive control mechanisms, particularly
for use in hazardous environments. Future directions will focus on incorporating more complex
gestures to expand the system's capabilities.

Paper Title 20: Robotic Hand Control Using Gesture Recognition


Author: Sakshi Sharma, Shubhank Sharma, Piyush Yadav
Published in: Not stated
Abstract: This study develops a gesture-controlled robotic hand that allows users to pick and
place objects using hand gestures. The system employs similar technology to other gesture
recognition systems, including Arduino and various sensors. While demonstrating accurate and
precise movements, the system requires intuitive control mechanisms to enhance usability in
challenging environments. Future work will focus on improving gesture recognition accuracy
and expanding the range of supported gestures.

Paper Title 21: Enhanced AI-Based Virtual Mouse Using Hand Gesture Recognition
Authors: Dhananjay Rathod, Sujal Shinde, Pronit Ghosh, Karthika Thevar, Sangita Bhoyar
Published in: IJRPR, March 2023
Abstract: This study presents a virtual mouse system that allows users to control a computer
mouse using hand gestures instead of a physical mouse. The system uses artificial intelligence to
improve accuracy, showing better performance than existing models. However, the authors
found some issues, particularly with functions like right-clicking and selecting text by dragging.
Future improvements could enhance the system's ability to detect gestures more accurately,
especially in these challenging areas.

Paper Title 22: Ergonomic Virtual Mouse System for Users with Disabilities


Authors: Meenatchi R., Nandan C., Swaroop H.G., Varadharaju S.
Published in: International Journal of Current Science (IJCSPUB), May 2023
Abstract: This paper discusses a virtual mouse system that uses hand gestures to provide a more
natural way to interact with computers. It aims to help people with disabilities and reduce
dependence on traditional mouse devices. The system has shown promising results in real-life
situations, including gaming and virtual reality. However, it does face challenges, such as
tracking errors and sensitivity to lighting conditions. The authors suggest that improving the
system's technology could make it even more reliable.

Paper Title 23: AI-Enhanced Gesture-Controlled Mouse Interface


Authors: Kavitha R., Janasruthi S. U., Lokitha S., Tharani G.
Published in: 2023
Abstract: This research introduces a virtual mouse system that uses artificial intelligence to
recognize hand gestures for controlling the computer cursor. This system not only makes it easier
for people with physical disabilities to use a computer but also includes voice commands for
even smoother interaction. The authors believe this technology could be beneficial in various

fields like gaming and virtual reality. They mention that while the system works well, there are
still areas to improve, especially regarding gesture recognition in different environments.

Paper Title 24: Real-Time Gesture Recognition for Virtual Mouse Control


Authors: Dr. G. Krishna Kishore, M. Dhyanesh, R. Vijay Kumar, MD. Ameena, M. Surya Teja,
R. Vijay Krishna
Published in: IJFMR, 2023
Abstract: This paper presents a virtual mouse system that uses hand gesture recognition to
control the cursor without needing physical input devices. The authors explain how the system
captures real-time video from a webcam to detect hand movements. Although the system has
shown smooth and responsive performance, it sometimes struggles in low-light conditions. The
authors emphasize the importance of improving gesture recognition technology to enhance user
experience, especially in different lighting situations.

Paper Title 25: Comprehensive Gesture-Based Control for Virtual Mouse Operations


Authors: G. N. Srinivas, S. Sanjay Pratap, V. S. Subrahmanyam, K. G. Nagapriya, A. Venkata
Srinivasa Rao
Published in: IJRET, February 2023
Abstract: This study introduces a gesture recognition system that enables various mouse
functions through real-time webcam input. The authors describe how users can achieve cursor
movements, clicks, and other actions simply by using hand gestures. The system has been
effective in different lighting conditions, but the authors note that more complex gestures may
pose challenges. They recommend adding more advanced gestures and improving the system's
adaptability to changing environments.

Paper Title 26: Gesture-Controlled Mouse System: Performance Evaluation


Authors: Bharath Kumar Reddy Sandra, Katakam Harsha Vardhan, Ch. Uday, V. Sai Surya,
Bala Raju, Dr. Vipin Kumar
Published in: IRJMETS, April 2022
Abstract: This paper evaluates a gesture-based virtual mouse system using a dataset of hand
gestures. The authors conducted experiments with various participants to assess the system's
performance. While the system achieved high accuracy in many mouse functions, it struggled
with right-click actions. The authors conclude that further improvements are needed, particularly
in enhancing gesture detection algorithms to support more complex actions.

Paper Title 27: Advancements in Hand Gesture Recognition for HCI


Authors: C. N. Sujatha, S.P.V Subbarao, P. Preetham, P. Surajvarma, Y. Upendra
Published in: IJCRT, June 2022
Abstract: This research explores how hand gesture recognition can improve human-computer
interaction (HCI) by making technology more accessible. The system operates in real-time, using
video input to recognize gestures without needing specific datasets. The authors highlight the
good performance of the system under various lighting conditions, while also noting the
importance of further enhancements for better gesture recognition and user experience.

Paper Title 28: Stability and Performance of Gesture-Controlled Virtual Mouse
Authors: Shudhanshu Ranjan, Sumit Anand, Vinayak Gontia, Aman Chandra, Vineet Sharan,
Spandana SG
Published in: IJERCSE, November 2022
Abstract: This paper discusses a virtual mouse system controlled by hand gestures. The authors
emphasize the system's improved stability and accuracy compared to earlier versions. While it
effectively recognizes gestures, there are still limitations in detecting complex gestures and
adapting to different users' needs. The authors recommend optimizing the system to handle
varying lighting conditions better and expanding gesture recognition capabilities.

29. CNN-Based Hand Gesture Recognition for Computer Control


Authors: Pradnya Kedari, Shubhangi Kadam, Rajesh Prasad
Published in: IJRET, June 2022
Abstract: This study introduces a hand gesture recognition system based on Convolutional
Neural Networks (CNN) that allows users to control computers with gestures. The authors
trained the system on a small set of ten gestures for different computer functions. Although the
system achieved high accuracy in recognizing these gestures, the authors suggest that expanding
the range of gestures would enhance the system's overall functionality and usability.

30. Development of a Hand Gesture-Controlled Virtual Mouse


Authors: Mr. E. Sankar, B. Nitish Bharadwaj, A.V. Vignesh
Published in: IJRET, 2023
Abstract: This paper presents a virtual mouse system that uses hand gestures for control,
eliminating the need for traditional input devices. The authors highlight that while the system
performs well in recognizing gestures, there are still challenges related to dragging and clicking
actions. They emphasize the need for improvements in the algorithms used for fingertip detection
to enhance overall accuracy and user experience.

2.3. References

1. Kansal, S. (2023). Volume Control Feature for Gesture Recognition Systems. IEEE.
2. Chitra, R., Prathyusha Reddy, A., & Bamini, A. (2022). Hand Gesture Recognition using Shape-based Image Processing Techniques for Music Control. IEEE (ICOSEC).
3. Kansal, S. (2023). Gesture-based Volume Control using OpenCV, MediaPipe, and Pycaw.
4. Chitra, R., et al. (2022). Real-Time Gesture Recognition for Media Control with Shape-based Processing.
5. Woo, J.-H., & Kim, J.-H. (2020). Environmental Challenges in Gesture Recognition Systems.
6. Chang, M., et al. (2020). Custom Gesture Datasets for Virtual Control Systems.
7. Kochmar, E., et al. (2021). Standardization Challenges in Gesture Recognition.
8. Timms, M. (2016). Machine Learning for Gesture Interfaces.
9. Vujinovic, A., & Luburić, N. (2024). AI-based Gesture Control for Public Spaces.
10. Margono, H., et al. (2024). AI's Role in Gesture Recognition Usability Studies.
11. Gan, W., et al. (2019). Gesture-based Control Techniques: A Review.
12. Bai, X., & Stede, M. (2022). Advanced Machine Learning for Gesture Recognition.
13. Prada, M.A., et al. (2020). Engineering Use Cases for Gesture Control.
14. Bassner, P., & Frankford, E. (2024). Touchless Interaction for Smart Devices.
15. Seo, K., et al. (2021). Impact of Lighting Conditions on Gesture Control Systems.
16. Tan, D.Y., & Cheah, C.W. (2021). Developing Virtual Mouse Systems using OpenCV.
17. Kim, W.-H., & Kim, J.-H. (2020). Real-Time AI Systems for Gesture Recognition.
18. Lin, C.-C., et al. (2023). Gesture Systems for Sustainable Computing Environments.
19. Brue, R., et al. (2024). Haptic Feedback in Gesture-based Systems.
20. Gan, W., & Sun, Y. (2019). Adaptive Interfaces for Gesture Control.
21. Morris, W., et al. (2024). Evaluating Performance Metrics for Virtual Mouse Systems.
22. Seo, K., & Yoon, D. (2021). AI-based Gesture Systems for Online Learning.
23. Yang, D.Y., & Nagashima, T. (2021). Improving Public Interfaces with Gesture Controls.
24. Arnau-González, P., et al. (2023). Natural Language and Gesture Systems.
25. Margono, H., et al. (2024). Analyzing Gesture Control in Educational Settings.
26. Bassner, P., et al. (2024). Gesture Control for Software Development Tools.
27. Timms, M.J. (2016). Smart Classrooms and Gesture-based Interfaces.
28. Nye, B.D. (2014). Global Trends in AI-based Gesture Systems.
29. Minn, S. (2022). AI-assisted Knowledge and Gesture Systems.
30. Arnau, D., et al. (2023). Integrating Gesture Recognition into Learning Systems.

2.4. Conclusion
The research on hand gesture recognition systems for controlling virtual mice shows great
promise for improving how we interact with computers. These systems can help people,
especially those with physical disabilities, use computers more easily by allowing them to
control the mouse with their hands instead of a physical device.
While the technology is exciting, there are still some challenges. Many studies noted that the
accuracy of recognizing gestures can be an issue, especially in different lighting or with more
complex movements. There is also a need to add more gestures to make the system even more
useful.

Chapter 3: Proposed Methodology

Content:

3.1. System Design

3.1.1. Introduction

3.1.2. Block Diagram

3.1.3. System architecture diagram

3.1.4. Data Flow Diagram

3.1.5. Software Design Approach

3.2. Time Line Chart

3.3. Gantt Chart

3.4. Conclusion

3.1. System Design

3.1.1. Introduction

The virtual mouse using hand gesture technology aims to provide an intuitive interface for users
to interact with their devices. This system utilizes sensors, machine learning algorithms, and
computer vision techniques to interpret hand movements and gestures, allowing users to control
a cursor on the screen without physical contact.

3.1.2. Block Diagram


A block diagram is a visual representation of a system that uses simple, labeled blocks, representing single or multiple items, entities, or concepts, connected by lines to show the relationships between them. An entity relationship diagram (ERD), one example of a block diagram, represents an information system by showing the relationships between people, objects, places, concepts, or events within that system. Block diagrams are used heavily in the engineering and design of electronics, hardware, software, and processes. Most commonly, they represent concepts and systems in a higher-level, less detailed overview, which makes them useful for troubleshooting technical issues.

INPUT: Capture hand movement data → PROCESSING: Detect hand gesture → OUTPUT: Gesture command executed
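The three-stage flow above can be sketched as a minimal processing loop. The function names and gesture rules below are illustrative placeholders chosen for this sketch, not the project's actual code; in the real system the capture and detection steps would be backed by a webcam feed and a hand-tracking model.

```python
# Illustrative sketch of the INPUT -> PROCESSING -> OUTPUT pipeline.
# Each stage is a plain function so the data flow is easy to follow.

def capture_hand_movement(frame):
    """INPUT stage: extract raw hand-position data from one video frame."""
    return {"x": frame["hand_x"], "y": frame["hand_y"],
            "fingers_up": frame["fingers_up"]}

def detect_gesture(hand_data):
    """PROCESSING stage: classify the raw data as a named gesture."""
    if hand_data["fingers_up"] == 1:
        return "move_cursor"
    if hand_data["fingers_up"] == 2:
        return "left_click"
    return "idle"

def execute_command(gesture):
    """OUTPUT stage: map the recognized gesture to a mouse command."""
    commands = {"move_cursor": "cursor moved",
                "left_click": "click performed",
                "idle": "no action"}
    return commands[gesture]

# One simulated frame flowing through all three stages:
frame = {"hand_x": 320, "hand_y": 240, "fingers_up": 2}
result = execute_command(detect_gesture(capture_hand_movement(frame)))
print(result)  # prints "click performed"
```

Keeping the stages separate like this mirrors the block diagram: each block can be replaced (for example, swapping the rule-based classifier for a trained model) without touching the others.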

3.1.3. System architecture diagram

User → Capture hand movement → Detect hand gesture (supported by a gesture dataset) → Process gesture to perform the corresponding command → Result (successfully executed gesture movement)

3.1.4. Data Flow Diagram

Also known as a DFD, a data flow diagram is used to graphically represent the flow of data in a business information system. A DFD describes the processes involved in a system to transfer data from the input to file storage and report generation. Data flow diagrams can be divided into logical and physical. The logical data flow diagram describes the flow of data through a system to perform certain functionality of a business, while the physical data flow diagram describes the implementation of the logical data flow.

3.1.5. Software Design Approach

There are many models/approaches that can be followed while developing a project, such as the V-Model, Incremental Model, Spiral Model, Waterfall Model, RAD Model, and Agile Model. While implementing our project, we decided to choose the Waterfall Model.

Agile Model

The meaning of Agile is swift or versatile. The "Agile process model" refers to a software development approach based on iterative development. Agile methods break tasks into smaller iterations, or parts, and do not directly involve long-term planning. The project scope and requirements are laid down at the beginning of the development process. Plans regarding the number of iterations, and the duration and scope of each iteration, are clearly defined in advance.

Each iteration is considered a short time "frame" in the Agile process model, which typically lasts from one to four weeks. The division of the entire project into smaller parts helps to minimize project risk and to reduce the overall project delivery time. Each iteration involves a team working through a full software development life cycle, including planning, requirements analysis, design, coding, and testing, before a working product is demonstrated to the client.

3.2. Time Line Chart

3.3. Gantt Chart

Gantt Chart 1 (bar chart of project tasks against a horizontal axis of days, 0 to 112)

Task                                    Start on day    Duration (days)
Problem Identification                  0               15
Industrial Survey & Literature Review   16              21
Project Proposal                        0               40
Project Report                          0               45
Presentation                            45              15
Project Logbook                         0               112
Project Portfolio                       50              112

3.4. Conclusion

Hence, the Waterfall Model was selected and the diagrams for our project were designed. We successfully completed the research and the basics of our project, and will soon commence the implementation of the project.

Project
Proposal

VIRTUAL MOUSE USING HAND GESTURE

Rationale
A virtual mouse using hand gestures offers a modern, hands-free way to control computers,
eliminating the need for traditional hardware like a physical mouse. This technology enhances
convenience by allowing users to interact with their devices through natural hand movements,
making tasks easier and more intuitive. It also provides significant benefits for individuals with
disabilities, offering an accessible alternative to conventional input devices that may be
challenging to use.

Another advantage is that using hand gestures can reduce physical strain caused by prolonged
use of a regular mouse, providing a more ergonomic experience. Additionally, this touchless
control system is ideal in settings like hospitals, where maintaining hygiene is essential, as it
minimizes the need for direct contact with devices.

The virtual mouse aligns with future trends, as gesture-based interaction is becoming
increasingly popular in technologies such as virtual and augmented reality. Developing this
system ensures that users stay updated with modern control methods. Furthermore, the virtual
mouse can work across multiple devices, including PCs, laptops, and tablets, making it flexible
and convenient for users in different environments.

Customization is another important feature. Users can adjust gestures according to their
preferences, ensuring a comfortable and user-friendly experience. With real-time gesture
detection and smooth performance, the virtual mouse ensures efficient operation, enhancing the
overall user experience.

In conclusion, a virtual mouse using hand gestures offers a practical, accessible, and future-ready
solution that combines ease of use, flexibility, and modern technology, making it a valuable
innovation.

Introduction
A virtual mouse using hand gestures allows users to control their computer without a physical
mouse by detecting hand movements in real time. This hands-free technology is convenient,
reduces physical strain, and is especially helpful for people with disabilities. It’s also useful in
places like hospitals, where avoiding touch is important. As gesture controls gain popularity in
virtual and augmented reality, the virtual mouse works on various devices like PCs and laptops,
making it a flexible and modern solution.

• Purpose
The purpose of the virtual mouse using hand gestures is to give users a hands-free and easy way
to control their computers. It aims to make using computers easier for people with disabilities
and improve convenience by removing the need for a regular mouse. Additionally, it helps
reduce physical strain from long computer use and promotes hygiene in places where touching
devices should be avoided. By using gesture recognition technology, the virtual mouse provides
a modern and flexible solution that works well on different devices, keeping users up-to-date
with new technology trends.

• Scope
The system lets users control computers using hand gestures instead of a mouse. It helps people
with disabilities use technology more easily. The virtual mouse works on personal computers and
laptops for wider use. It uses real-time gesture detection for quick and accurate responses. Users
can customize gestures and settings to fit their needs. The virtual mouse is useful in homes,
offices, schools, and hospitals.

Literature Survey
Research on virtual mice using hand gestures highlights key areas of focus. Studies explore
gesture recognition technology, utilizing computer vision and machine learning to enhance
accuracy. User experience research shows that simple gestures improve satisfaction and ease of
learning. Additionally, virtual mice assist individuals with disabilities and promote hygiene in
public spaces. There is also exploration of gesture control in augmented and virtual reality
(AR/VR) for immersive experiences. Overall, the findings emphasize the potential for innovation
in this field.

Problem Definition
The virtual mouse using hand gestures addresses issues with traditional mouse devices, which
can be difficult for many users. Accessibility is a concern for individuals with disabilities who
struggle with standard mice. Long-term use of traditional mice can cause discomfort and strain
on the hands and wrists. Hygiene is important in settings like hospitals, where physical mice can
spread germs. Additionally, traditional controls can be confusing, requiring users to learn
specific movements. With the rise of virtual and augmented reality technologies, there is a
growing need for systems that enable natural interactions. The virtual mouse aims to provide a
more accessible and hygienic way for people to use computers.

Proposed Methodology
The system uses a camera to capture hand movements and analyzes the video to detect specific
gestures like pointing, clicking, and scrolling. After detecting gestures, the system converts them
into commands that the computer can understand, filtering out background noise and recognizing
patterns.

Machine learning is employed to improve accuracy by training on different hand gestures, allowing the system to adapt to individual gestures over time. It processes gestures instantly, enabling users to interact smoothly with their devices, ensuring quick actions like clicking and scrolling.
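One common way to keep real-time cursor motion smooth while the hand position fluctuates frame to frame is an exponential moving average over the detected coordinates. This is a sketch under assumptions, not a technique this report states it uses; the `alpha` value is illustrative.

```python
# Hypothetical cursor-smoothing helper: an exponential moving average (EMA)
# damps frame-to-frame jitter in the detected hand position, so the cursor
# does not shake while still following the hand in real time.

class CursorSmoother:
    def __init__(self, alpha=0.3):
        # alpha near 1.0 -> responsive but jittery;
        # alpha near 0.0 -> very smooth but laggy.
        self.alpha = alpha
        self.x = None
        self.y = None

    def update(self, raw_x, raw_y):
        """Blend the newest raw position with the running average."""
        if self.x is None:  # first frame: no history yet, pass through
            self.x, self.y = float(raw_x), float(raw_y)
        else:
            self.x = self.alpha * raw_x + (1 - self.alpha) * self.x
            self.y = self.alpha * raw_y + (1 - self.alpha) * self.y
        return round(self.x), round(self.y)

smoother = CursorSmoother(alpha=0.5)
print(smoother.update(100, 100))  # first frame passes through: (100, 100)
print(smoother.update(120, 100))  # a sudden jump is halved: (110, 100)
```

The trade-off is a single tunable parameter, which fits the report's goal of letting users adjust settings to their preferences.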

A simple interface helps users understand how to use the virtual mouse by providing visual
feedback, such as highlighting areas for clicking or scrolling. Users can customize gestures and
settings to fit their preferences, making the experience more personal and easier to use.
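One simple way to realize the noise-filtering step described earlier in this methodology is to require a gesture to be detected for several consecutive frames before its command fires. This debouncing sketch is an assumption for illustration, not the report's actual algorithm; the frame threshold is arbitrary.

```python
# Hypothetical debouncing filter: a gesture must be seen for `hold_frames`
# consecutive video frames before it is accepted as a command, so that
# single-frame misdetections are discarded as noise.

class GestureDebouncer:
    def __init__(self, hold_frames=3):
        self.hold_frames = hold_frames
        self.current = None  # gesture seen on the previous frame
        self.count = 0       # consecutive frames it has been seen

    def feed(self, gesture):
        """Feed one frame's detection; return the gesture once confirmed, else None."""
        if gesture == self.current:
            self.count += 1
        else:
            self.current = gesture
            self.count = 1
        if self.count == self.hold_frames:
            return gesture   # fires exactly once per stable run
        return None

deb = GestureDebouncer(hold_frames=3)
stream = ["click", "click", "scroll", "click", "click", "click"]
fired = [deb.feed(g) for g in stream]
print(fired)  # [None, None, None, None, None, 'click']
```

Note how the stray "scroll" frame in the middle resets the counter, so only the final stable run of three "click" frames produces a command.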

• Aim
The aim is to create a virtual mouse that lets users control their computer using hand
gestures. It will be an easy, hands-free alternative to a regular mouse, improve
accessibility, work smoothly in real-time, and be compatible with different devices.

• Objective
• Create a hands-free system to control computers without a physical mouse.

• Provide an easy way to interact using hand movements.

• Improve accessibility for people with disabilities.

• Ensure accurate, real-time gesture detection.

• Ensure compatibility with PCs, laptops, and tablets.

• Achieve fast performance for smooth operation.

• Offer customizable gestures and settings.

Resources

• Hardware
• Suitable computer system hardware.

• CPU: A multi-core processor with a clock speed of at least 2.5 GHz is recommended.

• RAM: At least 16 GB of RAM is recommended for running machine learning models.

• User Devices: Users will use the virtual mouse on their own computers and laptops. It is important to make sure the virtual mouse works well on all types of devices and screen sizes for a good experience.

• Software
Programming Language: Python – used to write the code for the virtual mouse.

Computer Vision Libraries:

OpenCV – detects and tracks hand movements using webcam input.

MediaPipe – identifies hand parts, like fingertips, and tracks gestures.

Mouse Control Libraries:

PyAutoGUI – moves the pointer, clicks, and scrolls based on gestures.

Pynput – another option to control the mouse and keyboard.

Interface (Optional): Tkinter – creates a simple window for testing or adjusting settings.

Performance Optimization: Multithreading – runs tasks simultaneously for smooth performance.

Code Editors: VS Code, PyCharm, or Jupyter Notebook – used to write and test the code.
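As a sketch of how the pieces above connect: MediaPipe reports hand landmarks as normalized coordinates in [0, 1], which must be scaled (and usually clamped) to screen pixels before being passed to a call such as PyAutoGUI's `moveTo`. The helper below shows only that mapping step in plain Python; the screen size and margin values are illustrative assumptions, not this project's settings.

```python
# Hypothetical mapping from a normalized MediaPipe landmark (x, y in [0, 1])
# to absolute screen pixels, the form of coordinate PyAutoGUI's moveTo expects.
# A margin is cropped from the camera frame edges so the fingertip can reach
# the screen corners without leaving the camera's field of view.

def landmark_to_screen(nx, ny, screen_w=1920, screen_h=1080, margin=0.1):
    # Re-scale the usable central region [margin, 1 - margin] onto [0, 1].
    span = 1.0 - 2 * margin
    sx = (nx - margin) / span
    sy = (ny - margin) / span
    # Clamp so positions outside the usable region pin to the screen edge.
    sx = min(max(sx, 0.0), 1.0)
    sy = min(max(sy, 0.0), 1.0)
    return int(sx * (screen_w - 1)), int(sy * (screen_h - 1))

print(landmark_to_screen(0.5, 0.5))   # centre of frame -> (959, 539)
print(landmark_to_screen(0.05, 0.5))  # inside the margin -> clamps to (0, 539)
```

In a full loop, the `(x, y)` returned here would be smoothed and then passed to `pyautogui.moveTo(x, y)`.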

Industrial
Survey &
Literature
Review

VIRTUAL MOUSE USING HAND GESTURE

Abstract
The "Virtual Mouse Using Hand Gestures" project offers a unique way to interact with
computers without a physical mouse. By using a camera, it tracks your hand movements and
gestures, allowing you to move the cursor, click, drag, and scroll just by moving your hands in
front of the screen.

This technology is powered by gesture-detecting software. It identifies your hands and interprets
commands based on how you move them, making it intuitive to use. For example, raising your
hand might move the cursor up, while a specific gesture can simulate a click.

Built with popular tools like OpenCV and MediaPipe, this project is designed for various
situations. It’s perfect for public spaces where hygiene is important, as it avoids touching
surfaces. It’s also great for gamers who want a more immersive experience. Additionally, it
provides accessibility for people with disabilities who may struggle with traditional mice.
Overall, this innovative system aims to make computer interaction more convenient and user-friendly for everyone.

Literature Review

Each entry below summarizes one reviewed work under the headings: Title, Authors, Year of publication, Findings, Datasets, Methodology, Performance, Research Gap / Observations, and Future Directions.
R1. Volume Control Feature for Gesture Recognition in Augmented and Virtual Reality Applications
(https://ieeexplore.ieee.org/document/10263252)
Authors: Shruti Kansal
Published in: IEEE, 2023
Findings: Developed a gesture-based volume control system for VR/AR applications.
Dataset: Custom gestures dataset.
Methodology: Utilized OpenCV, MediaPipe, and Pycaw for implementation.
Performance: Promising results in gesture recognition and control.
Research Gap: Limited gesture vocabulary; context-specific challenges observed.
Future Directions: Expand gesture recognition capabilities and applications.

R2. Hand Gesture Recognition using Shape-based Image Features for Music Controller
(https://ieeexplore.ieee.org/document/9952146)
Authors: Chitra R., Allam Prathyusha Reddy, Anusha Bamini A.M., Naru Kezia
Published in: IEEE (ICOSEC), 2022
Findings: Developed a real-time system for hand gesture recognition to control music playback.
Dataset: Custom gesture dataset.
Methodology: Image processing techniques for detecting shape-based features.
Performance: Effective gesture recognition for music control.
Research Gap: Limited to specific gestures; challenges in varied environments.
Future Directions: Expand to include more gesture types and other applications.
R3. Volume Control Feature for Gesture Recognition in Augmented and Virtual Reality Applications
(https://ieeexplore.ieee.org/document/10263252)
Authors: Shruti Kansal, H. Venkateswarareddy, Shanmugasundaram Hariharan, Sasi Kala Rani K., Andraju Bhanu Prasad, Ponmalar A.
Published in: IEEE (InC4), 2023
Findings: Gesture recognition techniques for volume control using palm detection and hand tracking.
Dataset: OpenCV, Pycaw, and MediaPipe libraries used for hand gesture volume control.
Methodology: Image processing using OpenCV and MediaPipe with palm detection and gesture tracking.
Performance: Promising results for real-time volume control via gestures.
Research Gap: Limited standardization for hand gesture recognition; noise sensitivity in video streams.
Future Directions: Applications in TVs, projectors, and IT industries; improved gesture recognition algorithms.
R4. An Approach to Control the PC with Hand Gesture Recognition using Computer Vision Technique
(https://ieeexplore.ieee.org/document/9763282)
Authors: Kirti Aggarwal, Anuja Arora
Published in: IEEE (INDIACom), 2022
Findings: Introduced a hand gesture recognition system using OpenCV and Python to control a virtual mouse and keyboard with 95% accuracy, improving HCI.
Dataset: Real-time images from a webcam.
Methodology: Vision-based gesture recognition using computer vision; object tracking for control via color detection; a virtual keyboard was implemented.
Performance: 95% accuracy in controlling cursor and click events.
Research Gap: Dependent on webcam quality; background colors can cause recognition errors; performance is lower on lower PC configurations.
Future Directions: Use AI and trained classifiers to enhance recognition accuracy without extra hardware.
R5. Hand Gesture Recognition using Shape-based Image Features for Music Controller
Authors: Chitra R., Allam Prathyusha Reddy, Anusha Bamini A.M., Naru Kezia, Brindha D., Gyara Beulah
Published in: 2022
Findings: Developed a system for recognizing hand gestures to control music playback by detecting key shape-based features like finger orientation and posture. Intended for multitasking environments, it allows control of music (play, pause, forward) with simple hand movements captured through a webcam.
Dataset: Custom image dataset of hand gestures captured via webcam.
Methodology: Used image processing techniques like background subtraction, binary image conversion, skin tone detection with the HSV model, and finger counting. Developed a custom algorithm for contour-based gesture detection and used point pattern matching for gesture recognition.
Performance: Achieved real-time gesture recognition for music control with simple gestures. The system can recognize gestures with basic classification and accuracy.
Research Gap: Limited dataset; additional real-world challenges (like illumination variation and gesture diversity) are not fully addressed.
Future Directions: Expand to support more complex gestures and improve performance in variable real-world conditions (e.g., lighting, clutter).
R6. Design of Human-Computer Interaction Control System Based on Hand-Gesture Recognition
Authors: Wang Zhi-heng, Cao Jiang-tao, Liu Jin-guo, Zhao Zi-qi
Published in: IEEE, 2017
Findings: Proposed a Human-Computer Interaction (HCI) control system utilizing an improved PSO-SVM algorithm for gesture recognition. The system improved recognition accuracy and real-time response for robotic control.
Dataset: Gesture samples collected via a five-bending-sensor data glove; 11 gestures collected with 20 samples per gesture.
Methodology: Improved PSO-SVM algorithm: Particle Swarm Optimization (PSO) optimized the SVM kernel parameters. Gesture recognition was integrated into HCI for real-time control of a robot via wireless transmission.
Performance: Gesture recognition rates achieved 85%-100%.
Research Gap: Traditional PSO algorithms tend to converge to local optima. This research addresses that limitation with an improved PSO-based optimization method.
Future Directions: Explore broader real-world applications of gesture-recognition-based systems.
R7. Gesture Recognition Based Virtual Mouse and Keyboard
Authors: Sugnik Roy Chowdhury, Sumit Pathak, M.D. Anto Praveena
Published in: IEEE (ICOEI), 2020
Findings: Developed a virtual mouse and keyboard using hand gesture recognition and image processing. The system maps gestures to mouse clicks and keyboard inputs; no external hardware is required except a camera.
Dataset: Webcam feed capturing hand gestures in real time.
Methodology: Convex Hull algorithm used to detect hand gestures and map defects to mouse and keyboard functions; implemented in Python on the Anaconda platform.
Performance: Successfully emulates mouse and keyboard functions; gesture-based inputs can achieve seamless control in various applications like architecture, medical science, and 3D modeling.
Research Gap: The Convex Hull algorithm has limitations when dealing with noise or defects; limited to basic gestures due to the constraint of five fingers.
Future Directions: Expand the system to work in augmented reality with 3D object interaction; implement a multi-dimensional camera setup to capture gestures from different axes.
R8. Embedded Virtual Mouse System by Using Hand Gesture Recognition
Authors: Tsung-Han Tsai, Chih-Chi Huang, Kung-Long Zhang
Published in: IEEE, 2015
Findings: Developed a virtual mouse system using hand gesture recognition with high accuracy in harsh environments.
Dataset: 500 tests of four different hand gestures for accuracy evaluation.
Methodology: Used skin detection, motion detection, a labeling algorithm, the convex hull algorithm, and an FSM for mouse functions.
Performance: Gesture recognition accuracy ranged from 82% to 95%.
Research Gap: The paper did not address scalability or how the system would handle complex, dynamic gestures or interactions with different lighting conditions or skin tones.
Future Directions: Expand the gesture range, improve environmental robustness, and explore immersive HCI applications.
R9. Virtual Mouse using Hand Gesture and Color Detection
Authors: Amardip Ghodichor, Binitha Chirakattu
Published in: IEEE, 2015
Findings: Developed a virtual mouse system where cursor movement is controlled using hand gestures detected via color markers, with no additional hardware required beyond a webcam.
Dataset: Not explicitly mentioned; system development and testing involve real-time images captured through a webcam.
Methodology: Utilized color detection, image processing, and object tracking in MATLAB. The system used colored gloves for finger detection.
Performance: The system demonstrated good precision in cursor movement through color detection but depended on camera resolution for optimal performance.
Research Gap: No mention of how the system would handle varying lighting conditions or different colors; the dependency on camera quality could limit its application.
Future Directions: Improve robustness against environmental factors and integrate multimedia services using hand gestures.
R10. Design and Development of Hand Gesture Based Virtual Mouse
Authors: Kabid Hassan Shibly, Samrat Kumar Dey, Md. Aminul Islam, Shahriar Iftekhar Showrav
Published in: IEEE, 2019
Findings: The proposed system eliminates the need for a physical mouse, allowing users to control mouse functions using hand gestures detected via colored caps on fingertips, processed through computer vision techniques.
Dataset: Real-time frames captured using a webcam for color detection and gesture tracking.
Methodology: The system uses Python and OpenCV, capturing video frames via a webcam, performing color detection, and recognizing hand gestures to control mouse movements and clicks.
Performance: The system demonstrated an accuracy rate of 78-91% on plain backgrounds but dropped significantly (40-42%) on complex backgrounds.
Research Gap: The system is heavily dependent on lighting conditions and background simplicity for optimal performance, and struggles with accuracy in complex environments.
Future Directions: Improvements include better performance under varied lighting and backgrounds, and adding more functionalities like window manipulation using additional gestures.
R11. Real-time Virtual Mouse System using RGB-D Images and Fingertip Detection
Authors: Dinh-Son Tran, Ngoc-Huynh Ho, Hyung-Jeong Yang, Soo-Hyung Kim, Guee Sang Lee
Published in: SPRINGER, 2020
Findings: The proposed system offers fingertip detection and tracking in real time, allowing users to control a virtual mouse on a screen without using gloves or markers.
Dataset: Tested with four different screen resolutions, under normal lighting and different conditions, including complex backgrounds.
Methodology: Hand detection and segmentation: hand regions are extracted from depth images using Kinect V2, and hand contours are obtained through a border-tracing algorithm.
Performance: The system performed well across lighting levels, backgrounds, and distances; performance decreased as more people were tracked (accuracy dropped to 53.35% for six people).
Research Gap: Accuracy dropped as the number of people increased, and slight confusion occurred with rapid fingertip movements, especially for more complex gestures.
Future Directions: Expand the system's application to other smart environments, incorporating a wider range of gestures.
R12. A Virtual Mouse for Controlling Laptops in a Smart Home
Authors: Clerc-Manne Taing, Pei-Luen Patrick Rau, Hanjing Huang
Published in: Journal article by SPRINGER, 2017
Findings: Introduced Handpad, a virtual mouse system for smart homes that integrates touchscreens with hand movements to control devices.
Dataset: User tests conducted with 31 participants in two scenarios: living room and bedroom.
Methodology: Uses a large multi-touch surface (tablet) to track hand movements for controlling a laptop's onscreen cursor in smart home environments.
Performance: High acceptance rate from users, with positive feedback on ease of use and practicality, but moderate performance expectations.
Research Gap: Performance expectancy was lower, possibly due to the hedonic nature of the tasks.
Future Directions: Further development for seamless integration into smart home environments, and additional studies for refining interaction techniques.
R13. A Virtual Mouse System Using Finger-Gestures of Twisting-in
Authors: Takashi Kihara, Makio Ishihara
Published in: SPRINGER, 2011
Findings: Developed a virtual mouse system using a twisting finger gesture to control mouse events. The system allows for intuitive interactions like twisting, pushing, and dragging, simulating real-world actions.
Dataset: 5 participants performed a task of pointing and clicking within rectangles 25 times each using this system, a traditional mouse, and a touchpad.
Methodology: Uses a single camera to track the user's fingertip and detects twisting gestures to generate mouse events like clicking and dragging.
Performance: The system required more time to complete tasks compared to a regular mouse due to processing delays, but it showed potential for tasks with a lower index of difficulty.
Research Gap: The system's processing speed needs improvement to handle more complex tasks efficiently.
Future Directions: Improve image processing speed using CUDA technology to reduce delays and enhance usability.
R14. A Real-time Hand Gesture Recognition and Human-Computer Interaction System
Authors: Pei Xu
Published in: 2017
Findings: Developed a real-time gesture-based HCI system using a CNN for recognizing gestures and controlling the mouse/keyboard with a monocular camera.
Dataset: 19,852 images of 16 gesture types collected from 5 users.
Methodology: CNN-based gesture recognition with preprocessing, a Kalman filter for mouse tracking, and a probabilistic model to avoid false gestures.
Performance: Achieved 99.8% gesture recognition accuracy; smooth mouse control using the Kalman filter.
Research Gap: Only static gestures are supported; challenges with background removal and transient gesture detection remain.
Future Directions: Extend to dynamic gesture recognition and complex HCI/HRI systems.

R15. Gesture Recognition Based Mouse Events
Authors: Rachit Puri
Published in: 2017
Findings: Gesture-based mouse control using color caps for finger tracking.
Dataset: Not specified.
Methodology: RGB to YCbCr conversion; MATLAB implementation.
Performance: Smooth mouse events, but distance affects accuracy.
Research Gap: Limited to static gestures; relies on color markers.
Future Directions: Reduce lag; add features like zoom and shutdown.

R16. Hand Gesture Recognition System to Control Soft Front Panels
Authors: H. Renuka, B. Goutam
Published in: 2014
Findings: A dynamic hand gesture recognition system that allows users to control media player operations through hand movements, enhancing Human-Computer Interaction (HCI).
Dataset: A database of 5 hand gestures, with 40 images per gesture, totaling 200 images.
Methodology: Image acquisition: hand gestures are captured using a webcam.
Performance: The system successfully recognizes gestures and executes corresponding operations (e.g., play, pause) based on the number of matched patterns.
Research Gap: The system is effective for static gestures but lacks real-time recognition capabilities and is sensitive to environmental factors like lighting conditions.
Future Directions: Implement real-time gesture recognition capabilities.
R17. Hand Gesture Recognition System Using Camera
Authors: Viraj Shinde, Tushar Bacchav, Jitendra Pawar, Mangesh Sanap
Published in: 2014
Findings: Developed a hand gesture recognition system for HCI.
Dataset: Not specified.
Methodology: Image acquisition, preprocessing, and feature extraction using an adaptive color HSV model and motion history image.
Performance: Not quantified.
Research Gap: Lacks real-time capability; sensitive to lighting.
Future Directions: Implement real-time recognition, improve robustness to environmental factors, and explore advanced ML techniques.
R18. Hand Gesture Recognition Based Presentation System
Authors: Akshaya Ramachandran, Raksha Aruloli, Taran Akshay
Published in: Not specified
Findings: The system successfully uses hand gestures to control PowerPoint presentations.
Dataset: Not explicitly mentioned.
Methodology: Uses OpenCV for video capture, CVZone.HandTrackingModule for hand detection, and NumPy for mathematical operations.
Performance: Accuracy metrics are not provided, but the system is described as working effectively for basic slide navigation and annotation tasks.
Research Gap: Provides a more natural and engaging way to control presentations compared to traditional input devices; may be particularly useful for those with limited mobility.
Future Directions: Incorporate more gestures, add a voice interface, and improve cross-platform compatibility.

R19. Hand Control Using Gesture Recognition
Authors: Sakshi Sharma, Shubhank Sharma, Piyush Yadav
Year: Not stated.
Description: Developed a gesture-controlled robotic hand that can pick and place objects, addressing the need for intuitive control; useful in hazardous environments.
Dataset: Not specified.
Methodology: Arduino, flex sensors, accelerometers, and Bluetooth.
Results: Accurate and precise movements.
Future Work: Incorporate more complex gestures.

R20. Robotic Hand Control Using Gesture Recognition
Authors: Sakshi Sharma, Shubhank Sharma, Piyush Yadav
Year: Not stated.
Description: Developed a gesture-controlled robotic hand that can pick and place objects, addressing the need for intuitive control; useful in hazardous environments.
Dataset: Not specified.
Methodology: Arduino, flex sensors, accelerometers, and Bluetooth.
Results: Accurate and precise movements.
Future Work: Enhance gesture recognition accuracy.

R21. Mouse Controlled Using Hand Gestures Recognition
Authors: Dhananjay Rathod, Sujal Shinde, Pronit Ghosh, Karthika Thevar, Sangita Bhoyar
Published: IJRPR, March 2023
Description: The proposed system enables controlling a computer's mouse functions using hand gestures instead of a physical mouse. The objective is to develop an AI-based virtual mouse system that uses hand gestures as input for cursor movements and clicks, eliminating the need for a physical mouse.
Dataset: The project focuses on real-time video input captured by a webcam for gesture recognition.
Methodology: The methodology involves several stages, from capturing hand gestures via webcam to mapping them onto mouse functions.
Results: The AI-based virtual mouse system demonstrated higher accuracy and performed better than existing models; however, issues were identified with specific functions like right-click and drag-and-select operations.
Limitations: Minor inaccuracies in right-click functions and difficulties in precise text selection through dragging.
Future Work: While the system works effectively, future research could enhance the fingertip detection algorithm to improve accuracy.

R22. Virtual Mouse Using Hand Gesture
Authors: Meenatchi R., Nandan C., Swaroop H.G., Varadharaju S.
Published: International Journal of Current Science (IJCSPUB), May 2023
Description: The study proposes a virtual mouse system controlled by hand gestures to provide a more natural and ergonomic way of interacting with computers.
Dataset: Real-time video input from a webcam serves as the primary data source for gesture recognition.
Methodology: The system detects hand movements using a camera, extracts features using computer vision techniques, and employs machine learning algorithms (e.g., convolutional neural networks or decision trees) to classify gestures.
Results: The virtual mouse demonstrated promising performance, achieving accurate gesture recognition and smooth cursor control.
Limitations: The study highlights challenges such as the limited accuracy of certain gestures and of the computer vision algorithms.
Future Work: The proposed system has the potential to make human-computer interaction more accessible, especially for users with physical impairments.

R23. Hand Gesture Controlled Virtual Mouse Using Artificial Intelligence
Authors: Kavitha R., Janasruthi S. U., Lokitha S., Tharani G.
Year: 2023
Description: The system enhances accessibility for users with physical disabilities and offers an alternative interface for environments where traditional input devices are impractical.
Dataset: The system is trained on a dataset of hand gestures, but specific details regarding the dataset source or size are not provided.
Methodology: Computer vision techniques: uses OpenCV to capture frames from the webcam, convert images from BGR to RGB, and track hand movements. A rectangular region on the screen marks the area for detecting gestures to perform cursor functions.
Results and Limitations: Occasional errors in gesture recognition, especially in challenging lighting environments, and slight delays in processing complex gestures; maintaining recognition accuracy under varying lighting conditions remains a challenge.
Future Work: The authors propose improving emotion detection by incorporating multimodal data, including physiological signals like heart rate and body temperature.
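The BGR-to-RGB preprocessing step described above for R23 amounts to reversing the channel order of each frame: OpenCV delivers webcam frames in BGR order, while hand-tracking models expect RGB. With OpenCV available this is `cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)`; the sketch below illustrates the same operation with NumPy alone (the function name is ours, for illustration):

```python
import numpy as np

def bgr_to_rgb(frame: np.ndarray) -> np.ndarray:
    """Reverse the channel axis of an H x W x 3 image (BGR -> RGB)."""
    return frame[..., ::-1]

# A capture loop would apply this to every webcam frame before passing
# it to the hand tracker, e.g. rgb = bgr_to_rgb(frame).
```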
R24. Virtual Mouse Using Hand Gesture Recognition
Authors: Dr. G. Krishna Kishore, M. Dhyanesh, R. Vijay Kumar, MD. Ameena, M. Surya Teja, R. Vijay Krishna
Published: IJFMR, 2023
Description: This paper introduces a virtual mouse system that leverages hand gesture recognition to control cursor movements without requiring physical input devices.
Dataset: The model works on real-time video input captured from a webcam, though the specific datasets used for training the system are not explicitly mentioned.
Methodology: Gesture recognition and encoding: gestures are encoded using an Enum for standardized gesture identification.
Results: The system demonstrated smooth and responsive gesture recognition; however, occasional issues with gesture misinterpretation in low-light conditions were noted.
Limitations: Lighting sensitivity: the model shows performance drops in poor lighting environments, which can affect gesture detection accuracy; limited gesture set.
Future Work: Enhanced gesture recognition: improve the model's performance under different environmental conditions.

51 | P a g e
R25. Virtual Mouse Control Using Hand Gesture Recognition
Authors: G. N. Srinivas, S. Sanjay Pratap, V. S. Subrahmanyam, K. G. Nagapriya, A. Venkata Srinivasa Rao
Published: IJRET, February 2023
Description: The system achieves cursor movement, left and right clicks, drag-and-drop, and volume adjustments with real-time hand recognition.
Dataset: The system relies on real-time input from a webcam, with gestures detected on the fly rather than using a predefined dataset.
Methodology: Detect hand landmarks and draw bounding boxes around hands; identify which fingers are up or down to trigger specific mouse actions (cursor movement, left/right click, and drag-and-drop).
Results: The system performed well across various light conditions, achieving high accuracy in detecting gestures and executing mouse functions.
Limitations: The system might face challenges with more complex gestures or large variations in hand size among users.
Future Work: Enhanced gesture support: introduce more advanced gestures for additional control. Adaptation in dynamic environments: improve robustness to lighting and background variations.

R26. Gesture-Control Virtual Mouse
Authors: Bharath Kumar Reddy Sandra, Katakam Harsha Vardhan, Ch. Uday, V. Sai Surya, Bala Raju, Dr. Vipin Kumar
Published: IRJMETS, April 2022
Description: The system achieves cursor movement, left and right clicks, scrolling, and other mouse functions using hand gestures captured by a webcam.
Dataset: The experimental evaluation involved 600 hand-labeled gestures, performed by four participants under different lighting and distance conditions.
Methodology: Python with libraries like OpenCV, MediaPipe, PyAutoGUI, and Pynput. Gesture recognition: finger landmarks are identified using MediaPipe, and the algorithm detects raised fingers to trigger specific mouse functions.
Results: The model achieved 99% accuracy for most functions, including scrolling, clicks, and no-action states.
Limitations: Right-click accuracy remains a challenge, requiring further improvement in detection techniques.
Future Work: Improve gesture detection: enhance algorithms for more accurate right-click detection and complex operations.
R27. Virtual Mouse Using Hand Gestures
Authors: C. N. Sujatha, S. P. V. Subbarao, P. Preetham, P. Surajvarma, Y. Upendra
Published: IJCRT, June 2022
Description: Human-Computer Interaction (HCI) benefits from such a system through enhanced accessibility and the removal of hardware dependencies.
Dataset: The system operates in real time using video input from a webcam; no specific datasets were mentioned, as the focus is on live gesture recognition.
Methodology: Python, OpenCV, MediaPipe, and AutoPy libraries were used for computer vision and gesture recognition. Image preprocessing: captured frames are standardized to ensure accurate gesture detection.
Results: The system demonstrated good performance under varying lighting conditions and achieved reliable gesture recognition in real-time scenarios.
Limitations: Hardware limitations: the system's performance depends heavily on camera quality and computer processing power.
Future Work: Adaptive zooming: implement zoom functions based on the user's distance from the camera.
R28. Virtual Mouse Control Using Hand Gesture
Authors: Shudhanshu Ranjan, Sumit Anand, Vinayak Gontia, Aman Chandra, Vineet Sharan, Spandana SG
Published: IJERCSE 11, November 2022
Description: This paper proposes a virtual mouse system controlled through hand gestures, eliminating the need for physical input devices.
Dataset: The system relies on real-time input from a webcam to detect and track hand movements; no external datasets were mentioned.
Methodology: Hand detection: image frames are analyzed, and skin colour extraction is performed to detect hand regions. Gesture recognition: hand movements and angles are analyzed to map specific gestures to mouse operations.
Results: The system demonstrates improved stability compared to earlier gesture-based systems.
Limitations: Limited recognition of complex gestures, highlighting the need for gesture-set expansion.
Future Work: Lighting optimization: improve the algorithm to handle varying lighting environments more effectively.
R29. Controlling Computer Using Hand Gestures
Authors: Pradnya Kedari, Shubhangi Kadam, Rajesh Prasad
Published: IJRET, 6 June 2022
Description: The paper proposes a real-time hand gesture recognition system for controlling a computer using Convolutional Neural Networks (CNNs).
Dataset: The dataset consists of 10 hand gestures assigned to different computer operations.
Methodology: Python, OpenCV, and CNN models (VGG19) were used for training and real-time gesture recognition. After training, the system captures hand images via webcam.
Results: High accuracy in recognizing gestures due to the use of a CNN model.
Limitations: Limited gesture set: only 10 gestures were defined; expanding the gesture set would enhance the system's capabilities.
Future Work: Expand gesture library: add more operations, such as scrolling, volume control, and swiping gestures.
R30. Virtual Mouse Using Hand Gesture (DOI: 10.55041/IJSREM21501)
Authors: Mr. E. Sankar, B. Nitish Bharadwaj, A. V. Vignesh
Published: IJRET, 2023
Description: Developed a virtual mouse system controlled by hand gestures using a webcam.
Dataset: Not specified.
Methodology: OpenCV, MediaPipe, and PyAutoGUI for hand tracking and gesture recognition.
Results: Accurate gesture recognition, but challenges with dragging and clicking precision.
Limitations: Need for improved fingertip detection algorithms and precision.
Future Work: Improve fingertip detection algorithms.

54 | P a g e
55 | P a g e
References

1. Kansal, S. (2023). Volume Control Feature for Gesture Recognition Systems. IEEE.
2. Chitra, R., Prathyusha Reddy, A., & Bamini, A. (2022). Hand Gesture Recognition using
Shape-based Image Processing Techniques for Music Control. IEEE (ICOSEC).
3. Kansal, S. (2023). Gesture-based Volume Control using OpenCV, MediaPipe, and
Pycaw.
4. Chitra, R., et al. (2022). Real-Time Gesture Recognition for Media Control with Shape-
based Processing.
5. Woo, J.-H., & Kim, J.-H. (2020). Environmental Challenges in Gesture Recognition
Systems.
6. Maiga Chang, et al. (2020). Custom Gesture Datasets for Virtual Control Systems.
7. Kochmar, E., et al. (2021). Standardization Challenges in Gesture Recognition.
8. Timms, M. (2016). Machine Learning for Gesture Interfaces.
9. Vujinovic, A., & Luburić, N. (2024). AI-based Gesture Control for Public Spaces.
10. Margono, H., et al. (2024). AI’s Role in Gesture Recognition Usability Studies.
11. Gan, W., et al. (2019). Gesture-based Control Techniques: A Review.
12. Bai, X., & Stede, M. (2022). Advanced Machine Learning for Gesture Recognition.
13. Prada, M.A., et al. (2020). Engineering Use Cases for Gesture Control.
14. Bassner, P., & Frankford, E. (2024). Touchless Interaction for Smart Devices.
15. Seo, K., et al. (2021). Impact of Lighting Conditions on Gesture Control Systems.
16. Tan, D.Y., & Cheah, C.W. (2021). Developing Virtual Mouse Systems using OpenCV.
17. Kim, W.-H., & Kim, J.-H. (2020). Real-Time AI Systems for Gesture Recognition.
18. Lin, C.-C., et al. (2023). Gesture Systems for Sustainable Computing Environments.
19. Brue, R., et al. (2024). Haptic Feedback in Gesture-based Systems.
20. Gan, W., & Sun, Y. (2019). Adaptive Interfaces for Gesture Control.
21. Morris, W., et al. (2024). Evaluating Performance Metrics for Virtual Mouse Systems.
22. Seo, K., & Yoon, D. (2021). AI-based Gesture Systems for Online Learning.
23. Yang, D.Y., & Nagashima, T. (2021). Improving Public Interfaces with Gesture Controls.
24. Arnau-González, P., et al. (2023). Natural Language and Gesture Systems.
25. Margono, H., et al. (2024). Analyzing Gesture Control in Educational Settings.
26. Bassner, P., et al. (2024). Gesture Control for Software Development Tools.
27. Timms, M.J. (2016). Smart Classrooms and Gesture-based Interfaces.
28. Nye, B.D. (2014). Global Trends in AI-based Gesture Systems.
29. Minn, S. (2022). AI-assisted Knowledge and Gesture Systems.
30. Arnau, D., et al. (2023). Integrating Gesture Recognition into Learning Systems.

56 | P a g e
57 | P a g e
Problem
Identification

58 | P a g e
VIRTUAL MOUSE USING HAND GESTURE

Problem Statement
With the rapid evolution of technology, traditional input methods such as keyboards and mice
may no longer be efficient or accessible for all users. There is a growing demand for more
intuitive, contactless, and accessible ways of interacting with digital systems. Hand gesture
recognition offers a potential solution by enabling users to control computers through natural
gestures. This technology is particularly useful for scenarios requiring hands-free control, such as
healthcare, virtual reality environments, or for people with disabilities who find conventional
input devices challenging to use.

Purpose

The purpose of this project is to design and implement a system that allows users to control
computer functions using hand gestures. This approach aims to provide an innovative, user-
friendly, and efficient alternative to traditional input devices. It seeks to demonstrate how human
gestures can seamlessly translate into digital commands, enhancing user experience and
accessibility in a variety of applications.

Scope

This project will focus on developing a prototype that uses a camera or sensors to capture hand
movements and recognize specific gestures. The system will be integrated with basic computer
functionalities, such as navigating files, controlling multimedia, or browsing the web.
Additionally, the project will explore various machine learning models and algorithms for
gesture recognition, ensuring high accuracy and responsiveness.
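As a sketch of the recognition step above: hand-tracking models such as MediaPipe report 21 hand landmarks in normalized image coordinates, and a finger can be treated as raised when its tip lies above its PIP joint (a smaller y value, since image y grows downward). The gesture-to-command mapping below is an illustrative assumption, not a fixed design:

```python
# Fingertip and PIP-joint indices in MediaPipe's 21-landmark hand model
# (index, middle, ring, pinky). The thumb is omitted for simplicity,
# since it is usually judged on the x-axis instead.
TIP_IDS = [8, 12, 16, 20]
PIP_IDS = [6, 10, 14, 18]

def fingers_up(landmarks):
    """landmarks: 21 (x, y) pairs in normalized image coordinates.
    Returns four booleans (index..pinky): True if that finger is raised."""
    return [landmarks[tip][1] < landmarks[pip][1]
            for tip, pip in zip(TIP_IDS, PIP_IDS)]

def gesture_to_command(up):
    """Map a finger-state vector to a command (assumed example mapping)."""
    index_up, middle_up, ring_up, pinky_up = up
    if index_up and not middle_up:
        return "move_cursor"
    if index_up and middle_up and not ring_up:
        return "left_click"
    if not any(up):
        return "drag"
    return "no_action"
```

In a full system these commands would then be forwarded to the operating system, for example through a library such as PyAutoGUI.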

Features

1. Gesture Recognition: Identify predefined hand gestures to control computer operations.

2. Customizable Commands: Allow users to configure gestures for specific tasks.

59 | P a g e
3. Real-Time Processing: Ensure low-latency gesture recognition for smooth interaction.

4. Integration with Existing Systems: Compatible with Windows and macOS operating systems.

5. Camera-Based Control: Operates with standard webcams or dedicated sensors like Leap
Motion.
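Feature 3 (real-time, low-latency control) hinges on mapping the fingertip position in the camera frame to screen coordinates and damping frame-to-frame jitter. A minimal sketch follows; the frame size, screen size, and smoothing factor are assumed values for illustration:

```python
FRAME_W, FRAME_H = 640, 480       # assumed webcam frame size
SCREEN_W, SCREEN_H = 1920, 1080   # assumed screen resolution
ALPHA = 0.3                       # smoothing factor: lower = smoother but laggier

def to_screen(x, y):
    """Scale a frame-space fingertip position to screen space."""
    return x * SCREEN_W / FRAME_W, y * SCREEN_H / FRAME_H

def smooth(prev, target, alpha=ALPHA):
    """Move a fraction of the way from the previous cursor position
    toward the target (exponential smoothing reduces jitter)."""
    px, py = prev
    tx, ty = target
    return px + alpha * (tx - px), py + alpha * (ty - py)

# Per frame: sx, sy = to_screen(fx, fy); cursor = smooth(cursor, (sx, sy));
# then move the OS cursor to `cursor` (e.g., with pyautogui.moveTo).
```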

Advantages

1. Accessibility: Offers hands-free control for individuals with limited mobility or disabilities.

2. Hygienic Use: Ideal for medical and industrial environments where touchless control is
needed.

3. Natural Interaction: Mimics human communication and movement, reducing the learning
curve.

4. Immersive Experience: Enhances applications like VR or augmented reality (AR).

Disadvantages

1. Limited Gesture Set: Complex gestures may be difficult to implement or recognize accurately.

2. Environmental Constraints: Lighting and background noise can affect camera performance.

3. Processing Power: Real-time gesture recognition may require high computational resources.

4. Dependency on Hardware: Performance may vary based on the quality of the camera or
sensor.

60 | P a g e
Project
Portfolio

61 | P a g e
Name of student :Sakshi Navanath Rale

Semester: 5th

Programme/Branch: Computer Engineering (CO)

Roll No: 220443

Title of the Project: VIRTUAL MOUSE USING HAND GESTURE

Name and Designation of Project Guide: Ms. Zaibunnisa Malik


(HOD-CO and Principal Aided),

Name of Institute: M.H. Saboo Siddik Polytechnic (0002)

Name of student : Aafiya Ismail Sayyed

Semester: 5th

Programme/Branch: Computer Engineering (CO)

Roll No: 220446

Title of the Project: VIRTUAL MOUSE USING HAND GESTURE

Name and Designation of Project Guide: Ms. Zaibunnisa Malik


(HOD-CO and Principal Aided),

Name of Institute: M.H. Saboo Siddik Polytechnic (0002)

62 | P a g e
Name of student : Mahek Rauf Fodkar

Semester: 5th

Programme/Branch: Computer Engineering (CO)

Roll No: 230483

Title of the Project: VIRTUAL MOUSE USING HAND GESTURE

Name and Designation of Project Guide: Ms. Zaibunnisa Malik


(HOD-CO and Principal Aided),

Name of Institute: M.H. Saboo Siddik Polytechnic (0002)

63 | P a g e
After Finalization of Project Topic & Formation of Project Team

(Answer to the following questions to be included in ‘Portfolio’ as reflection related to


formation of group and finalization of project topic).

1. How many alternatives we thought before finalizing the project topic?

Ans. We thought of 3 alternatives before finalizing the project topic: Email Spam Filtration, E-commerce Smartphone Application, and Placement Cell - A Web-based Application.

2. Did we consider all the technical fields related to branch of our diploma programme?

Ans. Yes, we considered all the fields.

3. Why we found present project topic as most appropriate?

Ans. Controlling the computer through hand gestures was the most interesting and innovative idea, offering a contactless alternative to traditional input devices. Hence, we decided to make a project on a virtual mouse using hand gesture recognition.

4. Whether all the group members agreed on the present project topic? If not? What were
the reasons of their disagreement?

Ans. Yes, all the members agreed

5. Whether the procedure followed in assessing alternatives and finalizing the project topic
was correct? If not then discuss the reasons.

Ans. Yes, the procedure followed was correct

6. What were the limitations in other alternatives of project topic?

Ans. The alternatives chosen already existed

7. How we formed our team?

Ans. The team was formed during the starting weeks of the term. As all of us had worked together on previous micro-projects, our ability to cooperate and coordinate was good.

8. Whether we faced any problem in forming the team? If yes, then what was the problem
and how was it resolved?

Ans. No problems were faced

9. Am I the leader of our project team? If yes, then why was I chosen? If not, why I could
not become the project team leader?

Ans. Yes, I was chosen by the agreement of other team members

64 | P a g e
10. Do I feel that present team leader is the best choice available in the group? If yes, then
why? If not then why?

Ans. Yes, I intend to work hard and make my team succeed

11. According to me who should be the leader of the team and why?

Ans. All of us work well together; I was chosen by the agreement of the other team members.

12. Can we achieve the targets set in the project work within the time and cost limits?

Ans. We will try our best to complete the project within the provided time and cost limits

13. What are my good/bad sharable experiences while working with my team which
provoked me to think? What I learned from these experiences?

Ans. The experience was good; all of the team members cooperated and worked together. I learned about gesture recognition, working in a team environment, and working with AI.

14. Any other reflection which I would like to write about formation of team and finalization
of project title, if any?

Ans. No

After Finalization of Project Proposal

(Answer to the following questions to be included in ‘Portfolio’ as reflection on planning)

1. Which activities are having maximum risk and uncertainty in our project plan?

Ans. Achieving accurate real-time gesture recognition involves the maximum risk and uncertainty.

2. What are most important activities in our project plan?

Ans. Implementing and testing the gesture recognition module was the most important activity.

3. Is work distribution equal among project group members? If not, what are the reasons? How can we improve work distribution?

Ans. Yes

4. Is it possible to complete the project in the given time? If not, what are the reasons? How can we ensure that the project is completed within time?

Ans. Yes

65 | P a g e
5. What extra care and precaution should be taken in executing the activities of high risk
and uncertainty? If possible, how such risks and uncertainties can be reduced?

Ans. The gesture recognition activities need extra care; such risks and uncertainties can be reduced by testing the system with multiple users and under different lighting conditions.

6. Can we reduce the total cost associated with the project? If yes, then describe the ways.

Ans. Yes, by dividing work equally and preparing the project as early as possible

7. For which activities of our project plan, arrangement of resources is not easy and
convenient?

Ans. Arranging a good-quality camera and suitable testing environments for gesture capture.

8. Did we make enough provisions of extra time/expenditure etc. to carry out such
activities?

Ans. Yes

9. Did we make enough provisions for time delays in our project activity? In which
activities there are more chances of delay?

Ans. No. So far, we have not found any activities that involve significant delays.

10. In our project schedule, which are the days of more expenditure? What provisions we
have made for availability and management of cash?

Ans. The development phase will involve more expenditure. We will manage the availability of cash accordingly.

11. Any other reflection which I would like to write about project planning?

Ans. No

66 | P a g e
67 | P a g e
Project
Logbook

68 | P a g e

You might also like