Virtual Mouse Final Report-1
(2020-2021)
PROJECT REPORT ON
VIRTUAL MOUSE USING HAND GESTURE
BY
220443-SAKSHI RALE
220446-AAFIYA SAYYED
230483-MAHEK FODKAR
Anjuman-I-Islam’s
M.H.Saboo Siddik Polytechnic
8, M.H.Saboo Siddik Polytechnic Road, Mumbai 400008
Certificate
This is to certify that the student with Enrollment No. 2200020350 has completed the Final Project Report titled VIRTUAL MOUSE USING HAND GESTURE during the academic year 2024 –
_____________
Date: ____________ Sign of HOD: _____________
INDEX

Sr.No.  Title                       Marks  Obtained Marks  Faculty Sign with Date
1.      Problem Identification      02
3.      Project Proposal            03
4.      Execution of Plan           02
7.      Project Portfolio           04
        Total                       25
Project
Report
Acknowledgement
We would firstly like to thank our Principal (I/c), Head of the Department and Guide, Ms. Zaibunnisa Malik, for encouraging and motivating us with her guidance and total support for our work. We would also like to thank Ms. Munira Ansari for serving as our sub-guide and making our path much simpler.
We also thank all the teachers who constantly motivated us and shared with us their valuable knowledge of both the procedures involved in carrying out a project and the technical skills required.
We would also like to thank our Principal, Mr. A.K. Qureshi, for providing us this opportunity to build our own project and for constantly supporting us throughout the process.
Abstract
This project explores the development of a virtual mouse controlled by hand gestures, aiming to
provide an innovative way for users to interact with their computers. Traditional mice can be
challenging for some users, particularly those with physical disabilities. By utilizing advanced
sensors and cameras, our system captures hand movements and translates them into
corresponding mouse actions, such as clicking, dragging, and scrolling.
The project involves creating an intuitive interface that recognizes specific gestures, ensuring
smooth and accurate control. Users can navigate applications, websites, and other digital content
without needing a physical mouse, promoting accessibility and ease of use.
This technology not only enhances user experience but also encourages hands-free interaction,
making it suitable for various environments, including homes, offices, and educational settings.
Ultimately, our goal is to make computing more inclusive and efficient, enabling everyone to
engage with technology seamlessly.
Table of Contents

Sr. No.  Chapter

1.  Introduction and Background
    1.1. Introduction
    1.2. Background
    1.3. Motivation
    1.4. Problem Statement
    1.5. Objective and Scope
    1.6. Advantages
    1.7. Disadvantages
    1.8. Limitations
    1.9. Conclusion

2.  Literature Survey
    2.1. Introduction
    2.2. Research Papers
    2.3. References
    2.4. Conclusion

3.  Proposed Methodology
    3.1. System Design
         3.1.1. Introduction
         3.1.2. Block Diagram
    3.4. Conclusion
Chapter 1: Introduction and Background
Content:
1.1. Introduction
1.2. Background
1.3. Motivation
1.4. Problem Statement
1.5. Objective and Scope
1.6. Advantages
1.7. Disadvantages
1.8. Limitations
1.9. Conclusion
1.1. Introduction
In today’s tech-driven world, it’s important to create easy and hands-free ways to control digital
devices. The project "Virtual Mouse Using Hand Gesture" aims to let users control a computer’s
cursor just by using hand gestures. This method removes the need for traditional devices and
makes it easier for users to interact with technology in different settings.
Instead of using physical mice or touchpads, the Virtual Mouse project uses image recognition to
detect and understand hand movements in real time. By recognizing specific gestures, the system
can perform common tasks like moving the cursor, clicking, and scrolling, allowing for a
smooth, contactless experience.
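As a rough illustration (not part of the original report), hand-tracking libraries such as MediaPipe report fingertip positions in normalized 0–1 coordinates, which must be mapped to screen pixels before the cursor can move. The screen size and edge margin in this sketch are assumed values that would need tuning:

```python
def to_screen(norm_x, norm_y, screen_w=1920, screen_h=1080, margin=0.1):
    """Map a normalized (0..1) hand position to screen pixel coordinates.

    A margin is cropped from the camera frame so the fingertip can reach
    the screen edges without leaving the camera's field of view. The
    screen size and margin here are illustrative assumptions.
    """
    # Rescale the usable (margin-cropped) region back to the full 0..1 range.
    span = 1.0 - 2 * margin
    x = min(max((norm_x - margin) / span, 0.0), 1.0)
    y = min(max((norm_y - margin) / span, 0.0), 1.0)
    return int(x * (screen_w - 1)), int(y * (screen_h - 1))
```

A hand held at the frame's center then lands at the screen's center, while positions inside the cropped margin clamp to the screen edge.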
1.2. Background
The "Virtual Mouse Using Hand Gesture" project responds to the need for new and accessible
ways to interact with technology. Traditional devices, like mice and keyboards, can be hard for
some users to use, especially in places that require touch-free operation, like hospitals or smart
homes. This project uses gesture recognition to create a hands-free alternative to traditional
devices. By translating hand movements into cursor actions, users can navigate their computers
without touching anything. As technology moves towards more user-friendly designs, this
project offers a modern solution for more inclusive tech experiences.
1.3. Motivation
The motivation behind the "Virtual Mouse Using Hand Gesture" project is to provide more
accessible and intuitive ways for people to interact with computers. As demand for touchless
technology grows, there’s a need for alternatives to standard input devices, especially for users
with mobility issues or in situations where contact should be minimized.
This project was inspired by observing how users struggle in certain settings, like healthcare,
where cleanliness is crucial, or for those who need special technology. By creating a hands-free
interface that uses simple gestures, the project aims to improve user experience and promote
accessibility.
Traditional devices like mice and keyboards can be hard to use without hands. In places where
touch-free interaction is necessary—like sterile environments, for people with mobility
challenges, or in smart homes—traditional methods may not work well. Also, standard devices
often lack the adaptive interaction that users expect today.
The "Virtual Mouse Using Hand Gesture" project aims to solve these problems by offering a
gesture-based interface that allows users to control a computer's cursor easily. By using gesture
recognition technology, this project provides a touch-free experience that meets the needs of
various users and settings, improving accessibility and usability.
1.5. Objective and Scope

Objective
• To study existing methods for recognizing hand gestures.
• To gather and analyze data on common hand gestures for cursor actions (like moving, clicking, scrolling).
• To use machine learning to interpret hand gestures accurately.
• To create a virtual mouse system that converts hand gestures into real-time cursor movements.
Scope
This project will explore new ways to interact through gesture recognition and real-time processing. It aims to create a hands-free, user-friendly tool for controlling digital devices, overcoming the limitations of traditional input methods. The virtual mouse system is designed for various uses, from helping people with limited mobility to providing touch-free control in clean or specialized environments, leading to a more adaptable and inclusive tech experience.
1.6. Advantages
• Enhanced Accessibility: Makes it easier for users with limited mobility to use a touch-free interface.
• Engaging User Experience: Offers an interactive tool that appeals to tech-savvy users.
• Inclusive Design: Supports a wide range of users, including younger students and those with different accessibility needs.
• Innovative Appeal: Provides a modern approach that stands out from traditional input methods.
1.7. Disadvantages
• Lack of Adaptation for Unique Circumstances: Some users may have specific needs that the gesture system may not effectively meet.
• User Device and Connectivity Constraints: Not all users have access to the necessary devices or reliable internet, limiting the technology’s reach.
• Data Quality and Bias: The system’s performance depends on the quality of training data; biased or insufficient data can lead to inaccuracies.
• Lack of Human Element: A virtual mouse may miss the personal interaction that can enhance user experience.
• Inflexibility: The system may not cater to unique user preferences, reducing its usefulness for some.
• Dependency on Technology: Users might rely too much on gesture technology and struggle with traditional input methods later on.
1.8. Limitations
• Not for Everyone: Some users, particularly those with disabilities, may find gesture systems difficult to use.
• Device and Internet Issues: Users may lack the necessary devices or stable internet access, hindering their ability to use the technology.
• Data Problems: The system functions best with high-quality data. If the data is flawed or biased, gesture recognition may suffer.
• Missing Personal Touch: Technology may lack the personal connection that comes from talking to a real person, important for understanding user needs.
• Rigidity: Some users may have preferences that the gesture system cannot adapt to easily.
• Too Much Dependence: Users might become overly reliant on the technology, making traditional methods harder to use later.
1.9. Conclusion
The virtual mouse project using hand gestures is an important step forward in how we interact
with computers. However, it’s important to recognize its limitations, including accuracy issues
with gesture recognition, privacy concerns about data collection, and the absence of personal
interaction.
To make the most of this project, it should be seen as one tool among many for controlling a
computer. The human aspect remains crucial, especially in understanding individual user needs.
Balancing tech-driven interactions with personalized support is vital for a good user experience.
Continuous improvements to address these limitations will be essential for the project’s success
and its ability to meet the varied needs of users in our fast-evolving tech landscape.
Chapter 2: Literature Survey
Content:
2.1. Introduction
2.2. Research Papers
2.3. References
2.4. Conclusion
2.1. Introduction
Paper Title 1: Volume Control feature for gesture recognition in Augmented and Virtual reality
applications
Author: Shruti Kansal
Published in: IEEE 2023
Abstract: This paper presents a gesture-based volume control system designed specifically for
Virtual and Augmented Reality applications. By utilizing custom gestures for interaction, the
system demonstrates effective recognition and control of volume settings in real-time. While the
results show promise, the study identifies limitations in gesture vocabulary and contextual
challenges during varied user scenarios. Future work aims to broaden the range of gestures and
explore additional applications within VR/AR environments.
Paper Title 2: Hand Gesture Recognition using Shape-based Image Features for Music
Controller
Author: Chitra R., Allam Prathyusha Reddy, Anusha Bamini A.M, Naru Kezia
Published in: IEEE (ICOSEC) 2022
Abstract: This research develops a real-time hand gesture recognition system tailored for
controlling music playback functions. By detecting key shape-based features such as finger
orientation and posture, the system enables users to interact with music players seamlessly.
Despite its effectiveness in recognizing specific gestures, the system encounters challenges in
various environments, particularly with background noise and lighting conditions. Future
directions include expanding the gesture set to accommodate diverse user interactions and
improving robustness across different settings.
Paper Title 3: Volume Control feature for gesture recognition in Augmented and Virtual reality
applications
Author: Shruti Kansal, H Venkateswarareddy, Shanmugasundaram Hariharan, Sasi Kala Rani
K, Andraju Bhanu Prasad, Ponmalar A
Published in: IEEE (InC4) 2023
Abstract: This paper explores advanced gesture recognition techniques focused on volume
control using palm detection and hand tracking within VR/AR contexts. The implementation
utilizes OpenCV, Pycaw, and MediaPipe libraries, showcasing promising real-time performance.
However, the study reveals challenges related to noise sensitivity in video streams and the lack
of standardized gestures. Future applications could extend to various devices, including TVs and
projectors, while enhancing gesture recognition algorithms for improved accuracy.
Paper Title 4: An Approach to Control the PC with Hand Gesture Recognition using Computer
Vision Technique
Author: Kirti Aggarwal, Anuja Arora
Published in: IEEE (INDIAcom) 2022
Abstract: This research introduces a novel hand gesture recognition system that facilitates the
control of virtual mouse and keyboard functions with an impressive accuracy of 95%. Utilizing a
webcam for real-time gesture recognition, the study enhances human-computer interaction (HCI)
by allowing intuitive control without physical devices. Despite its success, the system's
performance is heavily dependent on webcam quality and environmental conditions. Future work
aims to incorporate AI-driven classifiers to boost accuracy and reduce reliance on hardware
quality.
Paper Title 5: Hand Gesture Recognition using Shape-based Image Features for Music
Controller
Author: Chitra R., Allam Prathyusha Reddy, Anusha Bamini A.M., Naru Kezia, Brindha D,
Gyara Beulah
Published in: 2022
Abstract: This paper details the development of a gesture recognition system designed to control
music playback through hand movements. By focusing on shape-based image features, the
system captures gestures in real-time using a webcam. Although it demonstrates effective gesture
recognition, the study highlights limitations regarding the dataset and the impact of real-world
variables such as lighting and gesture diversity. Future enhancements are proposed to include
more complex gestures and improve performance under diverse conditions.
Paper Title 8: Embedded Virtual Mouse System by Using Hand Gesture Recognition
Author: Tsung-Han Tsai, Chih-Chi Huang, Kung-Long Zhang
Published in: IEEE 2015
Abstract: This paper describes an embedded virtual mouse system that utilizes hand gesture
recognition to achieve high accuracy even in challenging environments. Through a series of
tests, the system demonstrates an accuracy range between 82% and 95%. However, it does not
address scalability or performance with complex gestures in varying lighting conditions. Future
research is encouraged to expand the range of recognized gestures and improve the system’s
robustness in diverse real-world applications.
Paper Title 9: Virtual Mouse using Hand Gesture and Color Detection
Author: Amardip Ghodichor, Binitha Chirakattu
Published in: IEEE 2015
Abstract: This research presents a virtual mouse system controlled by hand gestures detected
through color markers. The system effectively achieves precise cursor movement but is heavily
dependent on camera resolution and lighting conditions. While promising, it does not explore
how to handle varying environmental factors. Future developments aim to enhance robustness
against these variables and integrate multimedia functionalities using gesture controls.
Paper Title 10: Design and Development of Hand Gesture Based Virtual Mouse
Author: Kabid Hassan Shibly, Samrat Kumar Dey, Md. Aminul Islam, Shahriar Iftekhar
Showrav
Published in: IEEE 2019
Abstract: This paper discusses the design of a virtual mouse system that enables users to control
mouse functions through hand gestures detected via color detection techniques. The system
achieves an accuracy range of 78-91% on simple backgrounds but significantly drops to 40-42%
in complex environments. The research highlights the need for improvements in performance
across varied lighting and background conditions. Future enhancements may include additional
functionalities such as window manipulation using new gesture inputs.
Paper Title 11: Real-time Virtual Mouse System using RGB-D Images and Fingertip Detection
Author: Dinh-Son Tran, Ngoc-Huynh Ho, Hyung-Jeong Yang, Soo-Hyung Kim, Guee Sang Lee
Published in: SPRINGER 2020
Abstract: This study presents a virtual mouse system that utilizes RGB-D images for real-time
fingertip detection and tracking, allowing users to control a virtual cursor on the screen without
the need for gloves or markers. The system was tested under various conditions, including
different screen resolutions and complex backgrounds. Utilizing hand detection and
segmentation techniques, the system performed well across different lighting levels and
distances, although performance dropped significantly when tracking multiple users, with
accuracy decreasing to 53.35% for six people. Future work aims to expand the system’s
applications to smarter environments and incorporate a wider range of gestures for enhanced user
interaction.
Paper Title 12: A Virtual Mouse for Controlling Laptops in a Smart Home
Author: Clerc-Manne Taing, Pei-Luen Patrick Rau, Hanjing Huang
Published in: Journal Article BY SPRINGER 2017
Abstract: This research introduces Handpad, a virtual mouse system designed specifically for
smart homes, which integrates touchscreen technology with hand movements to control devices.
User testing with 31 participants in typical home settings showed a high acceptance rate, with
users praising the ease of use and practicality. However, performance expectations were
moderate, potentially due to the hedonic nature of tasks involved. The study suggests further
development for seamless integration into smart home ecosystems and additional research to
refine interaction techniques.
Paper Title 14: A Real-time Hand Gesture Recognition and Human-Computer Interaction
System
Author: Pei Xu
Published in: 2017
Abstract: This research introduces a real-time gesture-based human-computer interaction (HCI)
system that employs a convolutional neural network (CNN) for gesture recognition. A dataset of
19,852 images representing 16 different gesture types was collected for training. The system
achieved an impressive gesture recognition accuracy of 99.8% and facilitated smooth mouse
control via a Kalman filter for tracking. Despite its strengths, the system only supports static
gestures and faces challenges in background removal and transient gesture detection. Future
improvements aim to include dynamic gesture recognition and enhance complex HCI and
human-robot interaction capabilities.
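The Kalman-filter smoothing mentioned in this paper can be illustrated with a minimal one-dimensional constant-position filter. The noise parameters below are illustrative choices, not values from the cited work:

```python
class Kalman1D:
    """Minimal 1-D Kalman filter for smoothing a noisy cursor coordinate.

    Models the coordinate as a constant position observed with noise;
    the process/measurement variances below are illustrative values.
    """
    def __init__(self, q=1e-3, r=0.1):
        self.q, self.r = q, r        # process and measurement noise variances
        self.x, self.p = 0.0, 1.0    # state estimate and its variance

    def update(self, z):
        self.p += self.q                    # predict: uncertainty grows
        k = self.p / (self.p + self.r)      # Kalman gain
        self.x += k * (z - self.x)          # correct toward measurement z
        self.p *= (1.0 - k)                 # uncertainty shrinks
        return self.x
```

Running one such filter per axis suppresses frame-to-frame jitter in the detected fingertip position while still tracking deliberate movement.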
Paper Title 16: Hand Gesture Recognition System to Control Soft Front Panels
Author: H. Renuka, B. Goutam
Published in: 2014
Abstract: This research develops a dynamic hand gesture recognition system that enables users
to control media player operations through intuitive hand movements. A dataset comprising five
hand gestures was created, with 40 images per gesture. The system effectively recognizes
gestures and executes corresponding operations, such as play and pause, based on the number of
matched patterns. While successful for static gestures, it lacks real-time recognition capabilities
and is sensitive to environmental factors like lighting conditions. Future work will focus on
implementing real-time gesture recognition capabilities.
fields like gaming and virtual reality. They mention that while the system works well, there are
still areas to improve, especially regarding gesture recognition in different environments.
2.3. References
1. Kansal, S. (2023). Volume Control Feature for Gesture Recognition Systems. IEEE.
2. Chitra, R., Prathyusha Reddy, A., & Bamini, A. (2022). Hand Gesture Recognition using Shape-based Image Processing Techniques for Music Control. IEEE (ICOSEC).
3. Kansal, S. (2023). Gesture-based Volume Control using OpenCV, MediaPipe, and Pycaw.
4. Chitra, R., et al. (2022). Real-Time Gesture Recognition for Media Control with Shape-based Processing.
5. Woo, J.-H., & Kim, J.-H. (2020). Environmental Challenges in Gesture Recognition Systems.
6. Chang, M., et al. (2020). Custom Gesture Datasets for Virtual Control Systems.
7. Kochmar, E., et al. (2021). Standardization Challenges in Gesture Recognition.
8. Timms, M. (2016). Machine Learning for Gesture Interfaces.
9. Vujinovic, A., & Luburić, N. (2024). AI-based Gesture Control for Public Spaces.
10. Margono, H., et al. (2024). AI’s Role in Gesture Recognition Usability Studies.
11. Gan, W., et al. (2019). Gesture-based Control Techniques: A Review.
12. Bai, X., & Stede, M. (2022). Advanced Machine Learning for Gesture Recognition.
13. Prada, M.A., et al. (2020). Engineering Use Cases for Gesture Control.
14. Bassner, P., & Frankford, E. (2024). Touchless Interaction for Smart Devices.
15. Seo, K., et al. (2021). Impact of Lighting Conditions on Gesture Control Systems.
16. Tan, D.Y., & Cheah, C.W. (2021). Developing Virtual Mouse Systems using OpenCV.
17. Kim, W.-H., & Kim, J.-H. (2020). Real-Time AI Systems for Gesture Recognition.
18. Lin, C.-C., et al. (2023). Gesture Systems for Sustainable Computing Environments.
19. Brue, R., et al. (2024). Haptic Feedback in Gesture-based Systems.
20. Gan, W., & Sun, Y. (2019). Adaptive Interfaces for Gesture Control.
21. Morris, W., et al. (2024). Evaluating Performance Metrics for Virtual Mouse Systems.
22. Seo, K., & Yoon, D. (2021). AI-based Gesture Systems for Online Learning.
23. Yang, D.Y., & Nagashima, T. (2021). Improving Public Interfaces with Gesture Controls.
24. Arnau-González, P., et al. (2023). Natural Language and Gesture Systems.
25. Margono, H., et al. (2024). Analyzing Gesture Control in Educational Settings.
26. Bassner, P., et al. (2024). Gesture Control for Software Development Tools.
27. Timms, M.J. (2016). Smart Classrooms and Gesture-based Interfaces.
28. Nye, B.D. (2014). Global Trends in AI-based Gesture Systems.
29. Minn, S. (2022). AI-assisted Knowledge and Gesture Systems.
30. Arnau, D., et al. (2023). Integrating Gesture Recognition into Learning Systems.
2.4. Conclusion
The research on hand gesture recognition systems for controlling virtual mice shows great
promise for improving how we interact with computers. These systems can help people,
especially those with physical disabilities, use computers more easily by allowing them to
control the mouse with their hands instead of a physical device.
While the technology is exciting, there are still some challenges. Many studies noted that the
accuracy of recognizing gestures can be an issue, especially in different lighting or with more
complex movements. There is also a need to add more gestures to make the system even more
useful.
Chapter 3: Proposed Methodology
Content:
3.1. System Design
     3.1.1. Introduction
     3.1.2. Block Diagram
3.4. Conclusion
3.1. System Design
3.1.1. Introduction
The virtual mouse using hand gesture technology aims to provide an intuitive interface for users
to interact with their devices. This system utilizes sensors, machine learning algorithms, and
computer vision techniques to interpret hand movements and gestures, allowing users to control a cursor on the screen without physical contact.
3.1.2. Block Diagram

User → Capture hand gesture movement → Process command → Detect hand gesture for performing the following command → Dataset → Result (successfully executed gesture movement)
Data flow diagrams (DFDs) are used to graphically represent the flow of data in a business information system. A DFD describes the processes involved in moving data from the input to file storage and report generation. Data flow diagrams can be divided into logical and physical. The logical data flow diagram describes the flow of data through a system to perform certain business functionality. The physical data flow diagram describes the implementation of the logical data flow.
3.1.5. Software Design Approach
There are many models/approaches that can be followed while developing a project, such as the V-Model, Incremental Model, Spiral Model, Waterfall Model, RAD Model, and Agile Model. While implementing our project, we decided to choose the Waterfall Model.
Agile Model
Each iteration is considered as a short time "frame" in the Agile process model, which typically
lasts from one to four weeks. The division of the entire project into smaller parts helps to
minimize the project risk and to reduce the overall project delivery time requirements. Each
iteration involves a team working through a full software development life cycle including
planning, requirements analysis, design, coding, and testing before a working product is
demonstrated to the client.
Gantt Chart 1: project schedule showing the start day (over a span of 0–112 days) for each task: Problem Identification, Industrial Survey & Literature Review, Project Proposal, Project Report, Presentation, Project Logbook, and Project Portfolio.
3.4. Conclusion
Hence, the Waterfall Model was selected and the diagrams for our project were designed. We successfully completed the research and the basics of our project, and will soon commence the implementation of the project.
Project
Proposal
VIRTUAL MOUSE USING HAND GESTURE
Rationale
A virtual mouse using hand gestures offers a modern, hands-free way to control computers,
eliminating the need for traditional hardware like a physical mouse. This technology enhances
convenience by allowing users to interact with their devices through natural hand movements,
making tasks easier and more intuitive. It also provides significant benefits for individuals with
disabilities, offering an accessible alternative to conventional input devices that may be
challenging to use.
Another advantage is that using hand gestures can reduce physical strain caused by prolonged
use of a regular mouse, providing a more ergonomic experience. Additionally, this touchless
control system is ideal in settings like hospitals, where maintaining hygiene is essential, as it
minimizes the need for direct contact with devices.
The virtual mouse aligns with future trends, as gesture-based interaction is becoming
increasingly popular in technologies such as virtual and augmented reality. Developing this
system ensures that users stay updated with modern control methods. Furthermore, the virtual
mouse can work across multiple devices, including PCs, laptops, and tablets, making it flexible
and convenient for users in different environments.
Customization is another important feature. Users can adjust gestures according to their
preferences, ensuring a comfortable and user-friendly experience. With real-time gesture
detection and smooth performance, the virtual mouse ensures efficient operation, enhancing the
overall user experience.
In conclusion, a virtual mouse using hand gestures offers a practical, accessible, and future-ready
solution that combines ease of use, flexibility, and modern technology, making it a valuable
innovation.
Introduction
A virtual mouse using hand gestures allows users to control their computer without a physical
mouse by detecting hand movements in real time. This hands-free technology is convenient,
reduces physical strain, and is especially helpful for people with disabilities. It’s also useful in
places like hospitals, where avoiding touch is important. As gesture controls gain popularity in
virtual and augmented reality, the virtual mouse works on various devices like PCs and laptops,
making it a flexible and modern solution.
• Purpose
The purpose of the virtual mouse using hand gestures is to give users a hands-free and easy way
to control their computers. It aims to make using computers easier for people with disabilities
and improve convenience by removing the need for a regular mouse. Additionally, it helps
reduce physical strain from long computer use and promotes hygiene in places where touching
devices should be avoided. By using gesture recognition technology, the virtual mouse provides
a modern and flexible solution that works well on different devices, keeping users up-to-date
with new technology trends.
• Scope
The system lets users control computers using hand gestures instead of a mouse. It helps people
with disabilities use technology more easily. The virtual mouse works on personal computers and
laptops for wider use. It uses real-time gesture detection for quick and accurate responses. Users
can customize gestures and settings to fit their needs. The virtual mouse is useful in homes,
offices, schools, and hospitals.
Literature Survey
Research on virtual mice using hand gestures highlights key areas of focus. Studies explore
gesture recognition technology, utilizing computer vision and machine learning to enhance
accuracy. User experience research shows that simple gestures improve satisfaction and ease of
learning. Additionally, virtual mice assist individuals with disabilities and promote hygiene in
public spaces. There is also exploration of gesture control in augmented and virtual reality
(AR/VR) for immersive experiences. Overall, the findings emphasize the potential for innovation
in this field.
Problem Definition
The virtual mouse using hand gestures addresses issues with traditional mouse devices, which
can be difficult for many users. Accessibility is a concern for individuals with disabilities who
struggle with standard mice. Long-term use of traditional mice can cause discomfort and strain
on the hands and wrists. Hygiene is important in settings like hospitals, where physical mice can
spread germs. Additionally, traditional controls can be confusing, requiring users to learn
specific movements. With the rise of virtual and augmented reality technologies, there is a
growing need for systems that enable natural interactions. The virtual mouse aims to provide a
more accessible and hygienic way for people to use computers.
Proposed Methodology
The system uses a camera to capture hand movements and analyzes the video to detect specific
gestures like pointing, clicking, and scrolling. After detecting gestures, the system converts them
into commands that the computer can understand, filtering out background noise and recognizing
patterns.
A simple interface helps users understand how to use the virtual mouse by providing visual
feedback, such as highlighting areas for clicking or scrolling. Users can customize gestures and
settings to fit their preferences, making the experience more personal and easier to use.
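One simple way to realize the noise filtering described above is to act on a gesture only when it dominates the last few camera frames, so a single misdetected frame cannot trigger a command. The window size and gesture labels in this sketch are illustrative assumptions, not the project's actual design:

```python
from collections import Counter, deque

class GestureDebouncer:
    """Emit a gesture label only when it dominates the last `window` frames.

    This suppresses one-frame misdetections from the recognizer; the
    window size of 5 frames is an illustrative choice.
    """
    def __init__(self, window=5):
        self.frames = deque(maxlen=window)

    def update(self, label):
        self.frames.append(label)
        gesture, count = Counter(self.frames).most_common(1)[0]
        # Require a strict majority of the window before acting.
        if count > self.frames.maxlen // 2:
            return gesture
        return None
```

Feeding the per-frame recognizer output through this filter means a stray "click" frame in the middle of a "point" sequence is ignored rather than producing an accidental click.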
• Aim
The aim is to create a virtual mouse that lets users control their computer using hand
gestures. It will be an easy, hands-free alternative to a regular mouse, improve
accessibility, work smoothly in real-time, and be compatible with different devices.
• Objective
• Create a hands-free system to control computers without a physical mouse.
Resources
• Hardware
• Suitable computer system hardware.
• CPU: A multi-core processor with a clock speed of at least 2.5 GHz is recommended.
• User Devices: Users will use the virtual mouse on their own computers and laptops. It is important to make sure the virtual mouse works well on all types of devices and screen sizes for a good experience.
• Software
Programming Language: Python – used to write the code for the virtual mouse.
Code Editors: VS Code, PyCharm, or Jupyter Notebook – used to write and test the code.
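As a sketch of the kind of Python logic such a system typically contains (an assumption for illustration, not the project's actual code), a click can be triggered when the thumb and index fingertips come closer than a tuned threshold in normalized image coordinates:

```python
import math

def is_pinch(thumb_tip, index_tip, threshold=0.05):
    """Return True when the thumb and index fingertips are close enough
    (in normalized 0..1 image coordinates) to count as a 'click' pinch.

    The 0.05 threshold is an illustrative value that would need tuning
    against the real camera setup and hand sizes.
    """
    dx = thumb_tip[0] - index_tip[0]
    dy = thumb_tip[1] - index_tip[1]
    return math.hypot(dx, dy) < threshold
```

In a full system this check would run each frame on the fingertip coordinates reported by the hand tracker, issuing a click event on the transition from open to pinched.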
Industrial Survey & Literature Review
VIRTUAL MOUSE USING HAND GESTURE
Abstract
The "Virtual Mouse Using Hand Gestures" project offers a unique way to interact with
computers without a physical mouse. By using a camera, it tracks your hand movements and
gestures, allowing you to move the cursor, click, drag, and scroll just by moving your hands in
front of the screen.
This technology is powered by gesture-detecting software. It identifies your hands and interprets
commands based on how you move them, making it intuitive to use. For example, raising your
hand might move the cursor up, while a specific gesture can simulate a click.
Built with popular tools like OpenCV and MediaPipe, this project is designed for various
situations. It’s perfect for public spaces where hygiene is important, as it avoids touching
surfaces. It’s also great for gamers who want a more immersive experience. Additionally, it
provides accessibility for people with disabilities who may struggle with traditional mice.
Overall, this innovative system aims to make computer interaction more convenient and user-
friendly for everyone.
Literature Review

Each entry below records: Title, Name of Authors, Year of Publication, Findings, Datasets, Methodology, Performance, Research Gap / Observations, and Future Directions.
R1. Title: Volume Control Feature for Gesture Recognition in Augmented and Virtual Reality Applications
Authors: Shruti Kansal. Publication: IEEE, 2023. https://ieeexplore.ieee.org/document/10263252
Findings: Developed a gesture-based volume control system for VR/AR applications.
Datasets: Custom gestures dataset.
Methodology: Utilized OpenCV, MediaPipe, and Pycaw for implementation.
Performance: Promising results in gesture recognition and control.
Research Gap / Observations: Limited gesture vocabulary; context-specific challenges observed.
Future Directions: Expand gesture recognition capabilities and applications.
R2/R3. Authors: Prasad, Ponmalar A.

R4. Title: An Approach to Control the PC with Hand Gesture Recognition using Computer Vision Technique
Authors: Kirti Aggarwal, Anuja Arora. Publication: IEEE (INDIACom), 2022. https://ieeexplore.ieee.org/document/9763282
Findings: Introduced a hand gesture recognition system using OpenCV and Python, with object tracking for cursor control via color detection and a virtual keyboard implemented, controlling a virtual mouse and keyboard with 95% accuracy and improving HCI.
Datasets: Real-time images from a webcam.
Methodology: Vision-based gesture recognition system using computer vision to control a virtual mouse and keyboard.
Performance: 95% accuracy in controlling cursor and click events.
Research Gap / Observations: Dependent on webcam quality; background colors can cause recognition errors; performance is lower on lower PC configurations.
Future Directions: Use AI-trained classifiers to enhance recognition accuracy without extra hardware.
R5. Title: Hand Gesture Recognition using Shape-based Image Features for Music Controller
Authors: Chitra R., Allam Prathyusha Reddy, Anusha Bamini A.M., Naru Kezia, Brindha D., Gyara Beulah. Publication: 2022.
Findings: Developed a system for recognizing hand gestures to control music playback by detecting key shape-based features like finger orientation and posture. Intended for multitasking environments, it allows control of music (play, pause, forward) with simple hand movements captured through a webcam.
Datasets: Custom image dataset of hand gestures captured via webcam.
Methodology: Used image processing techniques like background subtraction, binary image conversion, skin tone detection with the HSV model, and finger counting. Developed a custom algorithm for contour-based gesture detection and used point pattern matching for gesture recognition.
Performance: Achieved real-time gesture recognition for music control with simple gestures. The system can recognize gestures with basic classification and accuracy.
Research Gap / Observations: Limited dataset; additional real-world challenges (like illumination variation and gesture diversity) are not fully addressed.
Future Directions: Expanding to support more complex gestures and improve performance in variable real-world conditions (e.g., lighting, clutter).
R6. Title: Design of Human-Computer Interaction Control System Based on Hand-Gesture Recognition
Authors: Wang Zhi-heng, Cao Jiang-tao, Liu Jin-guo, Zhao Zi-qi. Publication: IEEE, 2017.
Findings: Proposed a Human-Computer Interaction (HCI) control system utilizing an improved PSO-SVM algorithm for gesture recognition. The system improved recognition accuracy and real-time response for robotic control.
Datasets: Gesture samples collected via a five-bending-sensor data glove; 11 gestures collected with 20 samples per gesture.
Methodology: Improved PSO-SVM algorithm: a Particle Swarm Optimization (PSO) algorithm optimized the SVM kernel parameters. Gesture recognition was integrated into HCI for real-time control of a robot via wireless transmission.
Performance: Gesture recognition rates achieved 85%-100%.
Research Gap / Observations: Traditional PSO algorithms tend to converge to local optima. This research addresses this limitation with an improved PSO-based optimization method.
Future Directions: Explore broader real-world applications of gesture recognition-based systems.
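R6 builds on Particle Swarm Optimization to tune SVM kernel parameters. As an illustration of the plain PSO idea that the paper improves upon, the sketch below minimizes a one-dimensional function; the swarm size, inertia, and acceleration coefficients are generic textbook defaults, not the paper's settings.

```python
import random

def pso_minimize(f, lo, hi, n_particles=20, iters=60,
                 w=0.6, c1=1.5, c2=1.5, seed=0):
    """Plain PSO: particles move under inertia plus pulls toward their
    personal best and the swarm's global best position."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]  # positions
    vs = [0.0] * n_particles                                # velocities
    pbest = xs[:]                                           # personal bests
    gbest = min(xs, key=f)                                  # global best
    for _ in range(iters):
        for i in range(n_particles):
            # Velocity update: inertia + cognitive + social terms.
            vs[i] = (w * vs[i]
                     + c1 * rng.random() * (pbest[i] - xs[i])
                     + c2 * rng.random() * (gbest - xs[i]))
            xs[i] = min(max(xs[i] + vs[i], lo), hi)  # clamp to bounds
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=f)
    return gbest
```

In the paper's setting, the objective would be SVM cross-validation error as a function of kernel parameters rather than a toy quadratic.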
R7. Title: Gesture Recognition Based Virtual Mouse and Keyboard
Authors: Sugnik Roy Chowdhury, Sumit Pathak, M.D. Anto Praveena. Publication: IEEE (ICOEI), 2020.
Findings: Developed a virtual mouse and keyboard using hand gesture recognition and image processing. The system maps gestures to mouse clicks and keyboard inputs.
Datasets: Webcam feed capturing hand gestures in real time. No external hardware is required except a camera.
Methodology: Convex Hull algorithm used to detect hand gestures and map defects to mouse and keyboard functions. Python on the Anaconda platform for code implementation.
Performance: Successfully emulates mouse and keyboard functions; gesture-based inputs can achieve seamless control in various applications like architecture, medical science, and 3D modeling.
Research Gap / Observations: The Convex Hull algorithm has limitations when dealing with noise or defects. Limited to basic gestures due to the constraint of a camera setup that must capture five fingers.
Future Directions: Expand the system to work in augmented reality with 3D object interaction. Implement a multi-dimensional camera setup to capture gestures from different axes.
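R7's pipeline rests on computing the convex hull of the hand contour and inspecting its defects (contour points far inside the hull, i.e. the valleys between fingers). A minimal convex-hull sketch using Andrew's monotone-chain algorithm, a standard method chosen here for illustration rather than the exact routine the paper's implementation uses:

```python
def cross(o, a, b):
    """Z-component of (a - o) x (b - o); > 0 means a counter-clockwise turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Return hull vertices in counter-clockwise order (monotone chain)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:                       # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):             # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Last point of each half is the first point of the other half.
    return lower[:-1] + upper[:-1]
```

Interior contour points (the finger valleys) are exactly those the hull drops; comparing contour and hull is what yields the convexity defects such systems count.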
R8. Title: Embedded Virtual Mouse System by Using Hand Gesture Recognition
Authors: Tsung-Han Tsai, Chih-Chi Huang, Kung-Long Zhang. Publication: IEEE, 2015.
Findings: Developed a virtual mouse system using hand gesture recognition with high accuracy in harsh environments.
Datasets: 500 tests of four different hand gestures for accuracy evaluation.
Methodology: Used skin detection, motion detection, a labeling algorithm, the convex hull algorithm, and an FSM for mouse functions.
Performance: Gesture recognition accuracy ranged from 82% to 95%.
Research Gap / Observations: The paper did not address scalability or how the system would handle complex, dynamic gestures or interactions with different lighting conditions or skin tones.
Future Directions: Expand gesture range, improve environmental robustness, and explore immersive HCI applications.
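R8 drives its mouse functions with a finite state machine (FSM). As an illustration of that idea, a tiny FSM can turn a per-frame "pinch detected" flag into click and drag events; the state names and the frame threshold here are invented for the sketch, not taken from the paper.

```python
class MouseFSM:
    """Turn a per-frame gesture flag into click / drag events."""

    def __init__(self, drag_after=5):
        self.state = "idle"
        self.frames = 0
        self.drag_after = drag_after  # frames of pinch before a drag starts

    def step(self, pinched: bool) -> str:
        """Feed one frame's gesture flag; return the event it triggers."""
        if self.state == "idle":
            if pinched:
                self.state, self.frames = "pressed", 1
            return "none"
        if self.state == "pressed":
            if not pinched:
                self.state = "idle"
                return "click"            # short pinch -> click
            self.frames += 1
            if self.frames >= self.drag_after:
                self.state = "dragging"
                return "drag_start"       # sustained pinch -> drag
            return "none"
        # state == "dragging"
        if not pinched:
            self.state = "idle"
            return "drag_end"
        return "dragging"
```

Debouncing gestures through states like this is what prevents a single noisy frame from firing spurious clicks.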
R9. Title: Virtual Mouse using Hand Gesture and Color Detection
Authors: Amardip Ghodichor, Binitha Chirakattu. Publication: IEEE, 2015.
Findings: Developed a virtual mouse system where cursor movement is controlled using hand gestures detected via color markers, with no additional hardware required beyond a webcam.
Datasets: Not explicitly mentioned; system development and testing involved real-time images captured through a webcam.
Methodology: Utilized color detection, image processing, and object tracking in MATLAB. The system used colored gloves for finger detection.
Performance: The system demonstrated good precision in cursor movement through color detection but depended on camera resolution for optimal performance.
Research Gap / Observations: No mention of how the system would handle varying lighting conditions or different colors, and the dependency on camera quality could limit its application.
Future Directions: Further development could include improving robustness against environmental factors and integrating multimedia services using hand gestures.
R10. Title: Design and Development of Hand Gesture Based Virtual Mouse
Authors: Kabid Hassan Shibly, Samrat Kumar Dey, Md. Aminul Islam, Shahriar Iftekhar Showrav. Publication: IEEE, 2019.
Findings: The proposed system eliminates the need for a physical mouse, allowing users to control mouse functions using hand gestures detected via colored caps on fingertips, processed through computer vision techniques.
Datasets: Real-time frames captured using a webcam for color detection and gesture tracking.
Methodology: The system uses Python and OpenCV, capturing video frames via a webcam, performing color detection, and recognizing hand gestures to control mouse movements and clicks.
Performance: The system demonstrated an accuracy rate of 78-91% on plain backgrounds but dropped significantly (40-42%) on complex backgrounds.
Research Gap / Observations: The system is heavily dependent on lighting conditions and background simplicity for optimal performance, and it struggles with accuracy in complex environments.
Future Directions: Improvements include better performance under varied lighting and backgrounds, and adding more functionalities like window manipulation using additional gestures.
R11. Title: Real-time Virtual Mouse System using RGB-D Images and Fingertip Detection
Authors: Dinh-Son Tran, Ngoc-Huynh Ho, Hyung-Jeong Yang, Soo-Hyung Kim, Guee Sang Lee. Publication: SPRINGER, 2020.
Findings: The proposed system offers fingertip detection and tracking in real time, allowing users to control a virtual mouse on a screen without using gloves or markers.
Datasets: Tested with four different screen resolutions, under normal lighting and different conditions, including complex backgrounds.
Methodology: Hand detection and segmentation: hand regions are extracted from depth images using Kinect V2. Hand contours are obtained through a border-tracing algorithm.
Performance: The system performed well across different lighting levels, backgrounds, and distances. Performance decreased with more people tracked (accuracy dropped to 53.35% for six people).
Research Gap / Observations: The accuracy dropped as the number of people increased, and slight confusion occurred with rapid fingertip movements, especially for more complex gestures.
Future Directions: Expand the system's application to other smart environments, incorporating a wider range of gestures.
R12. Title: A Virtual Mouse for Controlling Laptops in a Smart Home
Authors: Clerc-Manne Taing, Pei-Luen Patrick Rau, Hanjing Huang. Publication: Journal article, SPRINGER, 2017.
Findings: Introduced Handpad, a virtual mouse system for smart homes that integrates touchscreens with hand movements to control devices.
Datasets: User tests conducted with 31 participants in two scenarios: living room and bedroom.
Methodology: Uses a large multi-touch surface (tablet) to track hand movements for controlling a laptop's onscreen cursor in smart home environments.
Performance: High acceptance rate from users, with positive feedback on ease of use and practicality, but moderate performance expectations.
Research Gap / Observations: Performance expectancy was lower, possibly due to the hedonic nature of the tasks.
Future Directions: Further development for seamless integration into smart home environments and additional studies for refining interaction techniques.
R13. Title: A Virtual Mouse System Using Finger-Gestures of Twisting-in
Authors: Takashi Kihara, Makio Ishihara. Publication: SPRINGER, 2011.
Findings: Developed a virtual mouse system using a twisting finger gesture to control mouse events. The system allows for intuitive interactions like twisting, pushing, and dragging, simulating real-world actions.
Datasets: 5 participants performed a task of pointing and clicking within rectangles 25 times each using this system, a traditional mouse, and a touchpad.
Methodology: Uses a single camera to track the user's fingertip and detects twisting gestures to generate mouse events like clicking.
Performance: The system required more time to complete tasks compared to a regular mouse due to processing delays, but it showed potential for tasks with a lower index of difficulty.
Research Gap / Observations: The system's processing speed needs improvement to handle more complex tasks efficiently.
Future Directions: Plan to improve the image processing speed using CUDA technology to reduce delays and enhance usability.
R14. Title: A Real-time Hand Gesture Recognition and Human-Computer Interaction System
Authors: Pei Xu. Publication: 2017.
Findings: Developed a real-time gesture-based HCI system using a CNN for recognizing gestures and controlling the mouse/keyboard with a monocular camera.
Datasets: 19,852 images of 16 gesture types collected from 5 users.
Methodology: CNN-based gesture recognition with preprocessing, a Kalman filter for mouse tracking, and a probabilistic model to avoid false gestures.
Performance: Achieved 99.8% gesture recognition accuracy; smooth mouse control using the Kalman filter.
Research Gap / Observations: Only static gestures are supported; challenges with background removal and transient gesture detection remain.
Future Directions: Extend to dynamic gesture recognition and complex HCI/HRI systems.
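R14 smooths cursor tracking with a Kalman filter. A minimal one-dimensional constant-position version of that idea is sketched below (in practice one filter would run per axis, or a 2-D state with velocity would be used); the noise variances are illustrative, not the paper's values.

```python
class Kalman1D:
    """1-D Kalman filter: blend each noisy measurement with the estimate."""

    def __init__(self, q=1e-3, r=1e-1):
        self.q = q          # process noise variance (how fast the state drifts)
        self.r = r          # measurement noise variance (how noisy readings are)
        self.x = None       # state estimate
        self.p = 1.0        # estimate variance

    def update(self, z: float) -> float:
        if self.x is None:          # initialize on the first measurement
            self.x = z
            return self.x
        self.p += self.q                   # predict: uncertainty grows
        k = self.p / (self.p + self.r)     # Kalman gain in [0, 1]
        self.x += k * (z - self.x)         # correct toward the measurement
        self.p *= (1.0 - k)                # uncertainty shrinks after update
        return self.x
```

A large gain trusts the new measurement; a small gain trusts the running estimate, which is what suppresses frame-to-frame jitter in fingertip positions.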
R15. Title: Gesture Recognition Based Mouse Events
Authors: Rachit Puri. Publication: 2017.
Findings: Gesture-based mouse control using color caps for finger tracking.
Datasets: Not specified.
Methodology: RGB to YCbCr conversion, MATLAB implementation.
Performance: Smooth mouse events, but distance affects accuracy.
Research Gap / Observations: Limited to static gestures; relies on color markers.
Future Directions: Reduce lag; add features like zoom and shutdown.
R16. Title: Hand Gesture Recognition System to Control Soft Front Panels
Authors: H. Renuka, B. Goutam. Publication: 2014.
Findings: A dynamic hand gesture recognition system that allows users to control media player operations through hand movements, enhancing Human-Computer Interaction (HCI).
Datasets: A database of 5 hand gestures with 40 images per gesture, totaling 200 images.
Methodology: Image acquisition: hand gestures are captured using a webcam.
Performance: The system successfully recognizes gestures and executes corresponding operations (e.g., play, pause) based on the number of matched patterns.
Research Gap / Observations: The system is effective for static gestures but lacks real-time recognition capabilities and is sensitive to environmental factors like lighting conditions.
Future Directions: Implement real-time gesture recognition capabilities.
R17. Title: Hand Gesture Recognition System Using Camera
Authors: Viraj Shinde, Tushar Bacchav, Jitendra Pawar, Mangesh Sanap. Publication: 2014.
Findings: Developed a hand gesture recognition system for HCI.
Datasets: Not specified.
Methodology: Image acquisition, preprocessing, and feature extraction using an adaptive color HSV model and motion history image.
Performance: Not quantified.
Research Gap / Observations: Lacks real-time capability; sensitive to lighting.
Future Directions: Implement real-time recognition, improve robustness to environmental factors, and explore advanced ML techniques.
R18. Title: Hand Gesture Recognition Based Presentation System
Authors: Akshaya Ramachandran, Raksha Aruloli, Taran Akshay. Publication: Not specified.
Findings: The system successfully uses hand gestures to control PowerPoint presentations.
Datasets: Not explicitly mentioned.
Methodology: Uses OpenCV for video capture, CVZone.HandTrackingModule for hand detection, and NumPy for mathematical operations.
Performance: Accuracy metrics are not provided, but the system is described as working effectively for basic slide navigation and annotation tasks.
Research Gap / Observations: Provides a more natural and engaging way to control presentations compared to traditional input devices. May be particularly useful for those with limited mobility.
Future Directions: Incorporate more gestures, add a voice interface, and improve cross-platform compatibility.
R21. Title: Mouse controlled using hand gestures recognization
Authors: Dhananjay Rathod, Sujal Shinde, Pronit Ghosh, Karthika Thevar, Sangita Bhoyar. Publication: IJRPR, March 2023.
Findings: The proposed system enables controlling a computer's mouse functions using hand gestures instead of a physical mouse. The AI-based virtual mouse system showed higher accuracy and performed better than existing models.
Datasets: The project focuses on real-time video input using a webcam for gesture recognition.
Methodology: The methodology involves several stages, from capturing hand gestures via webcam to mapping them onto mouse functions. System overview: the objective is to develop an AI-based virtual mouse system that uses hand gestures as input for cursor movements and clicks, eliminating the need for a physical mouse.
Performance: The system demonstrated superior accuracy compared to other models. However, issues were identified with specific functions like right-click and drag-and-select operations.
Research Gap / Observations: Current limitations include minor inaccuracies in right-click functions and difficulties in precise text selection through dragging.
Future Directions: While the system works effectively, future research could enhance the fingertip detection algorithm to improve accuracy.
R22. Title: Virtual Mouse Using Hand Gesture
Authors: Meenatchi R., Nandan C., Swaroop H.G., Varadharaju S. Publication: International Journal of Current Science (IJCSPUB), May 2023.
Findings: The study proposes a virtual mouse system controlled by hand gestures to provide a more natural and ergonomic way of interacting with computers.
Datasets: Real-time video input from a webcam serves as the primary data source for gesture recognition.
Methodology: Gesture recognition: the system detects hand movements using a camera, extracts features using computer vision techniques, and employs machine learning algorithms (e.g., convolutional neural networks or decision trees) to classify gestures.
Performance: The virtual mouse demonstrated promising performance, achieving accurate gesture recognition and smooth cursor control.
Research Gap / Observations: The study highlights challenges such as the limited accuracy of certain gestures in computer vision algorithms.
Future Directions: The proposed system has the potential to make human-computer interaction more accessible, especially for users with physical impairments.
R25. Title: Virtual Mouse Control Using Hand Gesture Recognition
Authors: G. N. Srinivas, S. Sanjay Pratap, V. S. Subrahmanyam, K. G. Nagapriya, A. Venkata Srinivasa Rao. Publication: IJRET, Feb 2023.
Findings: The system achieves cursor movement, left and right clicks, drag-and-drop, and volume adjustments with real-time hand recognition.
Datasets: The system relies on real-time input from a webcam, with gestures detected on the fly rather than using a predefined dataset.
Methodology: Detect hand landmarks and draw bounding boxes around hands. Identify which fingers are up or down to trigger specific mouse actions: cursor movement, left/right click, and drag-and-drop.
Performance: The system performed well across various light conditions, achieving high accuracy in detecting gestures and executing mouse functions.
Research Gap / Observations: The system might face challenges with more complex gestures or large variations in hand size among users.
Future Directions: Enhanced gesture support: introduce more advanced gestures for additional control. Adaptation in dynamic environments: improve robustness to lighting and background variations.
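R25's method decides which fingers are up from hand landmarks. A sketch of that rule follows, assuming MediaPipe-style landmark indexing (fingertip/middle-joint pairs 8/6, 12/10, 16/14, 20/18) and image coordinates where y increases downward; the gesture-to-action mapping at the end is a toy example, not the paper's exact scheme, and the thumb is skipped for simplicity.

```python
# (fingertip index, PIP-joint index) for index, middle, ring, pinky fingers,
# following the MediaPipe Hands landmark convention.
FINGERS = [(8, 6), (12, 10), (16, 14), (20, 18)]

def fingers_up(landmarks: list[tuple[float, float]]) -> list[bool]:
    """A finger counts as 'up' when its tip lies above its middle joint
    (smaller y in image coordinates)."""
    return [landmarks[tip][1] < landmarks[pip][1] for tip, pip in FINGERS]

def gesture(landmarks) -> str:
    """Toy mapping from finger states to mouse actions (illustrative)."""
    up = fingers_up(landmarks)
    if up == [True, False, False, False]:
        return "move"        # index finger only -> move cursor
    if up == [True, True, False, False]:
        return "click"       # index + middle -> click
    return "none"
```

A real system would feed the 21 landmarks returned by the hand tracker per frame into `fingers_up` and dispatch the resulting action.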
References
1. Kansal, S. (2023). Volume Control Feature for Gesture Recognition Systems. IEEE.
2. Chitra, R., Prathyusha Reddy, A., & Bamini, A. (2022). Hand Gesture Recognition using Shape-based Image Processing Techniques for Music Control. IEEE (ICOSEC).
3. Kansal, S. (2023). Gesture-based Volume Control using OpenCV, MediaPipe, and Pycaw.
4. Chitra, R., et al. (2022). Real-Time Gesture Recognition for Media Control with Shape-based Processing.
5. Woo, J.-H., & Kim, J.-H. (2020). Environmental Challenges in Gesture Recognition Systems.
6. Chang, M., et al. (2020). Custom Gesture Datasets for Virtual Control Systems.
7. Kochmar, E., et al. (2021). Standardization Challenges in Gesture Recognition.
8. Timms, M. (2016). Machine Learning for Gesture Interfaces.
9. Vujinovic, A., & Luburić, N. (2024). AI-based Gesture Control for Public Spaces.
10. Margono, H., et al. (2024). AI's Role in Gesture Recognition Usability Studies.
11. Gan, W., et al. (2019). Gesture-based Control Techniques: A Review.
12. Bai, X., & Stede, M. (2022). Advanced Machine Learning for Gesture Recognition.
13. Prada, M.A., et al. (2020). Engineering Use Cases for Gesture Control.
14. Bassner, P., & Frankford, E. (2024). Touchless Interaction for Smart Devices.
15. Seo, K., et al. (2021). Impact of Lighting Conditions on Gesture Control Systems.
16. Tan, D.Y., & Cheah, C.W. (2021). Developing Virtual Mouse Systems using OpenCV.
17. Kim, W.-H., & Kim, J.-H. (2020). Real-Time AI Systems for Gesture Recognition.
18. Lin, C.-C., et al. (2023). Gesture Systems for Sustainable Computing Environments.
19. Brue, R., et al. (2024). Haptic Feedback in Gesture-based Systems.
20. Gan, W., & Sun, Y. (2019). Adaptive Interfaces for Gesture Control.
21. Morris, W., et al. (2024). Evaluating Performance Metrics for Virtual Mouse Systems.
22. Seo, K., & Yoon, D. (2021). AI-based Gesture Systems for Online Learning.
23. Yang, D.Y., & Nagashima, T. (2021). Improving Public Interfaces with Gesture Controls.
24. Arnau-González, P., et al. (2023). Natural Language and Gesture Systems.
25. Margono, H., et al. (2024). Analyzing Gesture Control in Educational Settings.
26. Bassner, P., et al. (2024). Gesture Control for Software Development Tools.
27. Timms, M.J. (2016). Smart Classrooms and Gesture-based Interfaces.
28. Nye, B.D. (2014). Global Trends in AI-based Gesture Systems.
29. Minn, S. (2022). AI-assisted Knowledge and Gesture Systems.
30. Arnau, D., et al. (2023). Integrating Gesture Recognition into Learning Systems.
Problem
Identification
VIRTUAL MOUSE USING HAND GESTURE
Problem Statement
With the rapid evolution of technology, traditional input methods such as keyboards and mice
may no longer be efficient or accessible for all users. There is a growing demand for more
intuitive, contactless, and accessible ways of interacting with digital systems. Hand gesture
recognition offers a potential solution by enabling users to control computers through natural
gestures. This technology is particularly useful for scenarios requiring hands-free control, such as
healthcare, virtual reality environments, or for people with disabilities who find conventional
input devices challenging to use.
Purpose
The purpose of this project is to design and implement a system that allows users to control
computer functions using hand gestures. This approach aims to provide an innovative, user-
friendly, and efficient alternative to traditional input devices. It seeks to demonstrate how human
gestures can seamlessly translate into digital commands, enhancing user experience and
accessibility in a variety of applications.
Scope
This project will focus on developing a prototype that uses a camera or sensors to capture hand
movements and recognize specific gestures. The system will be integrated with basic computer
functionalities, such as navigating files, controlling multimedia, or browsing the web.
Additionally, the project will explore various machine learning models and algorithms for
gesture recognition, ensuring high accuracy and responsiveness.
Features
3. Real-Time Processing: Ensure low-latency gesture recognition for smooth interaction.
4. Integration with Existing Systems: Compatible with Windows and macOS operating systems.
5. Camera-Based Control: Operates with standard webcams or dedicated sensors like Leap
Motion.
Advantages
1. Accessibility: Offers hands-free control for individuals with limited mobility or disabilities.
2. Hygienic Use: Ideal for medical and industrial environments where touchless control is
needed.
3. Natural Interaction: Mimics human communication and movement, reducing the learning
curve.
Disadvantages
1. Limited Gesture Set: Complex gestures may be difficult to implement or recognize accurately.
2. Environmental Constraints: Lighting and background noise can affect camera performance.
3. Processing Power: Real-time gesture recognition may require high computational resources.
4. Dependency on Hardware: Performance may vary based on the quality of the camera or
sensor.
Project Portfolio
Name of student: Sakshi Navanath Rale
Semester: 5th

Name of student: Aafiya Sayyed
Semester: 5th
Name of student: Mahek Rauf Fodkar
Semester: 5th
After Finalization of Project Topic & Formation of Project Team
Ans. We thought of three alternatives before finalizing the project topic: Email Spam Filtration, E-commerce Smartphone Application, and Placement Cell – A Web-based Application.
2. Did we consider all the technical fields related to the branch of our diploma programme?
Ans. Holland’s career concept implementation was more interesting. Hence, we decided to
make a project on stream analysis using his model.
4. Whether all the group members agreed on the present project topic? If not? What were
the reasons of their disagreement?
5. Whether the procedure followed in assessing alternatives and finalizing the project topic
was correct? If not then discuss the reasons.
Ans. The team was formed during the starting weeks of the term. As all of us had worked together on previous micro-projects too, our ability to cooperate and coordinate was good.
8. Whether we faced any problem in forming the team? If yes, then what was the problem
and how was it resolved?
9. Am I the leader of our project team? If yes, then why was I chosen? If not, why I could
not become the project team leader?
10. Do I feel that present team leader is the best choice available in the group? If yes, then
why? If not then why?
11. According to me who should be the leader of the team and why?
Ans. All of us work well together; I was chosen by the agreement of the other team members.
12. Can we achieve the targets set in the project work within the time and cost limits?
Ans. We will try our best to complete the project within the provided time and cost limits
13. What are my good/bad sharable experiences while working with my team which
provoked me to think? What I learned from these experiences?
Ans. The experience was good: all of the team members cooperated and worked together. I learned about website development, working in a team environment, and working with AI.
14. Any other reflection which I would like to write about formation of team and finalization
of project title, if any?
Ans. No
1. Which activities have the maximum risk and uncertainty in our project plan?
Ans. Finding the questionnaire and keys was the most important activity
3. Is work distribution equal among project group members? If not, what are the reasons? How can we improve work distribution?
Ans. Yes
4. Is it possible to complete the project in the given time? If not, then what are the reasons for it? How can we ensure that the project is completed on time?
Ans. Yes
5. What extra care and precaution should be taken in executing the activities of high risk
and uncertainty? If possible, how such risks and uncertainties can be reduced?
Ans. Taking care of the database and maintaining confidentiality, integrity of data
6. Can we reduce the total cost associated with the project? If yes, then describe the ways.
Ans. Yes, by dividing work equally and preparing the project as early as possible
7. For which activities of our project plan, arrangement of resources is not easy and
convenient?
8. Did we make enough provisions of extra time/expenditure etc. to carry out such
activities?
Ans. Yes
9. Did we make enough provisions for time delays in our project activity? In which
activities there are more chances of delay?
Ans. No. So far we have not found any activities that involve long delays.
10. In our project schedule, which days involve the most expenditure? What provisions have we made for the availability and management of cash?
11. Any other reflection which I would like to write about project planning?
Ans. No
Project Logbook