
Smart Glove Project Presentation Document

The project aims to create a smart glove that recognizes hand gestures using flex sensors and an accelerometer, transmitting messages via Bluetooth to assist individuals with speech impairments. The glove processes sensor data through an Arduino microcontroller to convert gestures into text or spoken output using a TTS application on a smartphone. Applications include aiding communication, controlling robotics, smart home automation, gaming, and educational tools.


Smart Glove for Gesture-Based Communication

Objective
The objective of this project is to design and implement a smart glove capable of
recognizing hand gestures using flex sensors and an accelerometer, and transmitting the
corresponding messages wirelessly via Bluetooth. The system aims to assist individuals
with speech impairments by converting gestures into text or spoken output.

Function
The glove integrates four flex sensors, an MPU6050 accelerometer, and an HC-06 Bluetooth
module connected to an Arduino Uno microcontroller. The flex sensors detect finger
bending, while the accelerometer measures the orientation and motion of the hand. The
Arduino processes these sensor readings to recognize gestures and sends the
corresponding messages through the HC-06 module to a smartphone, where a text-to-
speech (TTS) application converts them into audible speech.

Explanation of the Code


1. The Arduino initializes serial communication with both the computer and the Bluetooth
module using the SoftwareSerial library.
2. The analog signals from the four flex sensors are continuously read and stored in
variables. Each reading is compared to a threshold value to determine whether a finger is
bent or straight.
3. The MPU6050 accelerometer is initialized via the I2C interface using the Wire and
Adafruit_MPU6050 libraries. It provides acceleration data on the X, Y, and Z axes, which
indicate the glove’s orientation or motion.
4. Logical conditions (if-else statements) combine the flex sensor and accelerometer data to
classify gestures. For example, a single bent finger or a tilt in a certain direction corresponds
to a specific phrase.
5. Once a gesture is recognized, it is sent via Bluetooth to the paired smartphone using
BTSerial.println(). A debounce delay ensures that gestures are not transmitted repeatedly
in rapid succession.
6. On the smartphone, the received message is converted into speech using a TTS app,
allowing the user to hear the output corresponding to the performed gesture.

Applications
1. Communication aid for people with speech or hearing impairments, converting hand
gestures into speech.
2. Gesture-controlled robotics, allowing users to operate robots or drones through simple
hand movements.
3. Smart home automation, where gestures can control lights, fans, or appliances.
4. Gaming and virtual reality interfaces for immersive control experiences.
5. Educational and rehabilitation tools to assist individuals in learning or restoring motor
coordination.
