
HAPTIC TECHNOLOGY

ABSTRACT
HAPTICS: a technology that adds the sense of touch to virtual environments. Haptic interfaces allow the user to feel as well as see virtual objects on a computer, creating the illusion of touching surfaces, shaping virtual clay, or moving objects around. The sensation of touch is the brain's most effective learning mechanism, more effective than seeing or hearing, which is why the new technology holds so much promise as a teaching tool. Haptic technology is like exploring the virtual world with a stick: if you push the stick into a virtual balloon, the balloon pushes back. The computer communicates sensations through a haptic interface, a stick, scalpel, racket, or pen connected to force-exerting motors. With this technology we can now sit down at a computer terminal and touch objects that exist only in the "mind" of the computer. By using special input/output devices (joysticks, data gloves, or other devices), users can receive feedback from computer applications in the form of felt sensations in the hand or other parts of the body. In combination with a visual display, haptic technology can be used to train people for tasks requiring hand-eye coordination, such as surgery and spaceship maneuvers. In this paper we explain how sensors and actuators are used to track the position and movement of the haptic device moved by the operator. We describe the different types of force-rendering algorithms. Then we move on to a few applications of haptic technology. Finally, we conclude by mentioning a few future developments.


Introduction: What is Haptics?


Haptics refers to sensing and manipulation through touch. The word comes from the Greek haptesthai, meaning "to touch." The history of the haptic interface dates back to the 1950s, when a master-slave system was proposed by Goertz (1952). Haptic interfaces grew out of the field of teleoperation, which was then employed in the remote manipulation of radioactive materials. The ultimate goal of the teleoperation system was "transparency": a user interacting with the master device in a master-slave pair should not be able to distinguish between using the master controller and manipulating the actual tool itself. Early haptic interface systems were therefore developed purely for telerobotic applications. Haptics, the science of touch, lets computer users interact with virtual worlds by feel. Some commercial computer games already benefit from early haptic devices, such as the force-feedback steering wheels that torque and vibrate on bumpy driving-game roads. But haptics isn't all fun and games. Scientists use computers to simulate not only the impact of a golf club hitting a ball, but also the springiness of a kidney under forceps, the push of an individual carbon nanotube in an atomic force microscope, and the texture of clothing for sale on the Internet.


Fig. A device that lets computer users feel and manipulate the objects depicted on their screens.

The Haptics Continuum


As a field of study, haptics has closely paralleled the rise and evolution of automation. Before the industrial revolution, scientists focused on how living things experienced touch. Biologists learned that even simple organisms, such as jellyfish and worms, possessed sophisticated touch responses. In the early part of the 20th century, psychologists and medical researchers actively studied how humans experience touch. Appropriately so, this branch of science became known as human haptics, and it revealed that the human hand, the primary structure associated with the sense of touch, was extraordinarily complex. With 27 bones and 40 muscles, including muscles located in the forearm, the hand offers tremendous dexterity. Scientists quantify this dexterity using a concept known as degrees of freedom. A degree of freedom is movement afforded by a single joint. Because the human hand contains 22 joints, it allows movement with 22 degrees of freedom. The skin covering the hand is also rich with receptors and nerves, components of the nervous system that communicate touch sensations to the brain and spinal cord.


HIRO, a haptic interface robot, helps a user feel a dinosaur during the Prototype Robot Exhibition at the 2005 World Exposition in Japan.

Then came the development of machines and robots. These mechanical devices also had to touch and feel their environment, so researchers began to study how this sensation could be transferred to machines. The era of machine haptics had begun. The earliest machines that allowed haptic interaction with remote objects were simple lever-and-cable-actuated tongs placed at the end of a pole. By moving, orienting, and squeezing a pistol grip, a worker could remotely control the tongs, which could be used to grab, move, and manipulate an object. In the 1940s, these relatively crude remote manipulation systems were improved to serve the nuclear and hazardous-material industries. Through a machine interface, workers could manipulate toxic and dangerous substances without risking exposure. Eventually, scientists developed designs that replaced mechanical connections with motors and electronic signals. This made it possible to communicate even subtle hand actions to a remote manipulator more efficiently than ever before. The next big advance arrived in the form of the electronic computer. At first, computers were used to control machines in a real environment (think of the computer that controls a factory robot in an auto assembly plant). But by the 1980s, computers could generate virtual environments: 3-D worlds into which users could be cast. In these early virtual environments, users could receive stimuli through sight and sound only. Haptic interaction with simulated objects would remain limited for many years. Then, in 1993, the Artificial Intelligence Laboratory at the Massachusetts Institute of Technology (MIT) constructed a device that delivered haptic stimulation, finally making it possible to touch and feel a computer-generated object. The scientists working on the project began to describe their area of research as computer haptics to differentiate it from machine and human haptics. Today, computer haptics is defined as the systems required, both hardware and software, to render the touch and feel of virtual objects. It is a rapidly growing field that is yielding a number of promising haptic technologies. Some of these technologies are described below.

Working of Haptic Devices

Architecture for Haptic Feedback

Basic architecture for a virtual reality application incorporating visual, auditory, and haptic feedback.

Simulation engine:
Responsible for computing the virtual environment's behavior over time.

Visual, auditory, and haptic rendering algorithms:


Compute the virtual environment's graphic, sound, and force responses toward the user.

Transducers:
Convert visual, audio, and force signals from the computer into a form the operator can perceive.

Rendering:


Process by which desired sensory stimuli are imposed on the user to convey information about a virtual haptic object. The human operator typically holds or wears the haptic interface device and perceives audiovisual feedback from audio displays (computer speakers, headphones, and so on) and visual displays (a computer screen or head-mounted display, for example). Audio and visual channels feature unidirectional information and energy flow (from the simulation engine toward the user), whereas the haptic modality exchanges information and energy in two directions, from and toward the user. This bidirectionality is often referred to as the single most important feature of the haptic interaction modality.
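The bidirectional exchange described above can be sketched as a single servo loop that, on every cycle, both reads from the device and writes back to it. The function and callback names below are hypothetical, not any real device API:

```python
import time

def haptic_servo_loop(read_position, compute_force, send_force,
                      rate_hz=1000.0, duration_s=1.0):
    # Unlike the one-way audio/visual channels, each cycle reads
    # information FROM the user (position) and sends energy TOWARD
    # the user (force).
    period = 1.0 / rate_hz
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        position = read_position()      # sample the device pose
        force = compute_force(position) # simulation engine response
        send_force(force)               # command the actuators
        time.sleep(period)
```

Real haptic loops typically run near 1 kHz; the graphics and audio pipelines would run in separate, slower loops, since those channels only flow toward the user.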

System architecture for haptic rendering


An avatar is the virtual representation of the haptic interface through which the user physically interacts with the virtual environment. Haptic-rendering algorithms compute the correct interaction forces between the haptic interface representation inside the virtual environment and the virtual objects populating the environment. Moreover, haptic rendering algorithms ensure that the haptic device correctly renders such forces on the human operator.


Collision-detection algorithms detect collisions between objects and avatars in the virtual environment and yield information about where, when, and ideally to what extent collisions (penetrations, indentations, contact area, and so on) have occurred. Force-response algorithms compute the interaction force between avatars and virtual objects when a collision is detected. This force approximates as closely as possible the contact forces that would normally arise during contact between real objects. Hardware limitations prevent haptic devices from applying the exact force computed by the force-response algorithms to the user. Control algorithms command the haptic device in such a way as to minimize the error between ideal and applicable forces. The discrete-time nature of the haptic-rendering algorithms often makes this difficult. The values returned by the force-response algorithms are the actual force and torque vectors that will be commanded to the haptic device.
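As an illustration of the collision-detection and force-response split described above, here is a minimal sketch for a point avatar touching a sphere. Function names and the stiffness constant are illustrative, not from any particular toolkit:

```python
import math

def sphere_collision(avatar_pos, center, radius):
    # Collision detection: returns (depth, normal) for a point avatar
    # against a sphere. depth > 0 means the avatar has penetrated.
    diff = [a - c for a, c in zip(avatar_pos, center)]
    dist = math.sqrt(sum(d * d for d in diff))
    depth = radius - dist
    if dist > 0.0:
        normal = [d / dist for d in diff]
    else:
        normal = [0.0, 0.0, 1.0]  # avatar exactly at center: pick any normal
    return depth, normal

def contact_force(depth, normal, stiffness=300.0):
    # Force response: push the avatar out along the surface normal,
    # proportional to penetration depth; zero when not in contact.
    if depth <= 0.0:
        return [0.0, 0.0, 0.0]
    return [stiffness * depth * n for n in normal]
```

A real system would feed the resulting vector through a control algorithm before commanding the device, for the hardware-limitation reasons noted above.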


Existing haptic-rendering techniques are currently based on two main principles: "point interaction" or "ray-based" rendering. In point interaction, a single point, usually the distal point of a probe, thimble, or stylus employed for direct interaction with the user, is used in the simulation of collisions. The point penetrates the virtual objects, and the depth of indentation is calculated between the current point and a point on the surface of the object. Forces are then generated according to physical models, such as spring stiffness or a spring-damper model. In ray-based rendering, the user interface mechanism, for example a probe, is modeled in the virtual environment as a finite ray. Orientation is thus taken into account, and collisions are determined between the simulated probe and virtual objects. Collision-detection algorithms return the intersection point between the ray and the surface of the simulated object.
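The two physical models just mentioned, pure spring stiffness and spring-damper, can be sketched as follows for point interaction, where `depth` is the indentation returned by collision detection. The constants are illustrative assumptions:

```python
def spring_force(depth, stiffness=500.0):
    # Pure spring model: F = k * depth while penetrating, else zero.
    return stiffness * depth if depth > 0.0 else 0.0

def spring_damper_force(depth, depth_velocity, stiffness=500.0, damping=5.0):
    # Spring-damper model: a damping term on the penetration velocity
    # dissipates energy and reduces oscillation at contact.
    if depth <= 0.0:
        return 0.0
    return stiffness * depth + damping * depth_velocity
```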

Computing contact-response forces


Humans perceive contact with real objects through sensors (mechanoreceptors) located in their skin, joints, tendons, and muscles. We make a simple distinction between the information these two types of sensors can acquire. Tactile information refers to the information acquired through sensors in the skin with particular reference to the spatial distribution of pressure, or more generally, tractions, across the contact area. To handle flexible materials like fabric and paper, we sense the pressure variation across the fingertip. Tactile sensing is also the basis of complex perceptual tasks like medical palpation, where physicians locate hidden anatomical structures and evaluate tissue properties using their hands. Kinesthetic information refers to the information acquired through the sensors in the joints. Interaction forces are normally perceived through a combination of these two. To provide a haptic simulation experience, systems are designed to recreate the contact forces a user would perceive when touching a real object.


There are two types of forces:
1. Forces due to object geometry.
2. Forces due to object surface properties, such as texture and friction.

Geometry-dependent-force-rendering algorithms
The first type of force-rendering algorithm aims to recreate the force interaction a user would feel when touching a frictionless and textureless object. Force-rendering algorithms are also grouped by the number of degrees of freedom (DOF) necessary to describe the interaction force being rendered.

Surface property-dependent force-rendering algorithms


All real surfaces contain tiny irregularities or indentations, and modeling these details explicitly yields higher accuracy. Higher accuracy, however, sacrifices speed, a critical factor in real-time applications, so any choice of modeling technique must consider this trade-off. Keeping this trade-off in mind, researchers have developed more accurate haptic-rendering algorithms for friction. In computer graphics, texture mapping adds realism to computer-generated scenes by projecting a bitmap image onto the surfaces being rendered. The same can be done haptically.
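One common way to render texture "haptically", following the texture-mapping analogy, is to perturb the computed penetration depth with a small height field rather than modeling every irregularity geometrically. A minimal sinusoidal sketch, with illustrative parameter values:

```python
import math

def textured_depth(x, depth, bump_height=0.0005, bump_spacing=0.002):
    # Perturb the penetration depth with a sinusoidal height field,
    # analogous to projecting a bitmap onto a rendered surface.
    return depth + bump_height * math.sin(2.0 * math.pi * x / bump_spacing)

def textured_force(x, depth, stiffness=400.0):
    # Stiffness force over the perturbed depth: as the probe slides in
    # x, the force ripples, which the user perceives as texture.
    d = textured_depth(x, depth)
    return stiffness * d if d > 0.0 else 0.0
```

A bitmap or measured roughness profile could replace the sine term; the speed/accuracy trade-off above then shows up in how finely the profile is sampled per servo cycle.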

Controlling forces delivered through haptic interfaces


Once such forces have been computed, they must be applied to the user. Limitations of haptic device technology, however, sometimes make it impossible to apply the exact force values computed by the force-rendering algorithms. The limitations are as follows: haptic interfaces can only exert forces of limited magnitude, and not equally well in all directions.


Haptic devices aren't ideal force transducers. An ideal haptic device would render zero impedance when simulating movement in free space, and any finite impedance when simulating contact with an object featuring such impedance characteristics. The friction, inertia, and backlash present in most haptic devices prevent them from meeting this ideal. A third issue is that haptic-rendering algorithms operate in discrete time, whereas users operate in continuous time.

Finally, haptic device position sensors have finite resolution. Consequently, attempting to determine exactly where and when contact occurs always results in a quantization error, which can create stability problems. All of these issues can limit a haptic application's realism. High servo rates (or short servo periods) are a key requirement for stable haptic interaction.
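The quantization error from finite sensor resolution can be illustrated directly: the device reports only the nearest encoder tick, so the rendering algorithm can never observe exactly where or when contact began. The resolution value below is an illustrative assumption:

```python
def quantized_reading(true_position, resolution=0.0001):
    # Finite encoder resolution: the reported position snaps to the
    # nearest sensor tick, so the true contact location/time carries an
    # error of up to half a tick.
    return round(true_position / resolution) * resolution
```

When a wall surface falls between two ticks, the spring force computed from the quantized depth jumps discontinuously, which is one source of the stability problems noted above.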

Haptic Devices

Types of Haptic Devices


There are two main types of haptic devices:


Devices that allow users to touch and manipulate 3-dimensional virtual objects.
Devices that allow users to "feel" textures of 2-dimensional objects.

Another distinction between haptic interface devices is their intrinsic mechanical behavior. Impedance haptic devices simulate mechanical impedance: they read position and send force. Simpler to design and much cheaper to produce, impedance-type architectures are the most common. Admittance haptic devices simulate mechanical admittance: they read force and send position. Admittance-based devices are generally used for applications requiring high forces in a large workspace.
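The impedance/admittance distinction can be sketched as two update functions that invert each other's inputs and outputs. The names and constants below are hypothetical, not any vendor's API:

```python
def impedance_step(measured_position, wall_position=0.0, stiffness=300.0):
    # Impedance device: READ position, SEND force.
    # Here, render a virtual wall as a spring resisting penetration.
    depth = wall_position - measured_position
    return stiffness * depth if depth > 0.0 else 0.0

def admittance_step(measured_force, velocity, virtual_mass=2.0, dt=0.001):
    # Admittance device: READ force, SEND motion.
    # Integrate the user's applied force through a virtual mass to get
    # the velocity the device's position controller should track.
    return velocity + (measured_force / virtual_mass) * dt
```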

LOGITECH WINGMAN FORCE FEEDBACK MOUSE


It is attached to a base that replaces the mouse mat and contains the motors used to provide forces back to the user. The interface is used to aid computer users who are blind or visually disabled, or who are tactile/kinesthetic learners, by providing a slight resistance at the edges of windows and buttons so that the user can "feel" the graphical user interface (GUI). This technology can also provide resistance to textures in computer images, which enables computer users to "feel" pictures such as maps and drawings.

PHANTOM

The PHANTOM provides single-point, 3D force feedback to the user via a stylus (or thimble) attached to a movable arm. The position of the stylus point/fingertip is tracked, and resistive force is applied to it when the device comes into 'contact' with the virtual model, providing accurate, ground-referenced force feedback. The physical working space is determined by the extent of the arm, and a number of models are available to suit different user requirements. The PHANTOM system is controlled by three direct-current (DC) motors that have sensors and encoders attached to them. The number of motors corresponds to the number of degrees of freedom a particular PHANTOM system has, although most systems produced have three motors. The encoders track the user's motion or position along the x, y, and z coordinates, and the motors track the forces exerted on the user along the x, y, and z axes. From the motors, a cable connects to an aluminum linkage, which connects to a passive gimbal that attaches to the thimble or stylus. A gimbal is a device that permits a body freedom of motion in any direction or suspends it so that it will remain level at all times. The PHANTOM is used in surgical simulations and in the remote operation of robotics in hazardous environments.

PHANTOM

Cyber Glove

The CyberGlove can sense the position and movement of the fingers and wrist. The basic CyberGlove system includes one CyberGlove, its instrumentation unit, a serial cable to connect to the host computer, and an executable version of the VirtualHand graphic hand model display and calibration software.

The CyberGlove has a software programmable switch and LED on the wristband to permit the system software developer to provide the CyberGlove wearer with additional input/output capability. With the appropriate software, it can be used to interact with systems using hand gestures, and when combined with a tracking device to determine the hand's position in space, it can be used to manipulate virtual objects.

Cyber Grasp


The CyberGrasp is a full-hand force-feedback exoskeletal device, which is worn over the CyberGlove. CyberGrasp consists of a lightweight mechanical assembly, or exoskeleton, that fits over a motion-capture glove. About 20 flexible semiconductor sensors sewn into the fabric of the glove measure hand, wrist, and finger movement. The sensors send their readings to a computer that displays a virtual hand mimicking the real hand's flexes, tilts, dips, waves, and swivels. The same program that moves the virtual hand on the screen also directs machinery that exerts palpable forces on the real hand, creating the illusion of touching and grasping. A special computer called a force control unit calculates how much the exoskeleton assembly should resist movement of the real hand in order to simulate the onscreen action. Each of five actuator motors turns a spool that rolls or unrolls a cable. The cable conveys the resulting pushes or pulls to a finger via the exoskeleton.

APPLICATIONS

Gaming-Technology
Flight Simulations: Motors and actuators push, pull, and shake the flight yoke, throttle, rudder pedals, and cockpit shell, replicating all the tactile and kinesthetic cues of real flight. Some examples of the simulator's haptic capabilities include resistance in the yoke from pulling out of a hard dive, the shaking caused by stalls, and the bumps felt when rolling down a concrete runway. These flight simulators look and feel so real that a pilot who successfully completes training on a top-of-the-line Level 5 simulator can immediately start flying a real commercial airliner. Today, all major video consoles have built-in tactile feedback capability. Various sports games, for example, let you feel bone-crushing tackles or the different vibrations caused by skateboarding over plywood, asphalt, and concrete. Altogether, more than 500 games use force feedback, and more than 20 peripheral manufacturers now market in excess of 100 haptic hardware products for gaming.

Mobile Phones
Samsung has made a phone that vibrates differently for different callers. Motorola, too, has made haptic phones.

Cars
For the past two model years, the BMW 7 Series has contained the iDrive (based on Immersion Corp.'s technology), which uses a small wheel on the console to give haptic feedback so the driver can control peripherals such as the stereo, heating, and navigation system through menus on a video screen. Haptic technology for an X-by-Wire system was also showcased at the Alps Show 2005 in Tokyo. The system consisted of a "cockpit" with a steering wheel, a gearshift lever, and pedals that embed haptic technology, plus a remote-control car. Visitors could control the remote-control car by operating the steering wheel, gearshift lever, and pedals in the cockpit while watching a screen in front of the cockpit, onto which video from a camera mounted on the remote-control car was projected.

Robot Control


For navigation in dynamic environments or at high speeds, it is often desirable to provide a sensor-based collision avoidance scheme on board the robot to guarantee safe navigation. Without such a collision avoidance scheme, it would be difficult for the (remote) operator to prevent the robot from colliding with obstacles. This is primarily due to (1) limited information from the robot's sensors, such as images within a restricted viewing angle without depth information, which is insufficient for the user's full perception of the environment in which the robot moves, and (2) significant delay in the communication channel between the operator and the robot. Experiments on robot control using haptic devices have shown the effectiveness of haptic feedback in a mobile robot teleoperation system for safe navigation in a shared-autonomy scenario.

Future Enhancements

One emerging technology automatically assigns "generic touch sensations" to common Web page objects, such as hyperlinks, buttons, and menus.

Haptic torch for the blind:

The device, housed in a torch, detects the distance to objects, while a turning dial on which the user rests a thumb indicates the changing distance to an object. The device was tested and found to be a useful tool.

CONCLUSION:
Haptics is the future for online computing and e-commerce: it will enhance the shopping experience and help online shoppers feel merchandise without leaving their homes. Because of the increasing applications of haptics, the cost of haptic devices will drop in the future. This will be one of the major reasons for commercializing haptics. In video games, the addition of haptic capabilities is nice to have. It increases the realism of the game and, as a result, the user's satisfaction. But in training and other applications, haptic interfaces are vital. That's because the sense of touch conveys rich and detailed information about an object. When it is combined with other senses, especially sight, touch dramatically increases the amount of information that is sent to the brain for processing. The increase in information reduces user error, as well as the time it takes to complete a task. It also reduces the energy consumption and the magnitudes of contact forces used in a teleoperation situation. Clearly, Samsung is hoping to capitalize on some of these benefits with the introduction of the Anycall Haptic phone. Nokia will push the envelope even further when it introduces phones with tactile touchscreens. Yes, such phones will be cool to look at. And, yes, they will be cool to touch. But they will also be easier to use, with the touch-based features leading to fewer input errors and an overall more satisfying experience.

Sources
Immersion Corporation Web Site: http://www.immersion.com/
The Microdynamics System Laboratory: http://www.msl.ri.cmu.edu/
Salisbury, "Haptics" (1995): http://www.sensable.com/documents/documents/Salisbury_Haptics95.pdf
SensAble Technologies Web Site: http://www.sensable.com/


Srinivasan, Mandayam A. "What is Haptics?" Laboratory for Human and Machine Haptics, Massachusetts Institute of Technology: http://www.sensable.com/hapticdevices-projects-papers.htm
Wilson, Daniel H. "How Haptics Will Change the Way We Interact with Machines." Popular Mechanics, April 2008: http://www.popularmechanics.com/technology/industry/4253368.html
