
Conference paper, January 2020. DOI: 10.1109/CCWC47524.2020.9031244


Interface for Human Machine Interaction for
assistant devices: A Review
Saifuddin Mahmud, Computer Science, Kent State University ([email protected])
Xiangxu Lin, Computer Science, Kent State University ([email protected])
Jong-Hoon Kim*, Computer Science, Kent State University ([email protected])

*Corresponding author: [email protected]

Abstract—The interface for human machine interaction has become a prominent area of research due to the rapid growth of automation and robotics in recent decades. Although an abundance of frameworks has emerged to make the interaction between humans and machines easy and robust, a substantial portion of them has not matured beyond a narrow scope. In this review we survey the different interfacing techniques between humans and machines and trace the evolution of the related technologies for developing assistant devices. The review explores contemporary groundbreaking technologies developed for this purpose together with their advantages and limitations, and it outlines directions for forthcoming work in human machine interaction, human computer interaction and human robot interaction. It thus gives a broad perspective on the status quo and on the possible future development of interfaces for assistant devices in the field of human machine interaction.

Index Terms—Brain Signal, Eye Gaze, Gesture, Telepresence, EEG, Mind Band

I. INTRODUCTION

The term human computer interface (HCI) describes the interaction between a human (user) and a computer: the method by which the user tells the computer what to do, and the reaction the computer generates. HCI is also about designing and developing computer systems that support people's need to use machines, so that they can carry out their activities productively and safely.

For the visualization of a running process, the human machine interface is the platform that is extensively used in control systems to provide interaction. A human machine interaction solution can be built on a simple light-emitting diode (LED) indicator or on a thin-film transistor (TFT) display serving as a graphical user interface (GUI); both are approaches to transmitting information to the user.

There are different types of interfaces for communication between humans and machines, for example tangible and non-tangible interfaces. Tangible interfaces comprise the mouse, keyboard, touch screen and so on, while non-tangible interfaces are vision based, gesture based, facial-expression based, eye-gaze based and similar. Many interfaces have been developed for different purposes such as telepresence, teleoperation, disability assistance and space-vehicle control. These interface technologies comprise three parts: capturing the human intention, processing it, and providing the appropriate signal to the machine. The intention is captured through brain signals, facial expression, speech, gesture, eye movement, lip movement, walking pattern, head movement, or hand and finger movement. The intention is then processed in the computer system, and the actual instruction it conveys is sent to the machine for execution.

The motive of this review is to concentrate on the interfaces that have already matured, to identify their strengths and drawbacks, and to suggest a direction for subsequent research in human machine interfacing for developing assistant devices.
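To make the capture, process and actuate stages described above concrete, the following sketch outlines how an assistive interface can be organized around three pluggable components. It is only an illustration of the general architecture discussed in this introduction; the class and function names are hypothetical and do not come from any of the surveyed systems.

```python
from abc import ABC, abstractmethod
from typing import Any


class IntentionCapture(ABC):
    """Stage 1: acquire a raw signal (EEG epoch, camera frame, gaze sample, ...)."""

    @abstractmethod
    def read(self) -> Any: ...


class IntentionDecoder(ABC):
    """Stage 2: turn the raw signal into a discrete user intention."""

    @abstractmethod
    def decode(self, raw: Any) -> str: ...


class MachineActuator(ABC):
    """Stage 3: translate the decoded intention into a machine command."""

    @abstractmethod
    def execute(self, intention: str) -> None: ...


def hmi_loop(capture: IntentionCapture, decoder: IntentionDecoder,
             actuator: MachineActuator, steps: int = 100) -> None:
    """Run the capture -> decode -> actuate cycle a fixed number of times."""
    for _ in range(steps):
        raw = capture.read()
        intention = decoder.decode(raw)
        actuator.execute(intention)
```

Any of the concrete interfaces reviewed below (an EEG decoder, a gesture recognizer, a gaze tracker) can be slotted in as the capture and decode stages.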
A comparative study of existing review works in the field of human machine interaction was conducted to confirm the novelty of this review. Prashant and Joshi [1] reviewed brain-computer interfaces based on brain signals; the authors categorized the systems by the brain-neuron signal-imaging method and illustrated the techniques that are widely used in brain-signal interfacing. Haasbroek [2] reviewed human-computer interfaces and intent support, discussing several types of interfaces for human machine interaction as well as future interface technologies that could be used for designing assistant devices. Tezuka [3] conducted a study on space interfaces for teleoperation systems, focusing only on haptics-based teleoperation interfaces. Pavlovic and Sharma [4] reviewed human machine interfaces based on hand gesture. Their review shows how hand gestures offer a desirable alternative to complicated interface devices, especially for people with disabilities, and that visual interpretation of hand gestures can bring ease and naturalness to human machine interaction; however, other assistive techniques such as head-movement-based methods and their performance were not covered. Al-Rahayfeh and Faezipour [5] surveyed eye-tracking interfaces, introducing many approaches that could serve as a platform for developing an eye-tracking system with good performance, high accuracy and low cost; the performance of each assistive technology was demonstrated and analyzed.

Most of these review papers focus on a single interface: several cover only gestures, some only brain signals, some only eye gaze, a few only haptics, and some only hybrid approaches. A large gap therefore exists in reviewing the literature on interfaces for human machine interaction specifically for assistant devices. Thus, in this review we take into account the pertinent research on brain-signal-based, gesture-based, eye-gaze-based, tangible and hybrid approaches and analyze their pros and cons.

Fig. 1. Categories of the reviewed interfaces for HCI.

The paper is organized into nine sections. Section I introduces the paper. Section II presents the fundamentals of interfaces for human machine interaction. Section III describes brain-signal-based interfaces. Section IV covers gesture-based interfaces. Section V describes eye-gaze-based interfaces. Section VI explains tangible interfaces. Section VII briefly explains hybrid interfaces. Section VIII presents discussions and future directions, and Section IX concludes the review.

II. FUNDAMENTALS OF INTERFACE FOR HUMAN MACHINE INTERACTION

Cloud computing [6], mobile computing [7] and human machine interaction are going to be the leading areas in the coming years. Owing to the advancement of human computer interaction and human machine interaction, users can interact with computers or machines naturally, confidently and correctly.

When developing assistant devices, the interface for human machine interaction is a crucial aspect. A robust human machine interface (HMI) should be multifaceted, fast, cost-effective, adaptable and easy to understand. In the past few years, the growth of the cyber-society has led to a huge deployment of advanced technologies in every sector, involving users of all categories, including children, the elderly, persons with physical disabilities, and persons with widely varying technical skills. Depending on the user and the application, different types of interfaces have been developed.

III. BRAIN SIGNAL BASED INTERFACE

Brain-signal interfacing, commonly called the brain-computer interface (BCI), has become a growing concern for controlling vehicles, robots, machines and other devices, and the area is developing very rapidly. For this purpose, acquisition and analysis of the electroencephalogram (EEG) are needed. Conventional EEG acquisition requires special laboratories, well equipped with specialized measurement devices for recording and analyzing the data, which in most cases is expensive. Song et al. [8] proposed a Mind Band that uses dry-electrode technology, which is safe and non-intrusive. The device simplifies brainwave data acquisition, removes the need for a laboratory environment, is cost-effective, and increases the speed of EEG acquisition, analysis and processing. It can be used for different purposes such as teleoperation and disability assistance. A brain machine interface (BMI) for stroke rehabilitation with clinical studies was proposed by Guan [9]. This interface technology has been investigated very actively to offer a new alternative to stroke survivors so that they can restore motor functions. The author conducted several studies on how the interface can be combined with haptic and robotic devices to provide rehabilitation to stroke patients, and the results were satisfactory. Kim et al. [10] developed a robotic wheelchair driven by a motor-imagery-based brain-computer interface. In this system a laptop is connected via USB to a gTec EEG system to gather EEG signals. The EEG signals are recorded and then filtered; after filtering, features corresponding to left, right, forward and backward are extracted, and according to the extracted features the instructions are sent to the wheelchair for execution. This smart wheelchair provides a favorable, cost-effective and efficient user interface based on motor-imagery BCI, and adding further sensors would increase the safety of disabled users.

People who suffer from neuromuscular diseases face difficulty in controlling their environment. A brain-computer interface provides a communication channel between the machine and the brain in which no muscular activation is required. Aydin et al. [11] proposed a region-based BCI for a home-control application for such users. In this system a cap (actiCAP) was used to acquire the EEG, which was then recorded and processed. They also developed a computer application whose window is divided into regions with different icons for controlling different devices, and the brain signals were mapped to these icons. The accuracy was evaluated from different aspects by different users and exceeded 90%; the system could be used in critical applications if the accuracy is further improved by refining the algorithm.
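The wheelchair systems above share the same chain: record an EEG epoch, band-pass filter it, extract features, classify the intended direction, and send the command. The sketch below is a minimal illustration of that chain, assuming an 8-channel recording at 250 Hz, log band-power features and a nearest-centroid decision; it is not the pipeline of the cited systems, and the constants and centroids are placeholders that would normally come from calibration data.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250            # assumed sampling rate (Hz)
N_CHANNELS = 8      # assumed electrode count
COMMANDS = ["left", "right", "forward", "backward"]


def bandpass(eeg: np.ndarray, lo: float = 8.0, hi: float = 30.0) -> np.ndarray:
    """Band-pass filter each channel to the mu/beta band used in motor imagery."""
    b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=1)


def band_power_features(eeg: np.ndarray) -> np.ndarray:
    """One log-variance feature per channel from a (channels, samples) epoch."""
    filtered = bandpass(eeg)
    return np.log(np.var(filtered, axis=1) + 1e-12)


def decode_command(eeg: np.ndarray, centroids: np.ndarray) -> str:
    """Nearest-centroid decision over the four movement classes."""
    feats = band_power_features(eeg)
    dists = np.linalg.norm(centroids - feats, axis=1)
    return COMMANDS[int(np.argmin(dists))]


# Toy usage: random "centroids" and a 2-second epoch stand in for trained data.
rng = np.random.default_rng(0)
centroids = rng.normal(size=(len(COMMANDS), N_CHANNELS))
epoch = rng.normal(size=(N_CHANNELS, 2 * FS))
print(decode_command(epoch, centroids))   # prints one of the four commands
```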
IV. GESTURE BASED INTERFACE

Recently, human machine interaction has become more efficient and effective due to the advancement of vision-based interfaces, which offer a more user-friendly, cost-effective, dependable and natural way of communication and control. Therefore, a lot of research is going on in gesture-based technologies and their potential applications in numerous fields, making them a captivating tool for building excellent interfaces for controlling machines and robots. Lavanya et al. [12] proposed a gesture-controlled robot that captures the gesture with a camera; the robotic system detects the gesture and acts accordingly (stop, go left, go right, go forward). The system could be used in other assistant devices for disabled persons, but it needs to handle more complicated gestures. Smith et al. [13] proposed a novel system for controlling vehicular infotainment using gesture detection based on mm-wave radar. Gestures are first captured by the radar sensor and then processed on an Android-based mobile phone, after which the appropriate instruction is fed to the infotainment system. Because radar is not affected by variations in brightness and lighting, the system can recognize the orientation of the hand and fingers with high precision. The accuracy of the system was measured from all aspects and was above 95%, but the placement of the sensor is very critical.

A new machine-learning-based approach to a contactless human machine interface was proposed by Magoulès and Zou [14]. Besides machine learning, well-known computer vision techniques were used. The technique provides an efficient interface that detects and tracks the user's hand gestures and allows the user to control robots or other machines. The approach is based on monocular vision, so it needs only a single camera. The system can perform the basic tasks of a mouse and can control a user-built on-screen keyboard. Because the system uses machine learning, it can capture high-quality images, and a parallel implementation ensures that the captured images are processed to extract the correct gesture; however, the system can be affected by unbalanced lighting. In some gesture-based applications, such as teleoperation and surveillance robots, real-time gesture detection is necessary. Raheja et al. [15] offered a technique for controlling a robotic arm in real time using hand gestures. In their system a simple video camera captures the hand gesture in real time; features are then extracted and pattern matching is performed for gesture recognition, and a command is generated according to the gesture and executed by the robot. The system can detect 11 hand gestures with 90% accuracy under proper lighting, but it cannot handle complex gestures.

One of the major applications of hand-gesture-based interfaces is to provide aid for disabled people. Panwar [16] proposed a hand-gesture interface for assisting visually impaired people. The system does not rely on the texture and color of the image, which can be affected by different shades of light and other environmental influences, although some pre-processing steps are required to remove background noise. Approximately thirty-six distinct gestures can be recognized by the system, and the output is provided as a 7-bit binary sequence. People suffering from visual impairment can use the system as a personal assistant to write a text document electronically through Office or Notepad. Approximately 400 images were used to validate and measure the performance of the system, and the results show an accuracy of more than 94%. As the system detects a good number of gestures and provides a 7-bit output, combinations of gestures could be used to control large machines with many control parameters.

As human environments are unstructured and dynamic, gesture-based interfaces can play an important role in making interaction with robots more natural. An Interactive Teleoperation Interface (ITI) for semi-autonomous control of robot arms was proposed by Quintero et al. [17]. The system basically combines two interactions: first, the robot arm is directly linked to the arm motion and gestures tracked from the skeleton of the human body; second, an autonomous image-based visual servoing routine can be triggered for accurate positioning. Using this interface, the user is tracked by a Kinect and the robotic arm is controlled accordingly by teleoperation. The system also relieves the user from dealing with the dynamics of the robot while allowing very accurate motions; therefore, it can be used to perform remote jobs that need small movements and precise positioning.

To assist disabled people in operating machines more easily, a wireless user interface based on dynamic gestures in the form of a hand glove was proposed by Prasad et al. [18]. A data glove named DG5 was used to design the interface between the human and the machine. It provides 22 degrees of freedom (DoF) so that people can interact with the machine naturally, and it maps the static controller onto dynamic human hand gestures. Glove-based interfaces are more dependable and appropriate for collecting motion data than camera-based interfaces [19], [20]. The system assists disabled persons in operating a machine or computer without difficulty, and its accuracy is more than 98%.
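Most of the camera-based systems above reduce to the same loop: grab a frame, segment the hand, derive a gesture, and emit a command. The following OpenCV sketch illustrates that loop under strong simplifying assumptions (a fixed HSV skin range and position-based gestures); it is not the method of any cited paper, and the thresholds are illustrative only.

```python
import cv2
import numpy as np

# Hypothetical HSV skin range; real systems calibrate this per user and lighting.
SKIN_LO = np.array([0, 40, 60], dtype=np.uint8)
SKIN_HI = np.array([25, 180, 255], dtype=np.uint8)


def frame_to_command(frame: np.ndarray) -> str:
    """Map the largest skin-coloured blob's position to a motion command."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LO, SKIN_HI)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return "stop"                       # no hand visible
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 2000:        # ignore small blobs (noise)
        return "stop"
    m = cv2.moments(hand)
    cx = m["m10"] / (m["m00"] + 1e-9)       # hand centroid, x coordinate
    third = frame.shape[1] / 3
    if cx < third:
        return "go_left"
    if cx > 2 * third:
        return "go_right"
    return "forward"


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)               # default webcam
    ok, frame = cap.read()
    if ok:
        print(frame_to_command(frame))
    cap.release()
```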
V. EYE GAZE BASED INTERFACE

Eye-gaze-based interfaces for human computer interaction are becoming popular day by day and are being used in different sectors. For example, monitoring an environment from a remote location while controlling it remotely is the main task in teleoperation. In teleoperation the operator's eyes are occupied with monitoring the environment and the hands are busy with the control tasks for the whole duration of the operation; therefore, if eye-gaze input can be used for the control task, it makes the overall control easier for the operator. A method for teleoperation through eye gaze was introduced by Latif et al. [21] to make teleoperation more robust. The interface developed in this work lets the operator continuously monitor as well as operate a mobile robot remotely using eye gaze only. A task-based evaluation was conducted to measure the overall performance and showed satisfactory results. The eye-gaze-based interface provides a Natural User Interface (NUI), which is increasingly in demand and is becoming the interactive user interface for many devices nowadays. Eye-tracking sensors as human-machine interfaces in aerospace were introduced by Lim et al. [22]; the system can reliably detect changes in the operator's state, although a number of environmental factors can affect the sensor's accuracy and precision. A cost-effective eye tracker with better precision was developed by Ho [23]. It can help NUI reach the next level of innovative eye-gaze-based interfaces and could be a very effective interface for disabled persons. Commercially available eye trackers are costly, but this system is very cost-effective because it uses an eye camera with two 850 nm infrared LEDs placed right next to the camera lens, making the system both efficient and inexpensive.

Yu et al. [24] introduced a gaze-tracking system for teleoperation by people with physical disabilities, since traditional teleoperation control interfaces such as the joystick, keyboard and mouse are not suitable for users whose hands are disabled. The system has an analog video camera that captures 17 images per second; the eye is detected in the images first, and then a gaze-estimation algorithm builds the mapping for the direction of the gaze, based on which the remote machine is controlled. As the system uses a camera for eye-gaze detection, environmental light affects its accuracy. An eye-tracking-based mobile robot for teleoperation was proposed by Gêgo et al. [25] for teleoperators with motor disabilities who cannot access traditional input devices, although it can also be used by other people. Their system uses a Microsoft Kinect 3D camera sensor, the MagikEye application (a commercial eye-tracking application), and a Kobuki robotic platform. The system performs well, but added functionality, such as driving the robot along a curved path, could make it better.

It is difficult for people with locomotor disabilities to move freely without the assistance of a caregiver. To assist them, an interface based on eye-gaze tracking for a smart wheelchair was developed by Wanluk et al. [26]. The smart wheelchair mainly consists of two modules: an image-processing module for eye tracking and a module for wheelchair control. The image-processing module tracks the direction of the eyeball, and the control module steers the wheelchair accordingly. The system has some added functionality, such as sending messages to the caretaker through a smartphone and controlling electrical devices such as turning a light on or off. The system could make life easier and happier for disabled people whose eyes are still functional.
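The gaze-controlled wheelchairs and robots above all map an estimated gaze point to a small command set, usually with a dwell time so that ordinary saccades do not trigger commands. The snippet below sketches that mapping for a normalized gaze coordinate; the regions, dwell time and command names are assumptions for illustration and do not reproduce any cited implementation.

```python
import time
from typing import Optional


def gaze_region(x: float, y: float) -> str:
    """Map a normalised gaze point (0..1, 0..1) to a screen region."""
    if 0.4 <= x <= 0.6 and 0.4 <= y <= 0.6:
        return "centre"                       # dead zone: no command
    if x < 0.4:
        return "left"
    if x > 0.6:
        return "right"
    return "up" if y < 0.4 else "down"


class DwellSelector:
    """Issue a command only after the gaze stays in one region for `dwell` seconds,
    which filters out natural saccades."""

    def __init__(self, dwell: float = 1.0) -> None:
        self.dwell = dwell
        self.region = "centre"
        self.since = time.monotonic()

    def update(self, x: float, y: float) -> Optional[str]:
        region = gaze_region(x, y)
        now = time.monotonic()
        if region != self.region:
            self.region, self.since = region, now
            return None
        if region != "centre" and now - self.since >= self.dwell:
            self.since = now                  # re-arm so the command repeats slowly
            return {"left": "turn_left", "right": "turn_right",
                    "up": "forward", "down": "stop"}[region]
        return None
```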
VI. TANGIBLE INTERFACE

The invention of the mouse and keyboard introduced the Tangible User Interface (TUI). Over the last two decades TUIs have experienced excellent growth and have provided a link between the digital world and the physical world. Randelli et al. [27] introduced an interface for robot teleoperation integrating the Wiimote, a joypad and a conventional four-arrow keypad. The interface is used for motion-sensing control and for controlling linear and angular speed; using it, a remote robot can be operated from anywhere, but controlling the three devices at the same time is complicated and needs practice. As children are more comfortable with tangible interfaces, an interactive tangible user interface for learning and playing with music was proposed by Waranusast et al. [28]. The proposed system translates musical symbols placed on virtual musical staffs on the top surface of a table into a melody, which supports the creation of simple melodies by manipulating blocks of musical symbols. The hardware consists of an infrared (IR) camera above the table; through this camera the touched symbol is detected and the corresponding melody is played through a sound box. The limitation is that children cannot use their own symbols to create their own melody. The system uses the k-nearest-neighbor (KNN) algorithm and reaches 92% accuracy, which makes it a good teaching tool for children.

One of the important applications of the tangible user interface is augmented reality. The MagicPad, a new type of tangible-user-interface framework based on Spatial Augmented Reality (SAR), was proposed by Chan and Lau [29]. Using an infrared pen on any plain surface, such as a wall or a sheet of paper that receives projected images from a projector, any user can perform a variety of visualization, interactive visualization and manipulation tasks in 3D space. Although the literature does not describe the system in detail, it can be assumed that this interface significantly enhances the user experience and improves the performance of human machine interaction. The system is very cost-effective but needs more computational power, as it requires 3D computation. One class of human interaction techniques developed for augmented reality is Opportunistic Controls. Henderson and Feiner [30] proposed a special kind of tangible interface for augmented reality: it takes gestures as input when a surface is touched, recognizes the gesture, and gives tangible feedback to the user. They deployed the technique for a virtually simulated maintenance inspection of an aircraft engine. A set of virtual buttons emulating the physical environment was also deployed, both as Opportunistic Controls and using simpler passive haptics, so the system can be used in environments where tangible feedback is necessary.
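The music-table system by Waranusast et al. [28] classifies the touched symbol with a k-nearest-neighbor vote over features extracted from the IR image. The paper does not give implementation details, so the sketch below only illustrates the generic KNN step with made-up two-dimensional features and labels.

```python
import numpy as np


def knn_predict(train_x: np.ndarray, train_y: np.ndarray,
                sample: np.ndarray, k: int = 3) -> int:
    """Classic k-nearest-neighbour vote: label of the k closest training samples."""
    dists = np.linalg.norm(train_x - sample, axis=1)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return int(labels[np.argmax(counts)])


# Toy data: two geometric features (e.g. area, aspect ratio) per detected block,
# labels 0..3 standing in for four musical symbols.
rng = np.random.default_rng(1)
train_x = np.vstack([rng.normal(loc=c, scale=0.1, size=(20, 2)) for c in range(4)])
train_y = np.repeat(np.arange(4), 20)
print(knn_predict(train_x, train_y, np.array([2.05, 1.9])))   # likely prints 2
```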
TABLE I
SUMMARY OF THE INTERFACES FOR ASSISTIVE DEVICES

Brain Signal Based Interface
  Intention capturing devices: dry electrode, EEG system, gTec EEG system, actiCAP
  Affected by environment: no
  Applications: BCI assistive robots, robotic wheelchair, controlling electric appliances, rehabilitation and restoration
  Target users: stroke patients, patients with neuromuscular diseases, teleoperators

Gesture Based Interface
  Intention capturing devices: monocular camera, web camera, Kinect sensor, mm-wave radar, DG5 hand glove
  Affected by environment: yes
  Applications: controlling a robotic arm, infotainment control, writing electronic documents, electric wheelchair
  Target users: visually impaired people, disabled persons, teleoperators, car drivers

Eye Gaze Based Interface
  Intention capturing devices: infrared, camera, Kinect sensor
  Affected by environment: yes
  Applications: teleoperation, smart phone and home appliance control, virtual reality
  Target users: patients with locomotor disabilities, teleoperators

Tangible Interface
  Intention capturing devices: mouse, keyboard, joypad, IR camera, touch-screen surface
  Affected by environment: no
  Applications: augmented reality, learning environments, virtual controllers, tangible feedback systems
  Target users: all types of users, especially children; large-machine troubleshooters

Hybrid Interfaces
  Intention capturing devices: electrodes, gyroscope, Emotiv EPOC sensor, EEG and EMG system
  Affected by environment: no
  Applications: wearable robots, exoskeleton and wheelchair control, teleoperation
  Target users: physically disabled persons, robot arm controllers
VII. HYBRID INTERFACES

To build communication between machine and human in a more natural and robust way, multiple interfacing methods are combined into a hybrid interface. These interfaces are very useful for controlling complex machines. Using eye saccade and facial expression, a human machine interface was introduced by Wang et al. [31] to enhance the functionality and maneuverability of wearable robots. The system uses only two electrodes, placed above both ears, to pick up the physiological signals with high fidelity. By using both eye saccade and facial expression with a machine learning algorithm, the system achieves over 97% accuracy. It can be used by disabled persons to operate exoskeletons or wheelchairs easily and naturally; however, since the gestures are combinations of eye saccade and facial expression, proper training is necessary before using the system.

For some devices such as disability assistants, a multimodal human machine interface is very helpful. A bi-modal control interface for an electric wheelchair was proposed by Rechy-Ramirez and Hu [32] for disabled persons suffering from quadriplegia. They combined head movements, observed by a gyroscope, with facial expressions and brain signals acquired by the Emotiv EPOC sensor. In this system the head movement provides the direction of the wheelchair (turning left, turning right and stopping); along with the head movements, two facial expressions chosen by the user are used, one for moving forward and another for executing the turning commands.

A hybrid interface for robot arm control was proposed by Tang et al. [33], based on combining electromyography (EMG) and electroencephalography (EEG, the brain signal) to generate the correct signal for controlling the robotic arm. The system collects EMG waves from the human arm and uses the arm movements to identify the currently activated joints appropriately and dependably. Deploying these two non-homologous signals divides the burden on the brain, therefore lowering the mental workload, and the system provides more than 95% accuracy in offline operation. It can be used in many other applications, such as teleoperation, as it provides more flexibility to control any device.
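The hybrid systems above combine two signal sources into one decision. None of the cited papers publish their fusion code, so the sketch below shows one common pattern, a weighted late fusion of per-command confidence scores from two modalities; the scores, weights and command names are purely illustrative.

```python
from typing import Dict


def fuse_modalities(scores_a: Dict[str, float], scores_b: Dict[str, float],
                    weight_a: float = 0.5) -> str:
    """Weighted late fusion: combine per-command confidence scores from two
    modalities (e.g. EEG and EMG) and return the highest-scoring command."""
    commands = set(scores_a) | set(scores_b)
    fused = {c: weight_a * scores_a.get(c, 0.0)
                + (1.0 - weight_a) * scores_b.get(c, 0.0) for c in commands}
    return max(fused, key=fused.get)


# Toy usage: EEG weakly prefers "grasp", EMG strongly prefers "rotate".
eeg = {"grasp": 0.55, "rotate": 0.45}
emg = {"grasp": 0.20, "rotate": 0.80}
print(fuse_modalities(eeg, emg, weight_a=0.4))   # -> "rotate"
```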
VIII. DISCUSSIONS AND FUTURE DIRECTIONS

The summary of this study is based on the features of the interfaces used in machines that assist humans. Each type of interface has its specialty in a specific application: some interfaces are intended specifically for disabled persons, whereas a few are best suited for critical applications such as surgery. The review emphasizes the limitations and missing properties of the interfaces. Only the most popular and recently developed interfaces for human machine interaction for assistance have been discussed, as it is challenging to cover everything given the rapid advancement of the technology. As most of the interfaces have some limitations, developing interfaces that address these challenges and disparities may contribute to creative research on human machine interaction in aid of the development of assistant devices.

IX. CONCLUSIONS

This paper presents the interfaces used in assistive devices for human machine interaction with modern technologies. The purpose of an interface is to provide a good medium for humans so that they can interact with the machine robustly and more naturally. Although most of the proposed tools are in their premature stages, some of them have advanced thanks to recent technologies (e.g., brain signals). Based on the review, the features that should be incorporated into an interface to make it better are identified. Hopefully, this study will serve researchers who are enthusiastic and passionate about developing better, more effective and more efficient interfaces for human machine (computer) interaction, for a specific purpose or in general.

REFERENCES

[1] P. Prashant, A. Joshi, and V. Gandhi, “Brain computer interface: A review,” in 2015 5th Nirma University International Conference on Engineering (NUiCONE), Nov 2015, pp. 1–6.
[2] L. J. Haasbroek, “Advanced human-computer interfaces and intent support: a survey and perspective,” in Proceedings of IEEE Systems Man and Cybernetics Conference (SMC), vol. 4, Oct 1993, pp. 350–355.
[3] T. Tezuka, A. Goto, K. Kashiwa, H. Yoshikawa, and R. Kawano, “A study on space interface for teleoperation system,” in Proceedings of 1994 3rd IEEE International Workshop on Robot and Human Communication, July 1994, pp. 62–67.
[4] V. I. Pavlovic, R. Sharma, and T. S. Huang, “Visual interpretation of hand gestures for human-computer interaction: a review,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 7, pp. 677–695, July 1997.
[5] A. Al-Rahayfeh and M. Faezipour, “Eye tracking and head movement detection: A state-of-art survey,” IEEE Journal of Translational Engineering in Health and Medicine, vol. 1, pp. 2100212–2100212, 2013.
[6] A. O. Pavlovic, N. A. April, and J.-P. V. Belle, “Management issues with cloud computing,” Proc. in ICCC, 2013.
[7] J. Gehlhaar, “The future of mobile computing,” Proc. of the 20th Annual International Conference on Mobile Computing and Networking, 2014.
[8] W. Song, S. Zhao, F. Jiang, K. Zhu, L. Cao, and Y. Shi, “Teleoperation robot control system based on mindband sensor,” in 2012 Fifth International Symposium on Computational Intelligence and Design, vol. 2, Oct 2012, pp. 299–302.
[9] C. Guan, “Brain-computer interface for stroke rehabilitation with clinical studies,” in 2013 International Winter Workshop on Brain-Computer Interface (BCI), Feb 2013, pp. 4–5.
[10] K. Kim, T. Carlson, and S. Lee, “Design of a robotic wheelchair with a motor imagery based brain-computer interface,” in 2013 International Winter Workshop on Brain-Computer Interface (BCI), Feb 2013, pp. 46–48.
[11] E. Akman Aydin, F. Bay, and Güler, “Region based brain computer interface for a home control application,” in 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Aug 2015, pp. 1075–1078.
[12] K. N. Lavanya, D. R. Shree, B. R. Nischitha, T. Asha, and C. Gururaj, “Gesture controlled robot,” in 2017 International Conference on Electrical, Electronics, Communication, Computer, and Optimization Techniques (ICEECCOT), Dec 2017, pp. 465–469.
[13] K. A. Smith, C. Csech, D. Murdoch, and G. Shaker, “Gesture recognition using mm-wave sensor for human-car interface,” IEEE Sensors Letters, vol. 2, no. 2, pp. 1–4, June 2018.
[14] F. Magoulès and Q. Zou, “A novel contactless human machine interface based on machine learning,” in 2017 16th International Symposium on Distributed Computing and Applications to Business, Engineering and Science (DCABES), Oct 2017, pp. 137–140.
[15] J. L. Raheja, R. Shyam, U. Kumar, and P. B. Prasad, “Real-time robotic hand control using hand gestures,” in 2010 Second International Conference on Machine Learning and Computing, Feb 2010, pp. 12–16.
[16] M. Panwar, “Hand gesture based interface for aiding visually impaired,” in 2012 International Conference on Recent Advances in Computing and Software Systems, April 2012, pp. 80–85.
[17] C. P. Quintero, R. T. Fomena, A. Shademan, O. Ramirez, and M. Jagersand, “Interactive teleoperation interface for semi-autonomous control of robot arms,” in 2014 Canadian Conference on Computer and Robot Vision, May 2014, pp. 357–363.
[18] S. Prasad, P. Kumar, and K. P. Sinha, “A wireless dynamic gesture user interface for HCI using hand data glove,” in 2014 Seventh International Conference on Contemporary Computing (IC3), Aug 2014, pp. 62–67.
[19] A. O. Pavlovic, N. A. April, and J.-P. V. Belle, “Appearance based recognition of American Sign Language using gesture segmentation,” Proc. in ICCC, 2013.
[20] P. Chakraborty, P. Sarawgi, G. A. A. Mehrotra, and R. Pradhan, “Hand gesture recognition: A comparative study,” in Proc. of the Int. MultiConf. of Engineers and Computer Scientists, 2008.
[21] H. O. Latif, N. Sherkat, and A. Lotf, “TeleGaze: Teleoperation through eye gaze,” in 2008 7th IEEE International Conference on Cybernetic Intelligent Systems, Sep. 2008, pp. 1–6.
[22] Y. Lim, A. Gardi, N. Ezer, T. Kistan, and R. Sabatini, “Eye-tracking sensors for adaptive aerospace human-machine interfaces and interactions,” in 2018 5th IEEE International Workshop on Metrology for AeroSpace (MetroAeroSpace), June 2018, pp. 311–316.
[23] H. Ho, “Low cost and better accuracy eye tracker,” in 2014 International Symposium on Next-Generation Electronics (ISNE), May 2014, pp. 1–2.
[24] M. Yu, X. Wang, Y. Lin, and X. Bai, “Gaze tracking system for teleoperation,” in The 26th Chinese Control and Decision Conference (2014 CCDC), May 2014, pp. 4617–4622.
[25] D. Gêgo, C. Carreto, and L. Figueiredo, “Teleoperation of a mobile robot based on eye-gaze tracking,” in 2017 12th Iberian Conference on Information Systems and Technologies (CISTI), Lisbon, 2017.
[26] N. Wanluk, S. Visitsattapongse, A. Juhong, and C. Pintavirooj, “Smart wheelchair based on eye tracking,” in 2016 9th Biomedical Engineering International Conference (BMEiCON), Dec 2016, pp. 1–4.
[27] G. Randelli, M. Venanzi, and D. Nardi, “Tangible interfaces for robot teleoperation,” in 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI), March 2011, pp. 231–232.
[28] R. Waranusast, A. Bang-ngoen, and J. Thipakorn, “Interactive tangible user interface for music learning,” in 2013 28th International Conference on Image and Vision Computing New Zealand (IVCNZ 2013), Nov 2013, pp. 400–405.
[29] L. K. Y. Chan and H. Y. K. Lau, “A tangible user interface using spatial augmented reality,” in 2010 IEEE Symposium on 3D User Interfaces (3DUI), March 2010, pp. 137–138.
[30] S. Henderson and S. Feiner, “Opportunistic tangible user interfaces for augmented reality,” IEEE Transactions on Visualization and Computer Graphics, vol. 16, no. 1, pp. 4–16, Jan 2010.
[31] K. Wang, K. You, F. Chen, Z. Huang, and Z. Mao, “Human-machine interface using eye saccade and facial expression physiological signals to improve the maneuverability of wearable robots,” in 2017 International Symposium on Wearable Robotics and Rehabilitation (WeRob), Nov 2017, pp. 1–2.
[32] E. J. Rechy-Ramirez and H. Hu, “Bi-modal human machine interface for controlling an intelligent wheelchair,” in 2013 Fourth International Conference on Emerging Security Technologies, Sep. 2013, pp. 66–70.
[33] J. Tang, Z. Zhou, and Y. Yu, “A hybrid computer interface for robot arm control,” in 2016 8th International Conference on Information Technology in Medicine and Education (ITME), Dec 2016, pp. 365–369.
