Gaze-Based Control of Robot Arm in Three-Dimensional Space
Other Titles: Blickbasierte Steuerung des Roboterarms in Dreidimensionaler Raum
Authors: Alsharif, Shiva
Supervisor: Gräser, Axel
1. Expert: Gräser, Axel
2. Expert: Anheier, Walter
Abstract:
Eye tracking technology has opened up a new communication channel for people with severely restricted body movements. Such devices have already been applied successfully as human-computer interfaces, e.g. for writing text or for controlling devices such as a wheelchair. This thesis proposes a Human-Robot Interface (HRI) that enables the user to control a robot arm in three-dimensional space using only the two-dimensional gaze direction and the states of the eyes. The proposed interface provides all commands required to translate and rotate the gripper and to open or close it, organized into different control modes. In each mode, different commands are available, and the user's gaze direction is used directly to generate continuous robot commands. To distinguish natural inspection eye movements from eye movements intended to control the robot arm, dynamic command areas are proposed. These command areas are defined around the robot gripper and are updated as the gripper moves. To allow direct interaction, gaze gestures and eye states are used to switch between the control modes.

For the purpose of this thesis, two versions of the above-introduced HRI were developed. In the first version, only two simple gaze gestures and two eye states (closed eyes and eye winking) are used for switching. In the second version, four complex gaze gestures replace the two simple gestures, and the positions of the dynamic command areas are optimized. The complex gaze gestures enable the user to switch directly from the initial mode to the desired control mode; these gestures are flexible and can be generated directly in the robot environment. For the recognition of complex gaze gestures, a novel algorithm based on Dynamic Time Warping (DTW) is proposed. The results of the studies conducted with both HRIs confirmed their feasibility and showed the high potential of the proposed interfaces as hands-free interfaces.
Furthermore, the results of subjective and objective measurements showed that the usability of the interface with simple gaze gestures was improved by the integration of complex gaze gestures and the new positions of the dynamic command areas.
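The abstract mentions a recognition algorithm based on Dynamic Time Warping (DTW). As a rough illustration of the underlying technique (not the thesis's actual algorithm, which additionally handles inactive zones), the following sketch aligns a recorded 2-D gaze trajectory against gesture templates and picks the closest one; the function names, the Euclidean local cost, and the example templates are illustrative assumptions.

```python
# Minimal DTW sketch for matching 2-D gaze trajectories against
# gesture templates. Illustrative only; not the thesis's algorithm.
import math

def dtw_distance(a, b):
    """Return the DTW alignment cost between two (x, y) point sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = cheapest alignment of a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(a[i - 1], b[j - 1])  # local Euclidean cost
            cost[i][j] = d + min(cost[i - 1][j],      # skip a sample in a
                                 cost[i][j - 1],      # skip a sample in b
                                 cost[i - 1][j - 1])  # match both samples
    return cost[n][m]

def classify_gesture(trajectory, templates):
    """Return the name of the template with the lowest DTW cost."""
    return min(templates, key=lambda name: dtw_distance(trajectory, templates[name]))

# Hypothetical templates for two gestures and a noisy rightward trajectory.
templates = {"right": [(0, 0), (1, 0), (2, 0)],
             "down":  [(0, 0), (0, 1), (0, 2)]}
print(classify_gesture([(0, 0), (1, 0.1), (2, 0)], templates))  # right
```

Because DTW warps the time axis, the same gesture drawn faster or slower still aligns with its template, which is what makes it suitable for freely generated gaze gestures.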
Keywords: gaze gesture; eye-gaze interaction; hands-free human-robot interface; 7 degrees of freedom (DOF); robot arm; gaze gesture-based human-robot interface (HRI); gaze-gesture recognition; dynamic time warping (DTW); inactive-zone gaze gesture recognition; complex gaze gestures; dynamic command area; a solution for the Midas touch problem
Issue Date: 1-Mar-2018
URN: urn:nbn:de:gbv:46-00106451-10
Institution: Universität Bremen
Faculty: FB1 Physik/Elektrotechnik
Appears in Collections: Dissertationen
checked on Sep 22, 2020