AMiCUS - Bewegungssensor-basiertes Human-Robot Interface zur intuitiven Echtzeit-Steuerung eines Roboterarmes mit Kopfbewegungen
Other Titles: AMiCUS - Motion Sensor-based Human-Robot Interface for Intuitive Realtime Control of a Robot Arm Using Head Motion
Authors: Rudigkeit, Nina
Supervisor: Gebhard, Marion
1st Expert: Gräser, Axel
2nd Expert: Lang, Walter
Abstract:
Within the work presented here, the demonstrator AMiCUS (Adaptive Head Motion Control for User-friendly Support) has been developed. AMiCUS senses the head motion of a tetraplegic user and uses it to control a robot arm in real time. First, a MEMS-based Attitude Heading Reference System (AHRS) suitable for head control has been chosen. Next, a method has been proposed for attaching the AHRS to the head of a potential user in order to measure their head motion. A control structure has been developed that enables intuitive real-time control of a robot arm in Cartesian space using the three input signals provided by head motion. Since the Range of Motion (ROM) of a potential user can be restricted, the control paradigm can be adapted to the individual user so that their full available ROM is used. Moreover, ways to generate control commands that integrate consistently into the existing control paradigm have been investigated, so that the system can be switched on and off, and switching operations can be performed, using head motion alone. On this basis, the first version of the demonstrator, AMiCUS alpha v1.0, has been realized. To allow safe and efficient operation, AMiCUS alpha v1.0 provides acoustic and visual feedback to the user. In addition, special attention has been paid to keeping the complexity of the implemented algorithms low in order to save resources such as computing and battery power. The strengths and weaknesses of the system have been assessed in a user study with 25 subjects without motion limitations and 6 tetraplegics with severe motion limitations of the head, using both subjective and objective target quantities. It could be shown that AMiCUS alpha v1.0 enables smooth, precise and efficient control of a robot arm for simple manipulation tasks, and that AMiCUS reliably distinguishes between direct control signals and switching commands.
Furthermore, no situation has been observed that put operational safety at risk. Overall, user satisfaction was high. Nevertheless, there were points of criticism, and these have been taken into account in the development of the follow-up version of the demonstrator, AMiCUS alpha v2.0. A tetraplegic user evaluated AMiCUS alpha v2.0 as an exemplary case and compared it to the previous version, AMiCUS alpha v1.0. The direct comparison showed that AMiCUS alpha v2.0 was a subjective improvement over AMiCUS alpha v1.0. Most remaining problems, such as difficulties in visualizing gripper rotations, are likely to be resolved through learning effects. In a semi-realistic scenario, the tetraplegic user's final task was to pour water from a bottle into a glass. As with the other tasks, she was able to complete this task independently. Overall, the results obtained with the demonstrator AMiCUS are promising. Further development into a stand-alone assistive system, or into a complement to a semi-autonomous system, is conceivable.
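The ROM-adaptive control mapping described in the abstract can be illustrated with a minimal sketch. This is not code from the thesis: all names, limits and the dead-zone value are assumptions. It shows the core idea of normalizing a head angle by the individual user's available Range of Motion, so that a restricted ROM still maps to the full command range.

```python
# Illustrative sketch (not from the thesis): mapping one of the three
# head-motion signals from an AHRS (roll, pitch or yaw, relative to a
# neutral pose) to a Cartesian velocity command, with per-user
# Range-of-Motion (ROM) scaling and a small dead zone against
# involuntary motion. All parameter values are assumptions.

from dataclasses import dataclass


@dataclass
class AxisMap:
    rom_deg: float       # user's available ROM for this axis (degrees)
    v_max: float         # maximum Cartesian velocity for this axis
    deadzone_deg: float  # ignore small involuntary head motion

    def command(self, angle_deg: float) -> float:
        """Map a head angle (degrees from neutral pose) to a velocity."""
        if abs(angle_deg) < self.deadzone_deg:
            return 0.0
        # Normalize by the individual ROM so that the user's full
        # available ROM maps to the full command range, then clamp.
        x = max(-1.0, min(1.0, angle_deg / self.rom_deg))
        return x * self.v_max


# Example: a user with a restricted pitch ROM of only 20 degrees
# still reaches the maximum velocity at the edge of their ROM.
pitch = AxisMap(rom_deg=20.0, v_max=0.05, deadzone_deg=2.0)
print(pitch.command(20.0))  # edge of ROM -> full speed
print(pitch.command(1.0))   # inside dead zone -> no motion
```

In a full system, three such mappings (one per AHRS output signal) would feed a Cartesian velocity controller, with the active degrees of freedom selected via the switching commands mentioned above.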
Keywords: Human-Robot Interface, motion sensors, assistive systems, AHRS, realtime control, head control
Issue Date: 14-Sep-2017
URN: urn:nbn:de:gbv:46-00106086-17
Institution: Universität Bremen
Faculty: FB1 Physik/Elektrotechnik
Appears in Collections: Dissertationen
Items in Media are protected by copyright, with all rights reserved, unless otherwise indicated.