Multimodal sensor data fusion methods for infrastructureless head-worn interfaces - Sensor systems for robust and adaptive human-robot collaboration
|Author:||Wöhle, Lukas||Supervisor:||Gebhard, Marion||1st Expert:||Gräser, Axel||2nd Expert:||Lang, Walter||Abstract:||
Human-robot collaboration is becoming increasingly important, especially in the context of rehabilitation robotics, where people use robots to regain autonomy. For this purpose, a variety of approaches to controlling these systems have been developed. A highly intuitive approach is head-motion-based control, which enables precise mapping of 3D control commands onto a system via deliberate head movements.
This thesis presents a system that provides the robustness and adaptivity necessary to control a robotic system by means of head motion. For that purpose, a lightweight, infrastructureless sensor system was developed that can be worn on the head to fully control a robotic system in all degrees of freedom in Cartesian space. The system is modular in both design and data fusion scheme to grant as much adaptivity as possible. The core of the sensor system consists of Magnetic, Angular Rate, and Gravity (MARG) sensors, which are used to determine the orientation of an object in 3D space. The orientation computation is based on the numerical integration of angular rate measurements from a three-axis gyroscope. Unfortunately, Micro-Electromechanical Systems (MEMS) gyroscopes are subject to noise terms that degrade the orientation estimate. To counteract this, MARG sensors are equipped with sensors that provide global reference measurements: an accelerometer and a magnetometer. The accelerometer is used to correct the orientation in the plane perpendicular to gravity, while the magnetometer serves as an electronic compass to correct the remaining axis. This arrangement enables a globally referenced orientation computation. However, magnetometers are subject to interference, which can completely invalidate their use as a reference measurement. To increase robustness against such disturbances, a data fusion process has been developed that compensates for short-term disturbances and allows additional references for error correction to be incorporated without further effort. On this basis, a novel approach was developed that uses the physiological coupling of a human's eye and head rotations to support the MARG sensor's orientation determination during long-term magnetic field perturbations. Experimental data demonstrates that this method reduces the orientation error by up to 50 percent. The use of an eye tracker naturally opens the door to visual methods for orientation determination.
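The gyroscope-plus-accelerometer correction described above can be illustrated with a minimal complementary filter: the gyroscope rate is integrated each step, and the accelerometer's gravity-referenced tilt angle is blended in to bound the drift. This is only an illustrative sketch of the general principle (function names, the blend factor `alpha`, and the single-axis formulation are assumptions, not the thesis's actual fusion scheme):

```python
import numpy as np

def accel_tilt(ax, ay, az):
    """Roll and pitch (rad) recovered from a static accelerometer
    reading of gravity; only valid when external acceleration is small."""
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.sqrt(ay**2 + az**2))
    return roll, pitch

def complementary_update(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One filter step: integrate the gyroscope rate over dt, then blend
    in the accelerometer-derived angle to correct long-term drift."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Device lying flat: gravity along z, so roll = pitch = 0.
roll, pitch = accel_tilt(0.0, 0.0, 9.81)

# A biased gyroscope (0.1 rad/s bias, device actually still): the
# accelerometer reference keeps the estimate from drifting away.
angle = 0.0
for _ in range(1000):
    angle = complementary_update(angle, gyro_rate=0.1, accel_angle=0.0, dt=0.01)
```

With pure integration the estimate would drift by 1 rad over these 10 s; the complementary blend keeps it bounded near the bias-induced steady state (about `alpha * rate * dt / (1 - alpha)` ≈ 0.049 rad). The magnetometer correction for the remaining (yaw) axis works analogously.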
Therefore, within this thesis an open-source visual Simultaneous Localization And Mapping (SLAM) system for RGB-D cameras is integrated into the data fusion process to enable a robust computation of the head pose in space. The data fusion process is designed to dynamically switch between magnetic, inertial, eye-tracking-based and visual reference technologies to enable robust orientation estimation under various perturbations, e.g., gyroscope bias, magnetic disturbances and visual sensor data failure. In addition to sensing head rotation alone, the combination of these sensors and methods enables precise eye- or head-gaze vector control for accurate positioning of a robot's End Effector (EEF) in Cartesian space.
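The idea of dynamically switching between reference technologies can be sketched as a plausibility-gated priority selection: each reference is used only while its data passes a validity check, falling back to pure gyroscope integration when no reference is trustworthy. All names, thresholds, and the priority order here are illustrative assumptions, not the switching logic actually implemented in the thesis:

```python
def select_reference(mag_norm, mag_ref_norm, slam_tracking, gaze_available,
                     mag_tol=0.1):
    """Choose a heading reference for the current fusion step.

    mag_norm       -- magnitude of the current magnetometer reading
    mag_ref_norm   -- expected magnitude of the undisturbed local field
    slam_tracking  -- True while the visual SLAM system reports valid tracking
    gaze_available -- True while eye-tracker data supports the eye-head coupling
    mag_tol        -- relative deviation of the field norm still accepted
    """
    # Magnetic field norm far from the local reference indicates disturbance.
    if abs(mag_norm - mag_ref_norm) / mag_ref_norm <= mag_tol:
        return "magnetometer"
    if slam_tracking:
        return "visual_slam"
    if gaze_available:
        return "eye_tracker"
    # No valid reference: integrate the gyroscope and accept slow drift.
    return "gyro_only"
```

A field-norm check is one simple disturbance indicator; it flags ferromagnetic interference that changes the field magnitude but misses disturbances that only rotate the field vector, which is one reason the thesis combines several complementary references.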
The work concludes with a functional verification of the system in a human-robot workplace, which indicates that the sensor system and methods provide a precise control mechanism for robot teleoperation.
|Keywords:||human-robot collaboration; infrastructureless; head-worn interfaces; IMU; MARG sensor; AHRS; realtime control; sensor data fusion||Issue Date:||31-Jan-2022||Type:||Dissertation||DOI:||10.26092/elib/1401||URN:||urn:nbn:de:gbv:46-elib57496||Institution:||Universität Bremen||Faculty:||Fachbereich 01: Physik/Elektrotechnik (FB 01)|
|Appears in Collections:||Dissertationen|
checked on May 28, 2022
This item is licensed under a Creative Commons License