Inertial Motion Capturing : Rigid Body Pose and Posture Estimation with Inertial Sensors
Other Titles: Inertial Motion Capturing : Starrkörperposen- und Körperhaltungsschätzung mit Inertialsensoren (Inertial Motion Capturing : Rigid Body Pose and Posture Estimation with Inertial Sensors)
Authors: Wenk, Felix
Supervisor: Frese, Udo
1. Expert: Frese, Udo
2. Expert: Lawo, Michael

Abstract:
This dissertation is about estimating poses, i.e. orientations and positions, from inertial sensor data. It covers both poses of single rigid bodies and poses of so-called skeletons, i.e. systems of jointed rigid bodies. The key insight into orientation estimation of a single rigid body is to view it as the fusion of sensor data and its dynamics model with prior information. To this end, three Kalman filter variations are presented, which fuse the same sensor data and the same dynamics with three different priors. It turns out that the classical way to correct the inclination in an orientation estimator, namely comparing the accelerometer measurement with (negative) gravity, is equivalent to assuming that the rigid body does not accelerate on long-term average. Assuming that the velocity is zero on long-term average, or that the rigid body stays at the same position on long-term average, are alternative priors, and both also yield orientation estimators. Moreover, the estimator resulting from the position prior also estimates a position that is locally accurate (it follows the accelerometer measurements) but does not drift unboundedly, as it would if the position were obtained by integrating the dynamics model alone. The focus here is on the interplay of inertial sensor data and its dynamics model with prior information rather than on practical applications. For instance, for the integrated position to be a usable quantity, the estimate has to be conditioned on the long-term average of the position being zero instead of the velocity or acceleration being zero.

In the second, larger part of this dissertation, the posture of a skeleton, i.e. the poses of all the skeleton's bodies, is estimated, again using inertial sensor data only. Notably, no magnetometers are used to recover the rotations around the vertical.
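The equivalence between gravity-based inclination correction and the zero-mean-acceleration prior can be illustrated with a minimal complementary-filter sketch (not the dissertation's actual Kalman filter): the gyroscope propagates the gravity direction in the body frame, and each accelerometer sample pulls the estimate slightly toward the measured specific-force direction. The function names and the blending gain `alpha` are illustrative choices, not taken from the thesis.

```python
import numpy as np

def rot(axis_angle):
    """Rotation matrix from an axis-angle vector (Rodrigues' formula)."""
    angle = np.linalg.norm(axis_angle)
    if angle < 1e-12:
        return np.eye(3)
    k = axis_angle / angle
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def step(g_b, gyro, accel, dt, alpha=0.02):
    """One filter step for the gravity ('up') direction g_b in the body frame.

    gyro: angular velocity [rad/s]; accel: specific force [m/s^2].
    """
    # Predict: gravity is constant in the world frame, so expressed in the
    # body frame it rotates opposite to the body's angular velocity.
    g_b = rot(-gyro * dt) @ g_b
    # Correct: under the zero-mean-acceleration prior, the accelerometer
    # measures (negative) gravity, so nudge the estimate toward the
    # normalized accelerometer direction.
    a_dir = accel / np.linalg.norm(accel)
    g_b = (1.0 - alpha) * g_b + alpha * a_dir
    return g_b / np.linalg.norm(g_b)
```

With `alpha=0` the filter reduces to pure gyro integration, which drifts; the accelerometer term bounds the inclination error exactly because of the prior described above.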
Without magnetometers, the rotation of the skeleton as a whole around the vertical can, of course, not be estimated. For assessing the skeleton's posture, however, it is also unimportant. If the inertial sensor data of all bodies is fused with the prior information that a skeleton's bodies are connected by hinge and spherical joints, the relative orientations of the bodies become completely observable: if two accelerometers on two jointed bodies measure the acceleration of a motion, the relative orientation of those two bodies can be recovered from the directions of the accelerometer measurements, provided effects due to movements of the joints are compensated for. The posture estimator that exploits this insight is developed and used in the sensor suit SIRKA, workwear with inertial sensors embedded in the clothing. On computationally very limited hardware, which is completely integrated into the suit, the estimator yields posture estimates in real time. To make this possible, a technique to decouple the sensors' sampling rate from the estimation rate is introduced. Moreover, the sensor orientations and positions inside the suit are almost arbitrary and do not need adjustment; instead, they are calibrated automatically. The motion capturing workwear is used in a real-world setting, estimating the posture of a worker welding steel at a shipyard. That would not be possible with a motion capturing suit relying on magnetometers.
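The observability argument can be sketched as follows: once joint-motion effects are compensated, the two accelerometers observe the same acceleration vector, each expressed in its own body frame, so the relative orientation is the rotation that best aligns the two sets of directions. This is Wahba's problem; a common closed-form solution uses the SVD. The sketch below is an assumption-laden illustration of that alignment step, not the dissertation's estimator, which fuses such measurements in a Kalman filter.

```python
import numpy as np

def relative_orientation(a1, a2):
    """Estimate the rotation R with a1[i] ~ R @ a2[i] (Wahba's problem).

    a1, a2: (N, 3) arrays of the same accelerations, already compensated
    for joint motion, expressed in the frames of body 1 and body 2.
    """
    # Attitude profile matrix accumulating all direction pairs.
    B = a1.T @ a2
    U, _, Vt = np.linalg.svd(B)
    # Sign correction ensures a proper rotation (det = +1), not a reflection.
    d = np.sign(np.linalg.det(U @ Vt))
    return U @ np.diag([1.0, 1.0, d]) @ Vt
```

At least two non-collinear acceleration directions are needed; with purely static measurements only the inclination part of the relative orientation is determined, which is why the motion itself is essential to the argument.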
Keywords: Inertial Sensors, Kalman Filter, Sensor Fusion, Motion Capturing, Sensor Suit, Pose Estimation, Posture Estimation
Issue Date: 3-Feb-2017
Type: Dissertation
URN: urn:nbn:de:gbv:46-00105731-10
Institution: Universität Bremen
Faculty: FB3 Mathematik/Informatik
Appears in Collections: Dissertationen
checked on Jan 16, 2021