Mania, Patrick
2025-10-24 · 2025-10-14
https://media.suub.uni-bremen.de/handle/elib/23092
https://doi.org/10.26092/elib/4787

Recent advancements in robotics and computer vision have improved object recognition and control strategies. However, these developments do not fully address the challenges of autonomous manipulation in dynamic, unstructured environments such as households. Current systems often rely on specialized perception algorithms that lack generalizability and cannot verify the plausibility of their own results. This thesis proposes a comprehensive framework that enhances robotic perception and manipulation in dynamic, unstructured environments by integrating a photorealistic, physics-enabled game engine.

The core contributions of this research are threefold. First, it presents a unified perception architecture that combines imagistic reasoning, process-level control, and perception task adaptation within a single system. This architecture enables robots to construct internal hypotheses, simulate expected sensor data, and verify perceptual results against rendered scenes, facilitating grounded and introspective perception in real-world tasks. Second, it introduces a game-engine-based belief representation that uses real-time simulation as an internal model of belief states, enabling high-fidelity visual hypothesis generation. The simulated environment acts as a dynamic world model, including the robot state, allowing the system to assess the plausibility of perceptual results and predict the visual consequences of actions. Third, it proposes Perception Pipeline Trees (PPTs), a modular process model for adaptive perception execution. PPTs combine hierarchical execution with flexible control flow, supporting reactive switching, concurrent processing, and introspective verification. This model accommodates both conventional vision tasks and imagistic reasoning processes within a unified representation.
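The render-and-verify idea above can be illustrated with a minimal sketch: a perceptual hypothesis is rendered from the belief state and compared against the observed camera image, and the hypothesis is accepted only if the two agree sufficiently. All names here (`iou`, `verify`, the overlap threshold) are illustrative assumptions, not the API or metric used in the dissertation.

```python
# Hypothetical sketch: accept a perception hypothesis only if the image
# rendered from the internal belief state matches the observed image.
# Images are toy grayscale rasters (nested lists); the real system would
# compare game-engine renderings against camera data.

Image = list  # list[list[int]], pixel values 0..255


def iou(a: Image, b: Image, thresh: int = 128) -> float:
    """Intersection-over-union of the bright regions of two images."""
    inter = union = 0
    for row_a, row_b in zip(a, b):
        for pa, pb in zip(row_a, row_b):
            fa, fb = pa >= thresh, pb >= thresh
            inter += fa and fb   # both bright
            union += fa or fb    # at least one bright
    return inter / union if union else 1.0


def verify(observed: Image, rendered: Image, min_iou: float = 0.7) -> bool:
    """Plausibility check: does the rendered hypothesis fit the observation?"""
    return iou(observed, rendered) >= min_iou
```

A matching rendering yields an IoU of 1.0 and passes; a rendering whose bright region covers only half of the observed one yields 0.5 and is rejected under the assumed 0.7 threshold.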
The framework demonstrates effectiveness in real-world applications, including household assistance scenarios in which robots recognize and manipulate objects as well as track and interact with humans. By enabling robots not only to observe but also to reason about their environment through simulation, this work advances task adaptability, perception accuracy, and reasoning capability, laying the foundation for the next generation of intelligent, imagination-enabled robots.

Language: English
License: https://creativecommons.org/licenses/by/4.0/
Keywords: Robotics; Perception; Robot Perception; Game-Engine-Belief; Embodied Perception; Perception Framework; Behavior Trees; UIM; Hypothesis Verification
DDC: 000 Computer science, information, general works
Title: Perception for imagination-enabled robots
Type: Dissertation
DOI: 10.26092/elib/4787
URN: urn:nbn:de:gbv:46-elib230921
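Since the abstract relates Perception Pipeline Trees to behavior trees (hierarchical execution, reactive switching, introspective verification), a minimal behavior-tree-style sketch may help fix the idea. The node types, `Status` values, and the toy detectors below are assumptions for illustration, not the dissertation's actual PPT model.

```python
# Hypothetical sketch of a PPT-like pipeline in the spirit of behavior
# trees: nodes are "ticked" top-down; a Fallback node provides reactive
# switching between perception methods, and a final verification step
# checks the result against the (simulated) belief state.
from enum import Enum
from typing import Callable, Dict


class Status(Enum):
    SUCCESS = 1
    FAILURE = 2
    RUNNING = 3


class Node:
    def tick(self, blackboard: Dict) -> Status:
        raise NotImplementedError


class Task(Node):
    """Leaf node wrapping a single perception step (e.g. a detector)."""
    def __init__(self, fn: Callable[[Dict], Status]):
        self.fn = fn

    def tick(self, blackboard):
        return self.fn(blackboard)


class Sequence(Node):
    """Runs children in order; stops at the first non-successful child."""
    def __init__(self, *children: Node):
        self.children = children

    def tick(self, blackboard):
        for child in self.children:
            status = child.tick(blackboard)
            if status != Status.SUCCESS:
                return status
        return Status.SUCCESS


class Fallback(Node):
    """Tries children until one succeeds: reactive method switching."""
    def __init__(self, *children: Node):
        self.children = children

    def tick(self, blackboard):
        for child in self.children:
            status = child.tick(blackboard)
            if status != Status.FAILURE:
                return status
        return Status.FAILURE


# Toy pipeline: try a fast detector, fall back to a slower one, then
# verify the detection against the simulated belief state.
def fast_detector(bb):
    return Status.SUCCESS if bb.get("easy_scene") else Status.FAILURE


def slow_detector(bb):
    bb["detection"] = "cup"  # pretend the slow method always finds a cup
    return Status.SUCCESS


def verify_against_belief(bb):
    return Status.SUCCESS if "detection" in bb else Status.FAILURE


pipeline = Sequence(
    Fallback(Task(fast_detector), Task(slow_detector)),
    Task(verify_against_belief),
)
```

Ticking `pipeline` on an empty blackboard exercises the fallback path: the fast detector fails, the slow detector writes a detection, and the verification step succeeds.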