Author: Kenghagho Kenfack, Franklin
Title: Prospective perception through cognitive emulation for robot manipulation tasks: "Perceiving like humans do"
Type: Dissertation
Dates: 2025-10-24; 2025-10-07
Language: English
Handle: https://media.suub.uni-bremen.de/handle/elib/23082
DOI: https://doi.org/10.26092/elib/4781
URN: urn:nbn:de:gbv:46-elib230821
License: CC BY 4.0 (https://creativecommons.org/licenses/by/4.0/)
Keywords: Prospection; Robot Perception; Cognitive Emulation; Perceptual Dark Matter; Robot Manipulation
Classification (DDC): 000 Computer science, information, general works :: 000 Computer science, knowledge, systems

Abstract: This thesis argues that bottom-up theories of perception suffer from the high semantic entropy arising from the severe spatial, temporal, and informational limitations of sensory input during everyday manipulation tasks. However, by emulating the "dark matter" of perception — including intent, functionality, utility, causality, and physis — and integrating it with sparse sensory data, robotic perception can achieve a causal, transparent, and computationally efficient ability to anticipate and explain relevant events in such tasks. To this end, the thesis introduces Probabilistic Embodied Scene Grammars (PESGs) to formalize this perceptual "dark matter." It also presents a generator and a parser to anticipate and explain event-centric scenes, respectively. The approach is demonstrated in complex real-world scenarios, including household tasks such as pancake making in kitchen environments, shopping tasks in supermarkets, and sterility testing tasks in medical laboratories.