Title: Ontological Context for Gesture Interpretation
Authors: Vanc, Petr; Stepanova, Karla; Beßler, Daniel
Issued: 2023
Available: 2025-08-20
Handle: https://media.suub.uni-bremen.de/handle/elib/22288
DOI: https://doi.org/10.26092/elib/4246
URN: urn:nbn:de:gbv:46-elib222887
Pages: 10
Language: en
License: CC BY 4.0 (https://creativecommons.org/licenses/by/4.0/)
Keywords: ontologies; HRI; gesture interpretation; hand gestures
DDC: 000 Computer science, information, general works :: 000 Computer science, knowledge, systems
Type: Text :: Conference publication :: Proceedings :: Conference paper

Abstract: This study explores gesture interpretation using ontological context. The aim is to store gestures and scene-context data in an ontology and use its knowledge graph to actuate a robot arm to perform manipulation tasks in various environments. The knowledge graph captures the relationships between gestures, objects in the scene, and the desired actions. By putting the ontological context to use, the system can understand the meaning behind the gestures and execute the appropriate actions. The paper focuses on the development of the ontology, including the creation of class properties and the embedding of gestures within it. Additionally, it explores how context-specific interpretation drawn from the ontology can be integrated to enhance gesture interpretation. The proposed approach aims to provide more intuitive and adaptive gesture-based supervisory control of robots. We evaluated the proposed ontological system in several tests so that it can be used in our future applications.
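The gesture-to-action lookup the abstract describes can be sketched as a toy knowledge-graph query. This is a minimal illustration in pure Python, not the paper's actual ontology: all class, property, and instance names (`indicatesAction`, `requiresTargetType`, `PointGesture`, etc.) are invented assumptions standing in for whatever the authors' ontology defines.

```python
# Toy sketch: a knowledge graph relating gestures, scene objects, and robot
# actions, where gesture interpretation is a graph lookup constrained by
# scene context. All names below are illustrative, not from the paper.
from typing import Iterator, Optional, Tuple

Triple = Tuple[str, str, str]

# Knowledge graph as subject-predicate-object triples.
KG: set = {
    ("PointGesture", "indicatesAction", "PickUp"),
    ("WaveGesture", "indicatesAction", "Stop"),
    ("PickUp", "requiresTargetType", "Graspable"),
    ("Cup", "isA", "Graspable"),
    ("Table", "isA", "Furniture"),
}

def match(s: Optional[str] = None, p: Optional[str] = None,
          o: Optional[str] = None) -> Iterator[Triple]:
    """Yield triples matching a pattern; None acts as a wildcard."""
    for t in KG:
        if s in (None, t[0]) and p in (None, t[1]) and o in (None, t[2]):
            yield t

def interpret(gesture: str, scene_objects: list):
    """Map a gesture plus scene context to (action, target) pairs."""
    results = []
    for _, _, action in match(gesture, "indicatesAction"):
        required = [o for _, _, o in match(action, "requiresTargetType")]
        if not required:
            results.append((action, None))  # context-free action, e.g. Stop
            continue
        for obj in scene_objects:
            types = {o for _, _, o in match(obj, "isA")}
            if types & set(required):
                results.append((action, obj))
    return results

print(interpret("PointGesture", ["Cup", "Table"]))  # [('PickUp', 'Cup')]
print(interpret("WaveGesture", ["Cup"]))            # [('Stop', None)]
```

The point of the lookup is that the same gesture can resolve to different targets (or no action at all) depending on which objects the scene context supplies, which is the adaptivity the abstract attributes to the ontological approach.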