Ontological Context for Gesture Interpretation
Publication date
2023
Authors
Abstract
This study explores gesture interpretation using an ontological context. The aim is to store gestures and scene-context data in an ontology and use its knowledge graph to actuate a robot arm for sets of manipulation tasks across various environments. The knowledge graph captures the relationships between gestures, objects in the scene, and the desired actions. By putting the ontological context to use, the system can understand the meaning behind the gestures and execute the appropriate actions. The paper focuses on the development of the ontology, including the creation of class properties and the embedding of gestures within the ontology. Additionally, it explores how context interpretation drawn from the ontology can enhance the interpretation of gestures. The proposed approach aims to provide more intuitive and adaptive gesture-based supervisory control of robots in general. We evaluated the proposed ontological system in several tests so that it can be used in our future applications.
Keywords
ontologies; HRI; gesture interpretation; hand gestures
Publisher
RWTH Aachen
Institution
Department
Institute
Document type
Conference paper
Journal/Collection
RobOntics 2023 = CEUR Workshop Proceedings, Band 3595
Number of pages
10
Secondary publication
Yes
Document version
Published Version
Language
English
Files
Name
Vanc_Stepanova_Beßler_Ontological Context for Gesture Interpretation_2023_published-version.pdf
Size
4.51 MB
Format
Adobe PDF
Checksum
(MD5):328cdef9dab8913ffc2531152b522c2d
