Authors: Bateman, John; Couto-Vale, Daniel
Date accessioned: 2020-03-09
Date available: 2020-03-09
Date issued: 2019-05-06
URI: https://media.suub.uni-bremen.de/handle/elib/1599

Abstract: In this thesis, I aimed to recognise a wheelchair user's intent when issuing commands to an intelligent wheelchair, relying on the meaning of the words chosen, the situation the interactants are in, and the ongoing discourse of the interaction, using only symbolic processing. For this purpose, I created a language-based taxonomy of simple things, locations, and processes that could be integrated into a rule-based understanding module, composed of a speech recogniser, a CCG-based text analyser, trackers of states and changes in the environment, and four mechanisms for integrating contextual features: a material-thing integrator for identifying referents in the surroundings, a figure integrator for ascertaining the participant roles referents should take in described events, a nexus integrator for relating represented events back to the current states of the situation and forward to potential desired states, and a dialogue-move integrator for recognising how an utterance moves the dialogue forward. With this integration mechanism, I achieved a 95% task success rate in an evaluation experiment conducted in a simulated apartment and wheelchair viewed from above.

Language: en
Rights: info:eu-repo/semantics/openAccess
Subjects: Systemic Functional Linguistics (SFL); Combinatory Categorial Grammar (CCG); embodied assistant; intelligent wheelchair; human adult language; spoken commands; understanding
DDC: 410
Title: How to make a wheelchair understand spoken commands
Title (German): Wie man einem Rollstuhl beibringt, gesprochene Befehle zu verstehen
Type: Dissertation
URN: urn:nbn:de:gbv:46-00107390-16
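The understanding module described in the abstract can be sketched as a pipeline: a parsed command is passed through integrators that resolve referents against the tracked situation and relate the desired state to the current one. This is a minimal illustrative sketch only; all class, function, and field names below are invented for illustration and do not come from the thesis.

```python
# Hypothetical sketch of the pipeline from the abstract; every name here is
# an assumption for illustration, not taken from the thesis itself.
from dataclasses import dataclass

@dataclass
class Situation:
    # Stand-in for the trackers of states and changes in the environment:
    # known referents (name -> location) and the wheelchair's current location.
    referents: dict
    wheelchair_at: str

def parse(utterance):
    # Stand-in for the CCG-based text analyser: a toy pattern for "go to the X".
    words = utterance.lower().split()
    if words[:3] == ["go", "to", "the"]:
        return {"process": "go", "destination": " ".join(words[3:])}
    return None

def integrate_thing(figure, situation):
    # Material-thing integrator: resolve the named thing to a referent's location.
    return situation.referents.get(figure["destination"])

def integrate_nexus(location, situation):
    # Nexus integrator: relate the desired state to the current state.
    if location is None:
        return None
    if situation.wheelchair_at == location:
        return ("already-there", location)
    return ("move", location)

def understand(utterance, situation):
    # Roughly the dialogue-move integrator: map the utterance to an action.
    figure = parse(utterance)
    if figure is None:
        return ("unrecognised", None)
    location = integrate_thing(figure, situation)
    return integrate_nexus(location, situation) or ("no-referent", None)

situation = Situation(referents={"kitchen table": "kitchen"}, wheelchair_at="hall")
print(understand("Go to the kitchen table", situation))  # ('move', 'kitchen')
```

The actual module additionally handles participant roles (the figure integrator) and discourse moves; this sketch collapses those steps into the toy parse and the final mapping.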