Multimodal and collaborative interaction for visual data exploration
File | Description | Size | Format
---|---|---|---
multimodal-and-collaborative-interaction-for-visual-data-exploration.pdf | Dissertation by Gabriela Molina León | 44.53 MB | Adobe PDF
Authors: Molina León, Gabriela C.
Supervisors: Breiter, Andreas; Isenberg, Petra
1. Expert: Breiter, Andreas
2. Expert: Elmqvist, Niklas

Abstract:

As we generate and encounter vast amounts of data every day, the need to support human-data interaction increases. This dissertation investigates how different interaction modalities and devices can support data experts in visually exploring and making sense of data, individually and collaboratively. Through a series of empirical studies applying mixed methods, I study how experts interact, and wish to interact, with spatio-temporal data on tablets and large vertical displays at the workplace. While data exploration and sensemaking usually take place on a desktop computer, a diverse range of computing devices now provides novel ways of interacting beyond the standard mouse and keyboard input of WIMP (windows, icons, menus, pointers) interfaces. Therefore, I explore how different interaction modalities, such as touch, speech, pen, and mid-air gestures, can support exploratory and sensemaking tasks on interactive surfaces.

The starting point of this dissertation is a visualization design study with social policy researchers, examining data-driven work in the context of a real-world scenario. Through this design study, the dissertation contributes the first formal evaluation of co-creation as a methodology for visualization design, as well as the data and task abstractions derived from the social policy research domain. After defining the design requirements of the domain experts, the dissertation focuses on the interaction design of visualization systems that support their data-driven tasks. A comparative evaluation of visual data exploration on desktop- and tablet-based workplaces revealed that experts apply different interaction strategies to solve similar tasks across devices, making use of different views and interaction techniques.
Following up on the single-user scenario, I examine how pairs of experts interact with visualization systems, and with each other, in the context of co-located, synchronous work. The dissertation presents how experts wish to interact on large vertical displays, based on an elicitation study with touch, pen, speech, and mid-air gestures. The dominance of speech among user preferences led to an in-depth exploratory study on the role of speech in collaborative work. Despite its challenges, speech interaction benefits awareness and is equally present in closely and loosely coupled collaboration. Overall, participants preferred interacting unimodally, changing modalities depending on the task and their distance from the display. The dissertation contributes a characterization of interaction patterns and strategies on multimodal visualization systems, in individual and collaborative scenarios. Based on the findings on performance, user experience, and interaction choices, the dissertation provides a series of design considerations for multimodal and collaborative systems that support data exploration and sensemaking, together with two systems that enable individuals and small groups to perform such tasks.
Keywords: Data Visualization; Multimodal Interaction; Collaborative Work; Interaction Design; Interactive Surfaces; Human-Computer Interaction
Issue Date: 24-Apr-2024
Type: Dissertation
DOI: 10.26092/elib/2967
URN: urn:nbn:de:gbv:46-elib79047
Institution: Universität Bremen
Faculty: Fachbereich 03: Mathematik/Informatik (FB 03)
Appears in Collections: Dissertationen
Page view(s): 249 (checked on Nov 13, 2024)
Download(s): 102 (checked on Nov 13, 2024)
This item is licensed under a Creative Commons License