Full text PDF: https://pub.uni-bielefeld.de/publication/2900748
Making information understandable and literally graspable is the main goal of tangible interaction research. By giving digital data physical representations (Tangible User Interface Objects, or TUIOs), the data can be used and manipulated like everyday objects with the users’ natural manipulation skills. Such physical interaction is, however, essentially uni-directional, flowing from the user to the system, which limits the possible interaction patterns: the system has no means to actively support the physical interaction. Within the frame of tabletop tangible user interfaces, this problem was addressed by the introduction of actuated TUIOs, which are controllable by the system. In this thesis, we present the development of our own actuated TUIOs and address multiple interaction concepts we identified as research gaps in the literature on actuated Tangible User Interfaces (TUIs).

Gestural interaction is a natural means for humans to communicate non-verbally using their hands. TUIs should be able to support gestural interaction, since our hands are already heavily involved in the interaction; yet this has rarely been investigated in the literature. For a tangible social network client application, we investigate two methods for collecting user-defined gestures that our system should be able to interpret in order to trigger actions.

Versatile systems often understand a wide palette of commands. Another approach to triggering actions is the use of menus. We explore the design space of menu metaphors used in TUIs and present our own actuated dial-based approach. Rich interaction modalities may support the understandability of the represented data and make the interaction with it more appealing, but they also place high demands on real-time processing.
We highlight new research directions for integrating feature-rich and multi-modal interaction, such as graphical display, sound output, tactile feedback, our actuated menu, and automatically maintained relations between actuated TUIOs within a remote collaboration application.

We also tackle the introduction of more sophisticated measures for the evaluation of TUIs, to provide further evidence for the theories on tangible interaction, and we tested our enhanced measures within a comparative study. Since speed is one of the key factors in effective manual interaction, we benchmarked the human hand’s manipulation speed and compared it with the capabilities of our own implementation of actuated TUIOs and of the systems described in the literature.

After briefly discussing applications that lie beyond the scope of this thesis, we conclude with a collection of design guidelines gathered in the course of this work and integrate them, together with our findings, into a larger frame.