Publication:
Gaze-based virtual task predictor

Publication Date

2014

Language

English

Type

Conference proceeding

Abstract

Pen-based systems promise an intuitive and natural interaction paradigm for tablet PCs and stylus-enabled phones. However, typical pen-based interfaces require users to switch modes frequently to complete ordinary tasks. Mode switching is usually achieved through hard or soft modifier keys, buttons, and soft menus. Frequent invocation of these auxiliary mode-switching elements works against the goal of intuitive, fluid, and natural interaction. In this paper, we present a gaze-based virtual task prediction system that has the potential to alleviate dependence on explicit mode switching in pen-based systems. In particular, we show that a range of virtual manipulation commands that would otherwise require auxiliary mode-switching elements can be issued with an 80% success rate with the aid of users' natural eye gaze behavior during pen-only interaction.
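
To make the described idea concrete, the prediction step can be thought of as a classifier that maps gaze features observed during a pen stroke to an intended virtual command. The Python sketch below is purely illustrative and is not the authors' implementation: the feature set (fixation count, mean fixation duration, gaze-path length, pen-gaze distance), the command labels, and the random-forest model are all assumptions for demonstration.

# Hypothetical sketch: predicting a pen-interaction command from gaze features.
# Features, labels, and classifier choice are illustrative assumptions only;
# the paper's actual features and model may differ.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative gaze features per pen stroke:
# [fixation count, mean fixation duration (ms),
#  gaze-path length (px), mean pen-gaze distance (px)]
X_train = np.array([
    [3, 220.0, 410.0,  35.0],   # gaze anchored near the pen tip
    [7, 180.0, 950.0, 160.0],   # gaze scanning toward a target region
    [5, 300.0, 300.0, 220.0],   # long dwell on an object away from the pen
])
y_train = ["draw", "move", "select"]  # hypothetical virtual-command labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Predict the intended command for a new stroke's gaze features.
print(clf.predict([[6, 200.0, 880.0, 150.0]]))  # e.g. -> ["move"]

In a real system of this kind, the predicted label would select the pen mode implicitly, replacing an explicit button or menu press; the reported 80% success rate suggests such gaze-driven prediction is feasible but not error-free.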

Description

Source:

GazeIn 2014 - Proceedings of the 7th ACM Workshop on Eye Gaze in Intelligent Human Machine Interaction: Eye-Gaze and Multimodality, Co-located with ICMI 2014

Publisher:

Association for Computing Machinery

Subject

Engineering, Electrical and electronic engineering, Telecommunications
