EDITalk: Towards designing eyes-free interactions for mobile word processing
Paper in proceedings, 2018

We present EDITalk, a novel voice-based, eyes-free word processing interface. We used a Wizard-of-Oz elicitation study to investigate the viability of eyes-free word processing in the mobile context and to elicit user requirements for such scenarios. Results showed that users desire meta-level operations like highlight and comment, and core operations like insert, delete, and replace. However, users were challenged by the lack of visual feedback and the cognitive load of remembering text while editing it. We then studied a commercial-grade dictation application and discovered serious limitations that preclude comfortable speak-to-edit interactions. We address these limitations through EDITalk's closed-loop interaction design, enabling eyes-free operation of both meta-level and core word processing operations in the mobile context. Finally, we discuss implications for the design of future mobile, voice-based, eyes-free word processing interfaces.
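To make the two tiers of operations named in the abstract concrete, the following is a minimal illustrative sketch of how a speak-to-edit front end might dispatch recognized utterances to core operations (insert, delete, replace) and a meta-level operation (highlight). This is not EDITalk's implementation; the function name, command grammar, and highlight markers are all hypothetical.

```python
import re

# Hypothetical dispatcher mapping a recognized utterance to an edit on the
# working text. Command phrasings and markers are illustrative assumptions,
# not the paper's design.
def apply_command(text, utterance):
    """Apply one spoken editing command to the working text."""
    m = re.match(r"replace (.+) with (.+)", utterance, re.IGNORECASE)
    if m:
        return text.replace(m.group(1), m.group(2), 1)  # core: replace
    m = re.match(r"delete (.+)", utterance, re.IGNORECASE)
    if m:
        return text.replace(m.group(1), "", 1)  # core: delete
    m = re.match(r"insert (.+) after (.+)", utterance, re.IGNORECASE)
    if m:
        target = m.group(2)
        return text.replace(target, target + " " + m.group(1), 1)  # core: insert
    m = re.match(r"highlight (.+)", utterance, re.IGNORECASE)
    if m:
        target = m.group(1)
        return text.replace(target, f"[HL]{target}[/HL]", 1)  # meta: highlight
    return text  # unrecognized utterance: leave text unchanged

doc = "the quick brown fox"
doc = apply_command(doc, "replace quick with slow")
doc = apply_command(doc, "highlight fox")
print(doc)  # -> "the slow brown [HL]fox[/HL]"
```

In a closed-loop design like the one the abstract describes, each such edit would be followed by audio feedback confirming what changed, so the user need not rely on visual confirmation.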

Keywords

Eyes-free interfaces

Barge-in

Eyes-free interaction design

Conversational UI

Voice-based word processing

Authors

Debjyoti Ghosh

National University of Singapore (NUS)

Pin Sym Foong

National University of Singapore (NUS)

S. D. Zhao

National University of Singapore (NUS)

Di Chen

National University of Singapore (NUS)

Morten Fjeld

Chalmers University of Technology, Computer Science and Engineering, Interaction Design

Conference on Human Factors in Computing Systems - Proceedings

Vol. 2018-April, Article 403
978-1-4503-5621-3 (ISBN)

2018 CHI Conference on Human Factors in Computing Systems (CHI 2018)
Montreal, Canada

Subject Categories

Interaction Technologies

Human Aspects of ICT

Human Computer Interaction

DOI

10.1145/3173574.3173977

Latest update

1/10/2019