EDITalk: Towards designing eyes-free interactions for mobile word processing
Paper in proceedings, 2018

We present EDITalk, a novel voice-based, eyes-free word processing interface. We used a Wizard-of-Oz elicitation study to investigate the viability of eyes-free word processing in the mobile context and to elicit user requirements for such scenarios. Results showed that meta-level operations like highlight and comment, and core operations like insert, delete, and replace are desired by users. However, users were challenged by the lack of visual feedback and the cognitive load of remembering text while editing it. We then studied a commercial-grade dictation application and discovered serious limitations that preclude comfortable speak-to-edit interactions. We address these limitations through EDITalk's closed-loop interaction design, enabling eyes-free operation of both meta-level and core word processing operations in the mobile context. Finally, we discuss implications for the design of future mobile, voice-based, eyes-free word processing interfaces.

Eyes-free interfaces

Barge-in

Eyes-free interaction design

Conversational UI

Voice-based word processing

Authors

Debjyoti Ghosh

National University of Singapore (NUS)

Pin Sym Foong

National University of Singapore (NUS)

S. D. Zhao

National University of Singapore (NUS)

Di Chen

National University of Singapore (NUS)

Morten Fjeld

Chalmers, Computer Science and Engineering, Interaction Design (Chalmers)

Conference on Human Factors in Computing Systems - Proceedings

Vol. 2018-April, Article 403

2018 CHI Conference on Human Factors in Computing Systems, CHI 2018
Montreal, Canada

Subject categories

Interaction Technologies

Human Interaction with ICT

Human-Computer Interaction (Interaction Design)

DOI

10.1145/3173574.3173977

More information

Last updated

2019-01-10