Levitation with localised tactile and audio feedback for mid-air interactions (Levitate)
Research Project, 2017–2020

This project will be the first to create, prototype, and evaluate a radically new human-computer interaction paradigm in which users reach into levitating matter to see, feel, manipulate, and hear it. Users can interact with the system in a walk-up-and-use manner, without any instrumentation.

As interaction moves away from keyboards and mice towards touch and touchless input, the main limitation, ironically, is the lack of physicality and co-located feedback. In this project, we propose the novel vision of bringing the physical interface to the user in mid-air. In our vision, the computer controls the existence, form, and appearance of complex levitating objects composed of "levitating atoms". Users can reach into the levitating matter, feel it, manipulate it, and hear how they deform it, with all feedback originating from the object's position in mid-air, just as it does with real objects. This will fundamentally change how people use technology, as it will be the first time they can interact with technology in the same way they interact with real objects in their natural environment.

We will draw on our understanding of acoustics to implement all of these components in a radically new approach. In particular, we will use ultrasound beam-forming and manipulation techniques to create acoustic forces that levitate particles and to provide directional audio cues. With a phased array of ultrasound transducers, the team will create levitating objects that can be controlled individually while also producing tactile feedback when the user manipulates them. We will then demonstrate that each levitating atom can become a sound source through parametric audio, with the ultrasound array serving as the carrier of the audible sound. Finally, we will project visual content onto the objects to create a rich multimodal display floating in space. A minimal sketch of the underlying phased-array principle is given after this paragraph.
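The project text does not specify an implementation, but the core phased-array idea can be illustrated with a short sketch: drive every transducer with a phase delay so that all emissions arrive in phase at a chosen point (a focus, usable for tactile feedback), and add a pi phase step across the array to turn that focus into a simple levitation trap. The 40 kHz carrier, the 16 x 16 array with 10.5 mm pitch, and the twin-trap signature below are illustrative assumptions, not parameters taken from the project.

    import numpy as np

    SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C
    FREQUENCY = 40_000.0     # Hz; a common choice for airborne ultrasound (assumed)
    WAVENUMBER = 2 * np.pi * FREQUENCY / SPEED_OF_SOUND

    def transducer_grid(n=16, pitch=0.0105):
        """Positions of an n x n planar array in the z = 0 plane, in metres.
        Both the array size and the 10.5 mm pitch are assumed values."""
        coords = (np.arange(n) - (n - 1) / 2) * pitch
        x, y = np.meshgrid(coords, coords)
        return np.column_stack([x.ravel(), y.ravel(), np.zeros(n * n)])

    def focus_phases(positions, focal_point):
        """Phase delay per transducer so that all emissions arrive in phase
        at focal_point, creating a high-pressure focus."""
        distances = np.linalg.norm(positions - focal_point, axis=1)
        return np.mod(-WAVENUMBER * distances, 2 * np.pi)

    def twin_trap_phases(positions, trap_point):
        """Add a pi phase step between the two halves of the array, splitting
        the focus into two lobes with a pressure null in between, where a
        small, light bead can be held."""
        phases = focus_phases(positions, trap_point)
        phases[positions[:, 0] > 0] += np.pi   # left/right half of the array
        return np.mod(phases, 2 * np.pi)

    if __name__ == "__main__":
        array = transducer_grid()
        bead = np.array([0.0, 0.0, 0.10])      # trap a bead 10 cm above the array
        print(twin_trap_phases(array, bead).reshape(16, 16).round(2))

Moving the trap point smoothly over time moves the bead, and amplitude-modulating the ultrasound carrier at audible frequencies, which the air demodulates through nonlinear propagation, is the principle behind the parametric audio mentioned above.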

Participants

Jens Ahrens (contact)

Chalmers, Architecture and Civil Engineering, Applied Acoustics

Carl Andersson

Chalmers, Architecture and Civil Engineering, Applied Acoustics

Collaborations

Aarhus University

Aarhus, Denmark

Ultrahaptics Limited

Bristol, United Kingdom

University of Bayreuth

Bayreuth, Germany

University of Glasgow

Glasgow, United Kingdom

University of Sussex

Brighton, United Kingdom

Funding

European Commission (EC)

Project ID: EC/H2020/737087
Funds Chalmers' participation during 2017–2020

Related Areas of Advance and Infrastructure

Information and Communication Technology (Areas of Advance)

Innovation and entrepreneurship (Driving Forces)


Latest update

9/2/2020