Interactive navigation interface for Virtual Reality using the human body
Journal article, 2014
The use of Virtual Reality (VR) and interactive real-time rendering in urban planning and building design is becoming increasingly common. However, the integration of desktop VR into the urban planning process suffers from complicated navigation interfaces. In particular, people unfamiliar with gaming environments and computers are less inclined to interact with a VR visualization using a keyboard and mouse as controlling devices. This paper addresses this issue by presenting an implementation of the Xbox 360 Kinect sensor system, which uses the human body to interact with the virtual environment. This type of interaction interface enables a more natural and user-friendly way of interacting with the virtual environment. The system was validated with 60 participants using quantitative and qualitative methods. The results showed that participants perceived the interface as non-demanding and easy to use, and rated it more favorably than mouse/keyboard interaction. The implemented interface enabled users to switch between different architectural proposals of an urban plan, and this switching positively affected the participants' learning, understanding, and spatial reasoning. The study also shows that female participants perceived the system as less demanding than male participants did. Furthermore, the users associated and related their body (the human interaction interface) to VR, which could indicate that they used their body during spatial reasoning. This type of spatial reasoning has been argued to enhance spatial perception.
Natural User Interface