This demo video presents some features of the interactive virtual reality (VR) visualization we are currently developing at the Department of Media Technology at Linnaeus University (January 2017).
The application allows the exploration of open data in an immersive VR environment and can handle different kinds of (open) data, as long as the data is mapped to a specific data model we defined. In this example, data about the 2016 US presidential election is displayed.

The VR application was developed using Unity. A companion application, developed using openFrameworks, presents an overview of the data displayed in the VR environment. The two applications are connected via the OSC (Open Sound Control) protocol: the companion application shows the VR user's position and field of view, updated live, and a user outside of the VR environment can highlight nodes, which are then also highlighted in the VR environment, facilitating communication between the two users.

This demo video illustrates room-scale VR using the HTC Vive. Two additional interaction modes have been implemented: Oculus Rift with a gamepad, and Oculus Rift with vision-based motion controls using the Leap Motion. A Node.js server and a MongoDB database provide data and visual structures to both the Unity and openFrameworks applications.
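The video does not show the wire protocol itself, but conceptually each side exchanges small OSC messages over UDP. The sketch below illustrates what such an exchange could look like from the Node.js side; the node-osc package, the port numbers, and the address patterns /vr/pose and /companion/highlight are illustrative assumptions, not the project's actual implementation (the real endpoints live in the Unity and openFrameworks applications).

```typescript
// Illustrative sketch only: node-osc, the ports, and the OSC address
// patterns below are assumptions, not the project's actual values.
import { Client, Server } from 'node-osc';

// Receive the VR user's pose updates (position and field of view).
const poseServer = new Server(9000, '0.0.0.0', () => {
  console.log('listening for VR pose updates on :9000');
});
poseServer.on('message', (msg) => {
  // msg is [address, ...args], e.g. ['/vr/pose', x, y, z, yaw, pitch]
  console.log('pose update:', msg);
});

// Ask the VR application to highlight a node for its user.
const vrClient = new Client('127.0.0.1', 9001);
function highlightNode(nodeId: string): void {
  // Hypothetical node identifier; the real data model is not shown here.
  vrClient.send('/companion/highlight', nodeId);
}

highlightNode('state-FL');
```

OSC over UDP is a natural fit here: pose updates are frequent, small, and tolerate occasional loss, so fire-and-forget messages keep the companion view responsive without the overhead of a connection-oriented protocol.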
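Along the same lines, a minimal sketch of how the Node.js server might serve data and visual structures from MongoDB to both clients; Express, the database and collection names, and the endpoint path are assumptions for illustration.

```typescript
// Minimal sketch of the data backend (assumptions: Express as the HTTP
// layer; the database name 'opendata', the collection 'nodes', and the
// /nodes endpoint are illustrative, not the project's actual names).
import express from 'express';
import { MongoClient } from 'mongodb';

const app = express();
const mongo = new MongoClient('mongodb://localhost:27017');

// Serve the mapped data model to both the Unity and the
// openFrameworks application as JSON.
app.get('/nodes', async (_req, res) => {
  const nodes = await mongo
    .db('opendata')
    .collection('nodes')
    .find({})
    .toArray();
  res.json(nodes);
});

async function main(): Promise<void> {
  await mongo.connect();
  app.listen(3000, () => console.log('data server listening on :3000'));
}

main().catch(console.error);
```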
The developed prototype enables us to
1) explore interaction with open data in a virtual reality environment,
2) explore the interaction between two users when one is immersed in a virtual reality environment and the other is not, and
3) compare different interaction modes (gamepad, vision-based motion controls, room-scale VR) with a unified user interface design.
*Update 2019-01-23: A journal article comparing the three input technologies has been published Open Access in Springer's "Virtual Reality": https://doi.org/10.1007/s10055-019-00378-w
[VRxAR Labs]
Aris Alissandrakis
Nico Reski
[web]
vrxar.lnu.se
Linnaeus University
Växjö, Sweden
January 2017