
Introduction

This tutorial describes how to implement an interface that transitions between augmented and virtual reality, like the famous MagicBook.

In AR applications, the user’s view is anchored to the real world. As the user moves in their environment (e.g. walks down a street) or moves a tracked object (e.g. a marker), the virtual scene elements update to match. In VR, the constraints of physical movement don’t exist, so the user can (potentially) move in any direction, at any speed, for any distance.

Clearly, both AR and VR have their advantages and disadvantages. In AR, the real world provides context and tangibility, and can support face-to-face collaboration. VR, on the other hand, can produce environments that aren’t accessible in reality (e.g. microscopic, very remote, abstract). Typically, an interface designer would choose the appropriate medium (AR or VR) and provide a single interface.

A transitional interface can provide both AR and VR experiences in the same interface by allowing the user to switch between views at will.

Applications for Transitional Interfaces

The original MagicBook used a transitional interface to enhance children’s stories. The reader could turn the pages of a physical book and see each scene pop up in AR, and they could additionally fly down into the scene and experience it from a first-person perspective in VR.

In addition to entertainment, there are other scenarios where a transitional interface makes sense:

Architecture

In architecture, it is important to understand a building both as it stands in its surroundings, as well as how it feels from within. Traditionally, scale models have provided a way to grasp the external setting, and VR walkthroughs can be used to simulate being inside the building. Here, a transitional interface could provide access to enhanced scale models in AR, as well as free navigation in VR.

<File:transition_architecture.png>

Components of the Transitional Interface

In this tutorial we will implement a transitional interface, where the user can smoothly fly between an AR view (a scene displayed on a marker) and a VR view (the scene viewed from a first-person perspective). The transition will be a smooth interpolation between these two end-points.
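One way to implement such an interpolation in OpenSceneGraph (which osgART builds upon) is sketched below: the two endpoints are the view matrices of the AR and VR cameras, and osg::Matrixd::decompose together with osg::Quat::slerp blends them. The function name and parameter layout are illustrative and not part of the osgART API.

    // Sketch: blend two OpenSceneGraph view matrices for the transition.
    // t = 0.0 gives the AR (marker-tracked) view, t = 1.0 the VR view.
    #include <osg/Matrixd>
    #include <osg/Quat>
    #include <osg/Vec3d>

    osg::Matrixd interpolateView(const osg::Matrixd& arView,
                                 const osg::Matrixd& vrView,
                                 double t)
    {
        osg::Vec3d arTrans, vrTrans, scale;
        osg::Quat  arRot, vrRot, scaleOrient;

        // Split each view matrix into its translation and rotation parts.
        arView.decompose(arTrans, arRot, scale, scaleOrient);
        vrView.decompose(vrTrans, vrRot, scale, scaleOrient);

        // Linear interpolation for position, spherical for orientation.
        osg::Vec3d pos = arTrans * (1.0 - t) + vrTrans * t;
        osg::Quat  rot;
        rot.slerp(t, arRot, vrRot);

        return osg::Matrixd::rotate(rot) * osg::Matrixd::translate(pos);
    }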

The user is in AR mode

In AR mode, the virtual scene is displayed on the marker, overlaid on the live video view of the real world. The user controls their viewpoint by moving relative to the marker or by picking up and manipulating the marker directly.
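The sketch below illustrates this idea in plain OpenSceneGraph: the virtual content sits under an osg::MatrixTransform whose matrix is refreshed every frame from the tracked marker pose. MarkerUpdateCallback and getCurrentMarkerPose() are placeholders for this tutorial, not osgART calls; the actual osgART setup is covered in the implementation section below.

    // Sketch of AR mode: the virtual content sits under a MatrixTransform
    // whose matrix is overwritten every frame with the tracked marker pose.
    // getCurrentMarkerPose() is a placeholder, not an osgART call.
    #include <osg/MatrixTransform>
    #include <osg/NodeCallback>
    #include <osg/NodeVisitor>

    extern osg::Matrixd getCurrentMarkerPose();   // assumed tracker hook

    class MarkerUpdateCallback : public osg::NodeCallback
    {
    public:
        virtual void operator()(osg::Node* node, osg::NodeVisitor* nv)
        {
            osg::MatrixTransform* xform =
                static_cast<osg::MatrixTransform*>(node);
            xform->setMatrix(getCurrentMarkerPose());  // content follows marker
            traverse(node, nv);
        }
    };

    // Usage: sceneTransform->setUpdateCallback(new MarkerUpdateCallback);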

The user is in VR mode

In VR mode, the scene ceases to move with the marker. Instead, the user navigates the virtual environment using traditional VR navigation techniques, such as keyboard and mouse control.
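A minimal sketch of this, assuming a standard osgGA camera manipulator for keyboard and mouse control, is shown below. The manipulator’s inverse matrix is the view matrix that acts as the VR endpoint of the transition; the helper names are illustrative.

    // Sketch of VR mode: a standard osgGA manipulator handles keyboard and
    // mouse navigation instead of the marker tracker.
    #include <osgViewer/Viewer>
    #include <osgGA/TrackballManipulator>

    void setupVRNavigation(osgViewer::Viewer& viewer)
    {
        viewer.setCameraManipulator(new osgGA::TrackballManipulator);
    }

    // The manipulator's inverse matrix is the view matrix that serves as
    // the VR endpoint of the transition.
    osg::Matrixd currentVRView(osgViewer::Viewer& viewer)
    {
        return viewer.getCameraManipulator()->getInverseMatrix();
    }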

The user is in a transition

During a transition, the user’s viewpoint gradually moves between an external AR view (determined via marker tracking) and an internal VR view (determined by the user’s navigation in the virtual world).
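The sketch below shows one way to drive such a transition each frame: a parameter t runs from 0 (fully AR) to 1 (fully VR) over a fixed duration, and the blended view matrix from the earlier interpolation sketch is applied to the camera. The TransitionState type and the two-second duration are illustrative choices, not osgART API.

    // Sketch: advance the transition each frame and apply the blended view.
    // interpolateView() is the helper sketched earlier; the two-second
    // duration is an arbitrary choice.
    #include <osg/Camera>
    #include <osg/Matrixd>
    #include <algorithm>

    osg::Matrixd interpolateView(const osg::Matrixd&, const osg::Matrixd&,
                                 double);   // from the earlier sketch

    struct TransitionState
    {
        double t;      // 0 = fully AR, 1 = fully VR
        bool   toVR;   // current direction of the transition
    };

    void updateTransition(TransitionState& state,
                          double frameTimeSeconds,
                          const osg::Matrixd& arView,
                          const osg::Matrixd& vrView,
                          osg::Camera* camera)
    {
        const double duration = 2.0;   // seconds for a full transition
        double step = frameTimeSeconds / duration;
        state.t += state.toVR ? step : -step;
        state.t = std::max(0.0, std::min(1.0, state.t));

        camera->setViewMatrix(interpolateView(arView, vrView, state.t));
    }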

The live video background should fade in when entering the AR view, and fade out when entering the VR view. In VR, the live video is no longer required, as the virtual objects are no longer anchored to the marker. Depending on the scene, it may make sense to add virtual surroundings (such as a skybox) in the VR view.
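One way to tie the fade to the same transition parameter is sketched below in plain OpenSceneGraph: the alpha of a material on the video background node is set to 1 - t. Whether this works as-is depends on how the background quad is drawn; with a different setup, adjusting the quad’s vertex colour alpha achieves the same effect. setVideoBackgroundAlpha and videoBackground are assumptions for this sketch, not osgART calls.

    // Sketch: fade the live video background with the transition parameter.
    // Alpha is 1 in AR and 0 in VR; videoBackground is assumed to be the
    // node (e.g. a textured quad) that draws the camera image.
    #include <osg/Node>
    #include <osg/Material>
    #include <osg/StateSet>

    void setVideoBackgroundAlpha(osg::Node* videoBackground, double t)
    {
        float alpha = static_cast<float>(1.0 - t);   // fade out towards VR

        osg::StateSet* ss = videoBackground->getOrCreateStateSet();

        osg::Material* material = new osg::Material;
        material->setDiffuse(osg::Material::FRONT_AND_BACK,
                             osg::Vec4(1.0f, 1.0f, 1.0f, alpha));

        ss->setAttributeAndModes(material, osg::StateAttribute::ON |
                                           osg::StateAttribute::OVERRIDE);
        ss->setMode(GL_BLEND, osg::StateAttribute::ON);
        ss->setRenderingHint(osg::StateSet::TRANSPARENT_BIN);
    }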

Implementation in osgART