# Interaction Tutorials

## Introduction

In this tutorial, we will add interactivity based entirely on the spatial relationships of the AR targets.

## Proximity Based Interaction - Target to Camera

As shown in earlier tutorials, accurate tracking information is required in order to display a 3D object on a target. However, in addition to displaying objects, we can use the tracking information as input for controlling the behaviour of the application. One of the simplest ways to achieve this is to monitor the distance of a target from the camera.

The proximity of a target to the camera can be used to create some compelling effects. For example, if the user leans in close to the target, new content can be activated. If they move away from the target then details can be hidden to keep their view uncluttered.

This process has many similarities to level-of-detail techniques common in computer graphics. The basic idea is to render objects at lower detail (and therefore less expensively) when the viewer is far off, and only to render high detail when the viewer is close enough to notice.

The following diagram illustrates how we want the scene to appear. We partition the distance from the camera into ranges, and assign a different scene graph node to each range. A node is only visible when the camera is within the range defined for that node.

### Setup

The basic scene setup process remains the same as in previous tutorials, so that code has been omitted. Again, we will track a single target:

```cpp
osg::ref_ptr<osg::MatrixTransform> mt = scene->addTrackedTransform("single;data/artoolkit2/patt.hiro;80;0;0");
```


### Level Of Detail

OSG provides support for level-of-detail rendering through an osg::LOD node. This node allows us to specify which child node will display for a particular viewer distance range.

Next we introduce the LOD node into the scene graph, and add a different model for each distance range.

```cpp
osg::ref_ptr<osg::LOD> lod = new osg::LOD();
```