Adding Live Video Stream

To begin our advanced scene setup, we create an osgViewer and add the usual event handlers, as always. In addition we need a root node for our application; this will be the root of the entire scene graph.

osg::ref_ptr<osg::Group> root = new osg::Group;

To add live video to the basic OSG viewer, we first need to load and configure a video plugin, and open the video stream.

In this tutorial we use the ARToolKit video capture plugin. The first step is to preload the plugin (this finds and loads the dynamic library), which gives us a video ID. We can then use this ID to retrieve an instance of the video plugin.

Once we have a valid instance of the video plugin, we can open the video. This will not yet start the video stream but will get information about the format of the video which is essential for the connected tracker.

The Video Stream

int video_id = osgART::PluginManager::instance()->load("osgart_video_artoolkit2");

osg::ref_ptr<osgART::Video> video =
    dynamic_cast<osgART::Video*>(osgART::PluginManager::instance()->get(video_id));

if (!video.valid()) {
    // Without video an AR application cannot work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not initialize video plugin!" << std::endl;
    exit(-1);
}

video->open();

if (osg::ImageStream* imagestream = dynamic_cast<osg::ImageStream*>(video.get())) {
    osgART::addEventCallback(root.get(), new osgART::ImageStreamCallback(imagestream));
}

We now have the live video stream, but need some way to display it within the viewer. The support function createImageBackground takes our video stream and returns a video background node which we can add to the scene graph, as a child of the root node.

In augmented reality applications the video stream should always appear behind everything else. This is achieved by placing the video background node in render bin zero, so it is drawn before the rest of the scene.

Adding the Video Stream to the Scene Graph

osg::ref_ptr<osg::Group> videoBackground = createImageBackground(video.get());

videoBackground->getOrCreateStateSet()->setRenderBinDetails(0, "RenderBin");

root->addChild(videoBackground.get());

createImageBackground

This creates the graphical objects necessary to display the video stream in the viewer. osgART provides the classes VideoLayer and VideoGeode for this.

A VideoLayer is a node that sets up the correct rendering state for displaying the video. The VideoLayer sets up an orthographic projection, disables lighting and disables depth testing.

A VideoGeode sits beneath the VideoLayer and contains the actual textured geometry that is displayed onscreen. Given an osgART::Video, the VideoGeode constructs the geometry (possibly including a mesh to undistort the video), creates and attaches a texture, and sets up the appropriate rendering states.

This method returns a node containing the video graphics, which can be added directly to the scene graph.

osg::Group* createImageBackground(osg::Image* video, bool useTextureRectangle = false) {

    osgART::VideoLayer* layer = new osgART::VideoLayer();

    osgART::VideoGeode* geode = new osgART::VideoGeode(video, NULL, 1, 1, 20, 20,
        useTextureRectangle ? osgART::VideoGeode::USE_TEXTURE_RECTANGLE
                            : osgART::VideoGeode::USE_TEXTURE_2D);

    layer->addChild(geode);

    return layer;
}

Finally …

Start the video stream, then start the simulation loop, then close the video stream once the loop is finished.

video->start();
int r = viewer.run();
video->close();
return r;