openscenegraph / OpenSceneGraph

OpenSceneGraph git repository

Home Page: http://www.openscenegraph.org

Problem with trying to stream an image

minghia opened this issue · comments

commented

I'm trying to write an application that takes a feed of our own encapsulation of a JPEG frame and renders it onto a texture. I have tried a few things, such as adding a DrawCallback and updating the texture prior to rendering the geometry. I also modified the movie streaming example from the osgRecipes repo. The issue I am seeing is that every so often the frame goes black without any indication of an error.

I have also tried to run the decoder thread on a different CPU, but to no avail. One of my colleagues wrote a similar program based on the PBO examples at http://www.songho.ca/opengl/gl_pbo.html. Running the source of the stream and the modified PBO program on the same machine with an Nvidia 660, there is no loss of video. Using an image with a PBO, as in the Cookbook example or in my own attempts, produces image instability. The weird thing is that I set the clear colour to red, yet the random chunks appearing in the image are still black.

Does anyone have any thoughts?

As we don't have your code to know what you've implemented, the best we can do is point you at existing examples that do something similar. The OpenSceneGraph/src/osgPlugins/ffmpeg plugin provides an example of an ImageStream being populated by a CPU thread, and is where I'd start.

commented

Hi Robert,
I started with this code:
```cpp
/* --c++-- OpenSceneGraph Cookbook
 * Chapter 5 Recipe 2
 * Author: Wang Rui
 */

#include <osg/ImageStream>
#include <osg/Geometry>
#include <osg/Geode>
#include <osgDB/ReadFile>
#include <osgViewer/Viewer>

#include "CommonFunctions"

int main( int argc, char** argv )
{
    osg::ArgumentParser arguments( &argc, argv );
    JPegVideoInput* input = new JPegVideoInput();
    int port = 34567;

    osg::ref_ptr<osg::Image> image = input->getImage();
    input->initialise( "224.1.5.3", port );  // initialises the socket and
                                             // starts the reader thread

    osg::ImageStream* imageStream = dynamic_cast<osg::ImageStream*>( image.get() );
    if ( imageStream ) imageStream->play();

    osg::ref_ptr<osg::Texture2D> texture = new osg::Texture2D;
    texture->setImage( image.get() );

    osg::ref_ptr<osg::Drawable> quad = osg::createTexturedQuadGeometry(
        osg::Vec3(), osg::Vec3(1.0f, 0.0f, 0.0f), osg::Vec3(0.0f, 0.0f, 1.0f) );
    quad->getOrCreateStateSet()->setTextureAttributeAndModes( 0, texture.get() );

    osg::ref_ptr<osg::Geode> geode = new osg::Geode;
    geode->addDrawable( quad.get() );

    osgViewer::Viewer viewer;
    viewer.setSceneData( geode.get() );
    return viewer.run();
}
```

I replaced the assignment of the image with a call to my class, which handles the messages from my socket and uses the OSG JPEG ReaderWriter to convert each incoming JPEG into an osg::Image; that result is used to set the internal copy of the image returned as the image variable in the code above. The image in JPegVideoInput is created as follows:
```cpp
// Declared as
osg::ref_ptr<osg::Image> _image;

// Initialised by
_image = new osg::Image;
_image->setPixelBufferObject( new osg::PixelBufferObject(_image.get()) );
```

The image breaking up doesn't happen all the time, but once it happens it happens a lot.

I don't know anything about JPegVideoInput(), so how it updates the image is probably where the issues are. One thing to be aware of when a writing thread and a rendering thread work on the same image data is that the data can end up being copied in an incomplete state.

This issue is tackled in the ffmpeg plugin by double buffering the image data and doing a pointer swap between the two buffers, so that the reading thread and the writing thread never contend on the same block of data at the same time.
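To illustrate the scheme described above, here is a minimal sketch of double buffering with a pointer swap. It deliberately uses plain `std::vector` buffers and a `std::mutex` rather than osg::Image, so it shows only the synchronisation idea, not the actual ffmpeg plugin code; the class and method names are hypothetical.

```cpp
#include <cstddef>
#include <mutex>
#include <vector>

// Hypothetical double-buffered frame holder (not OSG code).
// The decoder thread always writes into the "back" buffer; the render
// thread only ever reads the "front" buffer. A short lock guards only
// the swap, so neither thread ever sees a half-written frame.
class DoubleBufferedFrame
{
public:
    explicit DoubleBufferedFrame(std::size_t size)
        : _front(size, 0), _back(size, 0) {}

    // Called from the decoder thread with one fully decoded frame.
    void writeFrame(const unsigned char* data, std::size_t size)
    {
        _back.assign(data, data + size);       // fill back buffer completely
        std::lock_guard<std::mutex> lock(_mutex);
        _front.swap(_back);                    // cheap pointer swap under the lock
        _dirty = true;
    }

    // Called from the render thread; copies out the latest complete frame.
    // Returns false if no new frame has arrived since the last read.
    bool readFrame(std::vector<unsigned char>& out)
    {
        std::lock_guard<std::mutex> lock(_mutex);
        if (!_dirty) return false;
        out = _front;
        _dirty = false;
        return true;
    }

private:
    std::vector<unsigned char> _front, _back;
    std::mutex _mutex;
    bool _dirty = false;
};
```

In an OSG setting, the render-side read would typically happen in an update or draw callback, copying the front buffer into the osg::Image and calling `dirty()` on it so the texture re-uploads.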

commented

It took me a while to understand the double buffering but I think it is now behaving itself. Thanks for your advice.