[mythtv] OpenGL vsync issues

Matt Doran matt.doran at papercut.biz
Mon Jun 4 06:43:56 UTC 2007


Hi there,

I recently had an issue where a DVD had very poor playback - jerky 
(almost as if it were dropping frames) with visible tearing.  When I 
changed the video scan from progressive to interlaced the problem 
disappeared.  I originally thought Myth was incorrectly detecting the 
DVD's scan type, but after some experimentation and help from Stanley 
Kamithi I've come to the conclusion that the problem is with OpenGL 
vsync (and possibly its interaction with the bob deinterlacer?).

I have an nvidia 6150 and I'm using bob deinterlacing.  I also have all 
of the vblank settings disabled in "nvidia-settings".

Below is a summary of the behavior I see with combinations of settings:

    * bob deint, video scan->progressive, opengl vsync enabled:  jerky + tearing
    * bob deint, video scan->interlaced,  opengl vsync enabled:  good
    * bob deint, video scan->progressive, opengl vsync disabled: good
    * bob deint, video scan->interlaced,  opengl vsync disabled: good


Firstly, before I go any further in trying to diagnose/fix this: is it 
worth it?  Is OpenGL vsync the preferred option, and does it provide 
better playback/performance?  If I'm not using OpenGL vsync, I fall 
back to RTC or usleep vsync.  Are there downsides to those fallback 
vsync options?


I spent a few hours on the weekend reading through the code in 
NuppelVideoPlayer.cpp and vsync.cpp, and adding some debugging to 
better understand what's going on.  I can't see anything obviously 
wrong, but given the awful output and obvious tearing, there is 
definitely something wrong.

My current thinking is that the OpenGL vsync only works correctly when 
it has to wait for a single display refresh.  My display is set to 50Hz 
and my video is 25fps, so when bob is enabled the display is updated on 
every refresh.  When the deinterlacer is disabled (e.g. by setting the 
video scan to progressive), the display is only updated every second 
refresh, so the vsync code needs to wait for 2 refreshes.
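
To put numbers on that, here's a quick back-of-the-envelope sketch of my 
assumption about the timing relationship (not MythTV code):

    #include <cstdio>

    int main(void)
    {
        const int refresh_interval = 1000000 / 50;  // 50Hz display -> 20000us per retrace
        const int frame_interval   = 1000000 / 25;  // 25fps video  -> 40000us per frame

        // With bob each frame becomes two fields, one shown per retrace.
        int retraces_bob  = 1;
        // With the deinterlacer off a new image only goes out once per
        // frame, i.e. every second retrace on this display.
        int retraces_prog = frame_interval / refresh_interval;

        printf("bob: %d retrace per update, progressive: %d retraces per update\n",
               retraces_bob, retraces_prog);
        return 0;
    }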

The relevant code is below (vsync.cpp:715 
<http://cvs.mythtv.org/trac/browser/trunk/mythtv/libs/libmythtv/vsync.cpp#L715>).  
It appears to wait for the next retrace, then, if we still need to wait 
(m_delay > 0), calculate the number of remaining refreshes and wait for 
them in a single glXWaitVideoSyncSGI call.

    // Always sync to the next retrace except when we are very late.
    if ((m_delay = CalcDelay()) > -(m_refresh_interval/2))
    {
        err = m_imp->glXWaitVideoSyncSGI(2, (frameNum+1)%2, &frameNum);
        checkGLSyncError(msg1, err);
        m_delay = CalcDelay();
    }

    // Wait for any remaining retrace intervals in one pass.
    if (m_delay > 0)
    {
        uint n = m_delay / m_refresh_interval + 1;
        err = m_imp->glXWaitVideoSyncSGI((n+1), (frameNum+n)%(n+1), &frameNum);
        checkGLSyncError(msg2, err);
        m_delay = CalcDelay();
    }
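
If I'm reading the second block right (and assuming the usual 
GLX_SGI_video_sync semantics, where glXWaitVideoSyncSGI(divisor, 
remainder, &count) blocks until the retrace counter modulo divisor 
equals remainder), my progressive case should work out like this toy 
walk-through (example values are mine, not from a real run):

    #include <cstdio>

    int main(void)
    {
        unsigned int frameNum = 100;     // counter returned by the first wait
        int m_delay = 18000;             // just under one refresh still to wait (us)
        int m_refresh_interval = 20000;  // 50Hz

        unsigned int n = m_delay / m_refresh_interval + 1;  // 0 + 1 = 1
        unsigned int divisor   = n + 1;                     // 2
        unsigned int remainder = (frameNum + n) % (n + 1);  // 101 % 2 = 1

        // So the second call should unblock when the retrace counter next
        // hits 101, i.e. one retrace after frameNum - provided the counter
        // hasn't already moved on while CalcDelay() was running.
        printf("wait until counter %% %u == %u\n", divisor, remainder);
        return 0;
    }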


The logic seems reasonable, and I can't see anything obviously wrong 
(but I don't understand all the intricacies of the video timing).

Does anyone have any suggestions as to why OpenGL vsync results in such 
poor output in these circumstances?  Am I looking in the right area?  
What else could cause poor playback and tearing only when OpenGL vsync 
is enabled?

This code is a bit tricky to debug, because it's called so often and is 
very timing dependent.  It would be interesting to know how the above 
two code blocks behave in both circumstances; a sketch of the sort of 
instrumentation I mean is below.
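
The now_us()/log_wait() helpers here are my own scaffolding, not 
existing MythTV code; the idea is just to record how long each wait 
took and how many retraces actually elapsed:

    #include <sys/time.h>
    #include <cstdio>

    // Microsecond timestamp helper (my own scaffolding).
    static long long now_us(void)
    {
        struct timeval tv;
        gettimeofday(&tv, NULL);
        return (long long)tv.tv_sec * 1000000 + tv.tv_usec;
    }

    // Log how long a wait took and how many retraces elapsed.  Call sites
    // would wrap the two glXWaitVideoSyncSGI calls above, e.g.
    //
    //     unsigned int before = frameNum;
    //     long long t0 = now_us();
    //     err = m_imp->glXWaitVideoSyncSGI(2, (frameNum+1)%2, &frameNum);
    //     log_wait("first wait", t0, before, frameNum, m_delay);
    //
    static void log_wait(const char *which, long long t0,
                         unsigned int before, unsigned int after, int delay)
    {
        printf("%s: %lld us, %u retrace(s), m_delay=%d\n",
               which, now_us() - t0, after - before, delay);
    }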

Should I just give up and use RTC vsync? ;)

Regards,
Matt

