[mythtv-users] judder problems with nvidia 358.16

Alex Halovanic halovanic at gmail.com
Wed Aug 16 00:08:03 UTC 2017


Hi Mark,

I should clarify that I did *not* change the algorithm that MythTV is using
to try to select the best refresh rate for a video's FPS when you use the
Auto setting.  I only changed the lookup of what "Width + Height + Refresh"
combinations are *available*. You can see these options when selecting
anything other than Auto in the menu.  As part of that I also necessarily
had to change how one switches to a given modeLine (whether chosen by hand
or automatically).

The reason I did not have to change the Auto refresh rate algorithm is that
it already took into account more precise decimal values.  The actual
difficulty was that the most straightforward XRandr API implementation for
looking up and setting refresh rates only used integers (for that matter,
so does the OS X one, I believe). This resulted in:

   - Not having a complete list of rates available (59.94 and 60.00 both
     appear rounded to 60).
       - This was the case for all non-NVIDIA drivers.
   - Or having fake (but distinct) values that involved quite a bit of
     guesswork to figure out what they really were, both by MythTV and by
     users in the Settings screen.
       - This was NVIDIA, but the guesswork MythTV was able to use broke
         somewhere along the way.

For those interested in what the actual "best rate" algorithm that I did
not touch does: it mostly involves looking for an available refresh rate
(for a given screen size) that seems "close enough". This is likely because
frames per second is not a precise measure and fluctuates a bit. It does
prioritize a refresh rate that is double the frame rate for videos in the
24.5 to 30.5 FPS range.  This is because an interlaced video will come out
to something like ~29.97 FPS but needs a refresh rate of 59.94 to display
alternating frames, whether that is 59.94i or 59.94p. The progressive
option is usually the better choice since the interlaced mode will only
work if the resolution is also identical AND the TV has a good
deinterlacer.  But, near as I can tell, MythTV does not currently make any
comparisons between interlaced and non-interlaced modes: there are no flags
passed around indicating that the mode is interlaced, and for that matter,
no indication that the video itself was, beyond some guesses around those
FPS ranges.

It's quite likely there is some room for improvement in the auto-matching
algorithm as well. I didn't want to bite off too much on this attempt.  For
those who want a closer look, it is implemented here
<https://github.com/MythTV/mythtv/blob/master/mythtv/libs/libmythui/DisplayResScreen.cpp#L119>
.

Thanks,
Alex