[mythtv-users] new nvidia 352.30 drivers : frame-rate issues
HP-mini
blm-ubunet at slingshot.co.nz
Sat Aug 8 21:24:52 UTC 2015
On Sat, 2015-08-08 at 18:14 +1000, mo.ucina wrote:
> IMO you don't need any interlaced video modes, your video card can
> output (& out perform the TVs) de-interlaced video.
>
<snip>
>
> Thanks for the info. That is very interesting. My issue has been
> resolved by falling back to the 352.21 nvidia drivers. All the frame
> rates now work well with mythfrontend, and as I change from one file
> to another and from one program to another, the correct frame rate is
> chosen in every case. I can also confirm there is no visible judder
> any more.
>
> But this modeline stuff opens another interesting topic of
> conversation. I am using JY Avenard's modeline settings, as he has the
> exact same TV as me. But out there I have seen a couple of different
> sets of modelines for the same model, as well as the one suggested
> here. What effect would be observed if I were to change the modeline
> timing from my current one, which is:
>
> ModeLine "1920x1080@60"     148.500 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync
> ModeLine "1920x1080@50"     148.500 1920 2448 2492 2640 1080 1084 1089 1125 +hsync +vsync
> ModeLine "1920x1080@24"      74.250 1920 2558 2602 2750 1080 1084 1089 1125 +hsync +vsync
> ModeLine "1920x1080@23.976"  74.175 1920 2558 2602 2750 1080 1084 1089 1125 +hsync +vsync
>
> to the one above.
>
> Best Regards
> Milorad
>
Those video modes I posted (except the 48Hz?) are included in the
internal mode pool of every video card/driver; you don't need them.
Normally a modeline is calculated either to minimize the pixel clock
frequency (to reduce EMI/power) or, conversely, to maximize resolution
for a given clock frequency.
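As a quick sanity check (my own sketch, not from the original post), the
refresh rate implied by a modeline is pixel_clock / (htotal * vtotal).
Plugging in the totals from the quoted modelines:

```python
# Refresh rate implied by a ModeLine: pixel clock / (htotal * vtotal).
# The numbers below are taken from the quoted 1920x1080 modelines:
# pixel clock in MHz, then the last value of the horizontal and
# vertical timing groups (htotal and vtotal).

def modeline_refresh(pclk_mhz, htotal, vtotal):
    """Return the vertical refresh rate in Hz for a modeline."""
    return pclk_mhz * 1e6 / (htotal * vtotal)

print(modeline_refresh(148.500, 2200, 1125))            # 60.0
print(modeline_refresh(74.250, 2750, 1125))             # 24.0
print(round(modeline_refresh(74.175, 2750, 1125), 3))   # 23.976
```

This is also why two different modelines for the same panel can both be
"correct": as long as the totals and clock combine to the same refresh
rate and the scaler tolerates the blanking intervals, the picture is
unchanged.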
Your old TV might require CRT-style modelines without the (60Hz)
reduced blanking; at least one recent Philips LCD model requires this.
It is unfortunate that MythTV shows so much visible judder when the
refresh rate and frame rate are mismatched. I use (and submitted a
ticket and patch for) a change that reduces (but does not fix) this
issue; with modern TVs it should be irrelevant.
Video modelines define a self-clocking signal that is designed to
tolerate timing drift, errors, etc. How much tolerance a DFP scaler has
for custom modelines (within reason) is entirely a matter of its
firmware and the bandwidth of the link.
Old analogue displays used PLLs to lock/synchronize, so they were
designed with a specific capture band.
You would need to study the /var/log/Xorg.0.log file to be sure which
video modelines are being accepted/used.
The nvidia video driver changes its acceptable keywords & their
meanings from time to time; either that, or it's a bug.
You can add the line:
Option "ModeDebug" "True"
to xorg.conf when using the nvidia driver.
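For reference, that option goes in the Device section of xorg.conf
(the Identifier name below is just an example; only the Option line is
what matters):

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    Option     "ModeDebug" "True"
EndSection
```

With this set, the driver logs its mode validation decisions for every
modeline to Xorg.0.log, which makes it easy to see why a custom mode
was accepted or rejected.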