[mythtv-users] Stupid question - deinterlacing

Paul Gardiner lists at glidos.net
Sun Feb 22 17:53:02 UTC 2009


Tom Dexter wrote:
> Yup...anyone with a 1080i native display (like my Hitachi RPCRT) can
> tell you that it's impossible with the Linux nVidia drivers (newer
> than some ancient version...8183 I think) to get correct 1080i output
> of 1080i content without enabling deinterlacing in MythTV.  A common
> symptom on a 1080i display is that it might look great for 10 or 15
> seconds and then drift out of sync causing unspeakable motion blur.

If that's the only symptom then it may be just a synchronisation
problem, in which case it's not specific to nVidia hardware. In my
experience with SD it seems to be a general Linux/X problem,
apparent with all graphics cards. Every system I've tested has
given the best result with no deinterlacer, but at the price of
the fields sometimes falling out of sync, and of needing to pause
a few times to get them back in sync. Perhaps the situation is
worse with HD because dropped frames are more frequent, and each
dropped frame is a chance for the synchronisation to flip.
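To make that mechanism concrete, here's a minimal sketch (a hypothetical model, not MythTV code): content fields and display refreshes both alternate top/bottom parity, and dropping a single field leaves the two parities swapped until another drop (or a pause/resume) swaps them back.

```python
# Hypothetical model of field-parity drift: fields alternate top/bottom
# (parity 0/1), and the display alternates in lockstep. A dropped field
# shifts the content stream by one, flipping the content/display pairing.

def parity_after_drops(num_refreshes, dropped):
    """Return (display_parity, content_parity) for each display refresh,
    skipping any content field index listed in `dropped`."""
    pairs = []
    content = 0  # index into the incoming content field stream
    for display in range(num_refreshes):
        while content in dropped:
            content += 1  # this field never reaches the screen
        pairs.append((display % 2, content % 2))
        content += 1
    return pairs

# With no drops the parities stay matched; one drop leaves them swapped
# from that point on, which shows as combing/motion blur on a CRT.
aligned = parity_after_drops(6, dropped=set())
skewed = parity_after_drops(6, dropped={2})
print(all(d == c for d, c in aligned))     # True: fields in sync
print(all(d != c for d, c in skewed[2:]))  # True: swapped after the drop
```

In this model a second drop would flip the pairing back, which matches the observed behaviour of pausing and unpausing until the picture looks right again.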


I know there was an nVidia-specific problem, but I thought that was
to do with losing one of the fields completely and hence losing
resolution.

Paul.


