[mythtv-users] Stupid question - deinterlacing

Tom Dexter digitalaudiorock at gmail.com
Sun Feb 22 18:03:59 UTC 2009


On Sun, Feb 22, 2009 at 12:53 PM, Paul Gardiner <lists at glidos.net> wrote:
> Tom Dexter wrote:
>>
>> Yup...anyone with a 1080i native display (like my Hitachi RPCRT) can
>> tell you that it's impossible with the Linux nVidia drivers (anything
>> newer than some ancient version...8183, I think) to get correct 1080i
>> output of 1080i content without enabling deinterlacing in MythTV.  A
>> common symptom on a 1080i display is that playback looks great for 10
>> or 15 seconds and then drifts out of sync, causing unspeakable motion
>> blur.
>
> If that's the only symptom, then it may be just a synchronisation
> problem, in which case it's not specific to nVidia hardware. In my
> experience with SD, it seems to be a general Linux/X problem,
> apparent with all graphics cards. Every system I've tested has
> given the best result with no deinterlacer, but at the price of
> sometimes having the two fields out of sync, and needing to pause
> a few times to get them back in sync. Perhaps the situation is
> worse with HD because the frequency of dropped frames is higher,
> and each dropped frame is a chance for the field synchronisation
> to flip.
>
>
> I know there was an nVidia-specific problem, but I thought that was
> to do with losing one of the fields completely and hence losing
> resolution.
>
> Paul.
>
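
Paul's dropped-frame point is easy to see with a toy model.  This
isn't MythTV code, just a sketch of the mechanism: the display
alternates even/odd scans on its own fixed cadence, so a single
dropped field flips the pairing, and every field after that lands on
the wrong scan until something (like pausing) happens to realign
them:

    /* Toy model of field sync: the display alternates even/odd scans
     * on its own clock; the player must deliver fields in the same
     * order.  Drop one field and every later field is mis-paired
     * until a pause (or similar hiccup) realigns them. */
    #include <stdio.h>

    int main(void)
    {
        int display_parity = 0;  /* 0 = even scan, 1 = odd scan */
        int next_field     = 0;  /* parity of the field being sent */

        for (int t = 0; t < 10; t++) {
            if (t == 4)
                next_field ^= 1;  /* one field dropped: player skips ahead */

            printf("tick %d: display wants %s, player sends %s %s\n",
                   t,
                   display_parity ? "odd " : "even",
                   next_field     ? "odd " : "even",
                   display_parity == next_field ? "" : "<-- mis-paired");

            display_parity ^= 1;  /* display marches on regardless */
            next_field     ^= 1;
        }
        return 0;
    }

Once the parity flips there's nothing to flip it back, which matches
the "looks great for 10 or 15 seconds, then blurs" behaviour, and
explains why pausing a few times can fix it.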

Many people with cards old enough to run the 8183 drivers can
display 1080i perfectly with no software deinterlacing.  Some of them
are running rather outmoded systems for that very reason.  It
certainly seems that nVidia broke something along the way.
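
For what it's worth, the usual way to ask X for native 1080i is to
force an interlaced mode in xorg.conf.  A sketch, using the standard
SMPTE 274M / CEA-861 1080i60 timing (the Identifier names here are
placeholders, and you'd still need your normal Device section):

    Section "Monitor"
        Identifier "HDTV"
        # 1080i60: 74.25 MHz pixel clock, interlaced
        ModeLine "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Monitor    "HDTV"
        SubSection "Display"
            Modes "1920x1080i"
        EndSubSection
    EndSection

Whether the driver actually honours the Interlace flag is exactly
what's at issue here; with the 8183-era drivers it evidently did.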

Tom

