[mythtv-users] New deinterlacer for perfect image quality when using an interlaced display mode that matches the source

Preston Crow pc-mythtv08a at crowcastle.net
Thu Mar 26 14:47:02 UTC 2009


On Thu, 2009-03-26 at 14:12 +0000, Paul Gardiner wrote:
> >> Preconditions of use
> >> ~~~~~~~~~~~~~~~~~~~~
> >> You need a graphics chip that can output interlaced display modes
> >> correctly.
> >>
> > 
> > Any idea what those might be? Very interested though.

> There's been a lot of talk of nVidia's drivers not supporting
> interlaced output correctly, but I don't know if that applies to just
> VGA, or also TV out. 1080i via HDMI might be fine. Don't really know.

I do 1080i with an nVidia 5200FX card.  That's AGP, not PCIe.  The only
driver that works with this card is version 8776, and even then only
when connecting over DVI (DVI->HDMI in my case) to a TV that advertises
a valid 1080i mode in its EDID.  If you try to specify a 1080i modeline
yourself, it won't work.  I've recently patched that driver to build
against more recent kernels, but it limits me to Xorg 1.3.
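
For illustration, a hand-specified 1080i modeline of the sort I mean
would look something like this in xorg.conf (these are the standard
CEA-861 timings for 1080i at 60 fields/sec; the identifiers are just
example names, and whether a given driver accepts it is a separate
question -- with the 8776 driver it did not):

    Section "Monitor"
        Identifier "TV"
        # 74.25 MHz pixel clock, 2200x1125 total, interlaced = 60 fields/sec
        ModeLine "1920x1080_60i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync
    EndSection

With the EDID route the driver picks the mode up from the TV itself;
the X server log shows which modes it ended up validating.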

The latest drivers have dropped support for my card, so I haven't been
able to test them to see if they've fixed the problem, but I haven't
heard anything encouraging.

The lack of good video drivers for 1080i is the main thing holding me
back from upgrading my MythTV system.  I like my native-1080i TV.


