[mythtv-users] New deinterlacer for perfect image quality when using an interlaced display mode that matches the source

Paul Gardiner lists at glidos.net
Thu Mar 26 15:51:38 UTC 2009


Preston Crow wrote:
> I do 1080i with an nVidia 5200FX card.  That's AGP, not PCIe.  The only
> driver that works with this is version 8776, and it only works when
> connecting with DVI (using DVI->HDMI in my case) to a TV that puts out a
> valid EDID 1080i mode.  If you try to specify a 1080i modeline, it won't
> work.  I've recently patched it to work with more recent kernels, but it
> limits me to Xorg-1.3.
> 
> The latest drivers have dropped support for my card, so I haven't been
> able to test them to see if they've fixed the problem, but I haven't
> heard anything encouraging.
> 
> The lack of good video drivers for 1080i is the main thing holding me
> back from upgrading my MythTV system.  I like my native-1080i TV.
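
For readers following the thread: an explicit 1080i modeline of the
kind mentioned above would look roughly like this in xorg.conf. This
is only a sketch, using the standard CEA-861 1080i60 timing (74.25 MHz
pixel clock, 2200x1125 total raster); as noted in the quote, the 8776
driver rejects such entries and only honours the mode reported in the
TV's EDID.

    Section "Monitor"
        Identifier "TV"
        # 1920x1080 interlaced, 60 fields/s (30 frames/s), CEA-861 timing.
        # Horizontal: active 1920, sync 2008-2052, total 2200 pixels.
        # Vertical:   active 1080, sync 1084-1094, total 1125 lines.
        ModeLine "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync
    EndSection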

That's interesting. What do you do about synchronisation of the
fields between source and display? Do you use a deinterlacer,
or does the 5200FX produce only one vsync per frame?
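
For context: when the hardware raises only one vsync per frame, a
common fallback is "bob" deinterlacing, which line-doubles each field
into its own progressive image so motion still updates at field rate.
A minimal sketch of the idea in C (hypothetical code, not MythTV's
actual deinterlacer; a single 8-bit luma plane is assumed for brevity):

    #include <stddef.h>
    #include <string.h>

    /* "Bob" deinterlacing: expand one field of an interlaced frame
     * (top_field = 1 selects lines 0,2,4,...; 0 selects 1,3,5,...)
     * into a full-height progressive frame by writing each field line
     * to both its own row and the neighbouring row of the other field.
     * Emitting the top field on one vsync and the bottom field on the
     * next preserves field-rate motion on a progressive output. */
    void bob_field(const unsigned char *src, unsigned char *dst,
                   size_t width, size_t height, int top_field)
    {
        for (size_t y = top_field ? 0 : 1; y < height; y += 2) {
            const unsigned char *line = src + y * width;
            size_t other = top_field ? y + 1 : y - 1;
            memcpy(dst + y * width, line, width);      /* own row   */
            memcpy(dst + other * width, line, width);  /* other row */
        }
    }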

Cheers,
	Paul.


