[mythtv-users] New deinterlacer for perfect image quality when using an interlaced display mode that matches the source

Paul Gardiner lists at glidos.net
Fri Mar 27 10:20:32 UTC 2009

Preston Crow wrote:
> On Thu, 2009-03-26 at 15:51 +0000, Paul Gardiner wrote:
>> Preston Crow wrote:
>>> I do 1080i with an nVidia 5200FX card.  That's AGP, not PCIe.  The only
>>> driver that works with this is version 8776, and it only works when
>>> connecting with DVI (using DVI->HDMI in my case) to a TV that puts out a
>>> valid EDID 1080i mode.  If you try to specify a 1080i modeline, it won't
>>> work.  I've recently patched it to work with more recent kernels, but it
>>> limits me to Xorg-1.3.
>>> The latest drivers have dropped support for my card, so I haven't been
>>> able to test them to see if they've fixed the problem, but I haven't
>>> heard anything encouraging.
>>> The lack of good video drivers for 1080i is the main thing holding me
>>> back from upgrading my MythTV system.  I like my native-1080i TV.
>> That's interesting. What do you do about synchronisation of the
>> interlaces between source and display? Do you use a deinterlacer,
>> or does the 5200FX produce only one vsync per frame?
> I don't use a deinterlacer, and it looks fine, so I'm guessing that
> there's only one vsync per frame.  I have an AMD 2500+ Athlon.  I
> normally use XvMC for HD, and that works pretty well.  Without XvMC, I
> can still do HD, but if anything else is running, it starts to hiccup.
> The 5200FX is fanless, but apparently shouldn't be; I've had two burn
> out, so I've bought replacements on eBay.
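
(As an aside, the sort of explicit 1080i modeline the 8776 driver
refuses is usually written as the standard SMPTE 274M timing with the
Interlace flag; a sketch, untested with that driver. With the nvidia
binary drivers you would normally also set
Option "ExactModeTimingsDVI" "TRUE" in the Device section to stop the
driver rounding to an EDID mode. 1080i50 uses the same timing but
with a 2640-pixel horizontal total.)

Section "Monitor"
    Identifier "TV"
    # Standard 1080i60 timing: 74.25 MHz pixel clock, 2200x1125 total.
    # The Interlace flag tells X that each vertical scan is one field.
    ModeLine "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125  +HSync +VSync Interlace
EndSection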

Interesting. It would be handy if the "one vsync per frame" behaviour
could be confirmed, although the only easy test I can think of is the
rough sketch below. I also wonder what the correct behaviour is
according to Xorg. All the combinations I've tried so far that give
interlaced output have produced one vsync per field.
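
Something along these lines might do it, assuming the driver exposes
the GLX_SGI_video_sync extension (a sketch, untested on the 5200FX;
the file name and sample count are arbitrary). On 1080i-50 you'd
expect roughly 25 retraces per second if there's one vsync per frame,
or 50 if it's one per field (30/60 respectively for 1080i-60):

/* vsync-rate.c - rough check of how many vertical retraces the X
 * server reports per second, via the GLX_SGI_video_sync extension.
 *
 * Build:  gcc vsync-rate.c -o vsync-rate -lGL -lX11
 */
#include <stdio.h>
#include <string.h>
#include <sys/time.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

typedef int (*GetVideoSyncSGI)(unsigned int *);
typedef int (*WaitVideoSyncSGI)(int, int, unsigned int *);

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    int scr = DefaultScreen(dpy);
    const char *exts = glXQueryExtensionsString(dpy, scr);
    if (!exts || !strstr(exts, "GLX_SGI_video_sync")) {
        fprintf(stderr, "GLX_SGI_video_sync not supported\n");
        return 1;
    }

    /* The extension needs a current, direct GLX context, so create a
     * tiny throwaway window using the GLX visual's own colormap. */
    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
    XVisualInfo *vi = glXChooseVisual(dpy, scr, attribs);
    if (!vi) { fprintf(stderr, "no suitable visual\n"); return 1; }
    XSetWindowAttributes swa;
    swa.colormap = XCreateColormap(dpy, RootWindow(dpy, scr),
                                   vi->visual, AllocNone);
    swa.border_pixel = 0;
    Window win = XCreateWindow(dpy, RootWindow(dpy, scr), 0, 0, 16, 16,
                               0, vi->depth, InputOutput, vi->visual,
                               CWColormap | CWBorderPixel, &swa);
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
    XMapWindow(dpy, win);
    XSync(dpy, False);
    glXMakeCurrent(dpy, win, ctx);

    GetVideoSyncSGI getSync = (GetVideoSyncSGI)
        glXGetProcAddressARB((const GLubyte *)"glXGetVideoSyncSGI");
    WaitVideoSyncSGI waitSync = (WaitVideoSyncSGI)
        glXGetProcAddressARB((const GLubyte *)"glXWaitVideoSyncSGI");

    unsigned int count;
    getSync(&count);

    struct timeval t0, t1;
    gettimeofday(&t0, NULL);
    int i;
    for (i = 0; i < 200; i++)   /* sleep until the next retrace, x200 */
        waitSync(2, (count + 1) % 2, &count);
    gettimeofday(&t1, NULL);

    double secs = (t1.tv_sec - t0.tv_sec)
                + (t1.tv_usec - t0.tv_usec) / 1e6;
    printf("200 retraces in %.2f s: %.1f per second\n",
           secs, 200.0 / secs);

    glXMakeCurrent(dpy, None, NULL);
    glXDestroyContext(dpy, ctx);
    XCloseDisplay(dpy);
    return 0;
}

Each glXWaitVideoSyncSGI call blocks until the retrace counter
changes parity, so one call corresponds to one retrace, whether the
driver counts frames or fields.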

