[mythtv-users] New deinterlacer for perfect image quality when using an interlaced display mode that matches the source

Paul Gardiner lists at glidos.net
Thu Mar 26 14:12:23 UTC 2009

Tom Lichti wrote:
> On Thu, Mar 26, 2009 at 8:26 AM, Paul Gardiner <lists at glidos.net> wrote:
>> I've just created a patch that provides a deinterlacer for perfect image
>> quality and motion smoothness when using an interlaced display mode that
>> exactly matches the video source. See Ticket #6391. The deinterlacer comes
>> up in the menus as "Interlaced x2". It is also known as field order.
>> Preconditions of use
>> ~~~~~~~~~~~~~~~~~~~~
>> You need a graphics chip that can output interlaced display modes correctly.
> Any idea what those might be? Very interested though.

I think I've worded that condition in a confusing way. All should be
well if you are using TV out.

I've been using the VGA to Scart trick, which requires interlaced TV
timings from VGA. Finding chip/driver combinations that can handle
that is harder. Many ATI chips now work if you use the radeon driver.
And development of the radeon driver is very active. I've had a Radeon
9000 work. I'm currently using an integrated X1250. I've heard that
an integrated HD3200 will work.
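For reference, the VGA-to-SCART trick depends on feeding the TV an interlaced PAL mode from the VGA port, which on X means an interlaced modeline. The one below is only an illustrative sketch (standard 720x576 PAL timings; the mode name and the Monitor section identifier are arbitrary, and the exact values you need may differ with your TV and chip):

```
Section "Monitor"
    Identifier "TV"
    # 720x576 interlaced PAL: 13.5 MHz pixel clock,
    # 864 x 625 total -> 25 frames/s, i.e. 50 fields/s
    ModeLine "720x576_50i" 13.500 720 732 795 864 576 580 586 625 -hsync -vsync Interlace
EndSection
```

The "Interlace" flag at the end is what asks the driver for an interlaced scan; this is the part many chip/driver combinations get wrong on the VGA output.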

I don't know about 1080i. I don't have HD capability at the moment.

There's been a lot of talk of nVidia's drivers not supporting interlaced
output correctly, but I don't know if that applies to just VGA, or
also TV out. 1080i via HDMI might be fine. Don't really know.

