[mythtv-users] DeInterlacing && DVI

Paul Gardiner lists at glidos.net
Fri Aug 7 17:08:58 UTC 2009


Jarod Wilson wrote:
> On Aug 7, 2009, at 8:49 AM, Tom Dexter wrote:
> 
>> On Thu, Aug 6, 2009 at 1:26 PM, Goga <goga777 at front.ru> wrote:
>>>>> It is worth mentioning however (for anyone who hasn't followed the
>>>>> many discussions of the topic) that the above (sending
>>>>> non-deinterlaced video to an interlaced display) simply does not work
>>>>> correctly with any modern nVidia drivers and causes severe tearing.
>>>>
>>>> I think someone recently made an "Interlaced" deinterlacer for mythtv
>>>> that, ironically, doesn't deinterlace but makes sure the fields are in
>>>> order so that the video displays properly on an interlaced display. I
>>>> think there were some caveats like no scaling, etc. I believe that
>>>> VDPAU still looked better for many people, but for those who have
>>>> high quality deinterlacers in their TVs, sending things interlaced
>>>> was better.
>>>
>>> but how do I tune an Nvidia GeForce 8400 card and xorg to send a
>>> proper 1080i signal to the TV?
>>> I couldn't get 1080i working properly, which is why I'm using 1080p
>>> with the VDPAU bob deinterlacer.
>>>
>>> Goga
>>
>> With my GeForce 7100GS, I simply have the "Modes" in my "Screen"
>> section set to use the built-in "1920x1080" mode and it works fine.
>> I'm using the 180.60 driver, but it's been countless driver releases
>> since I last needed a custom mode line.
> 
> You need a mode specified at all? :)
> 
> /me has no mode specified at all, it Just Works at 1920x1080p...

I think the question was about 1080i.
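
For what it's worth, getting 1080i out of the nvidia driver usually comes
down to giving X an explicit interlaced modeline and selecting it in the
"Screen" section. The sketch below is untested on an 8400: the modeline is
just the standard CEA 1080i at 60Hz timing, and the "ExactModeTimingsDVI"
option and the identifiers are assumptions you'd need to adapt to your own
xorg.conf and driver version.

Section "Monitor"
    Identifier "TV"
    # Standard 1080i @ 60Hz timing: 74.25 MHz pixel clock, interlaced
    ModeLine "1920x1080_60i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 +HSync +VSync Interlace
EndSection

Section "Device"
    Identifier "nvidia0"
    Driver     "nvidia"
    # Assumption: ask the driver to honour the DVI modeline timings as given
    Option     "ExactModeTimingsDVI" "True"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "nvidia0"
    Monitor    "TV"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1920x1080_60i"
    EndSubSection
EndSection

Even with a valid 1080i mode, note the point earlier in the thread that
modern nVidia drivers tend to tear badly when fed non-deinterlaced video,
which is presumably why people fall back to 1080p plus a deinterlacer.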


