[mythtv-users] Stupid question - deinterlacing

Michael T. Dean mtdean at thirdcontact.com
Sun Feb 22 14:40:21 UTC 2009


On 02/22/2009 03:58 AM, Paul Gardiner wrote:
> Jean-Yves Avenard wrote:
>> 2009/2/22 Phil Wild:
>>> Perhaps this is a stupid question. I have a 1080p plasma panel, and
>>> when watching the OTA 1080i signal the picture looks really good. To
>>> me, that implies the TV has a good-quality deinterlacer built in. I
>>> don't know if it is possible, but could MythTV switch the graphics
>>> card's output mode to match the OTA stream and leave it to the TV to
>>> do the work of making the picture look as good as it can on the
>>> display?
>> I tried that.
>>
>> I set my Sony Bravia TV to 1080i and fed it a 1080i signal via MythTV
>> after turning off all deinterlacers; the result looked like crap.
>>
>> As if no deinterlacing had happened at all.
>>
>> Some people commented that there was an issue with nvidia cards not
>> outputting an interlaced signal properly ...
> There is a lot of evidence that nvidia cards get something wrong
> when outputting an interlaced signal. I've seen it with an
> FX5200 displaying 576i content via XV, although via OpenGL it
> seemed OK (though a different set of problems made that unusable).
>
> However, there are other possible causes of a bad picture. When you
> use interlaced output - trying to match it to the content being
> played - you need to ensure there is no processing of the picture.
> Any scaling will destroy the chance of getting the interlaced fields
> to the TV intact, so if you've adjusted the picture size to
> compensate for overscan, that will mess things up. Also, if the card
> implements some sort of TV deflicker filter, you need that turned
> off (see the sketch after this quote).
>
> And even when you've got it all perfect, the incoming fields can
> get sent to the TV in reverse order. You can cure that by pausing
> and restarting repeatedly until, by luck, you get the correct
> sync... oh, and that assumes your equipment has the grunt never to
> drop a frame. Frame dropping will also stop the fields getting to
> the TV synced up.
>
> Certainly for SD it's worth the effort though. You can get
> beautifully crisp images with very little motion blur by using the
> TV's deinterlacer. 
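On the deflicker point: on the TV-out generation of NVIDIA drivers, the
flicker filter and overscan compensation are exposed as NV-CONTROL
attributes, so, assuming your driver and output actually expose them
(check with "nvidia-settings -q all" first), something like this should
switch them off:

    # assumed attribute names; only present on TV-out capable setups
    nvidia-settings -a TVFlickerFilter=0   # no deflicker processing
    nvidia-settings -a TVOverScan=0        # no overscan rescaling

That still leaves the problem of the driver not sending a genuine
interlaced signal.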

So, use a 1920x540 modeline (one that X/the NVIDIA driver thinks is 
progressive), and it will render properly and send the signal to the TV, 
which will see it as 1080i--assuming the input allows, as mentioned by 
Yeechang.
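
As a rough sketch (the timings below are just the standard SMPTE 1080i
numbers with the vertical values halved, so treat them as a starting
point rather than gospel), the Monitor section of xorg.conf would get
something like:

    # 1080i pixel clock and horizontal timings, vertical timings
    # halved, and no "Interlace" flag, so X treats this as a
    # progressive 540-line mode at ~60 Hz.
    ModeLine "1920x540" 74.25  1920 2008 2052 2200  540 542 547 562 +hsync +vsync

You may also need to add "1920x540" to the Modes line of your Screen
section, and if the driver rejects the mode against the EDID, relax
validation with something like:

    Option "ModeValidation" "NoEdidModes, NoVertRefreshCheck"

Each rendered 540-line frame then lands where one 1080i field would,
and the TV's deinterlacer does the rest.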

Mike

