[mythtv-users] Stupid question - deinterlacing

Paul Gardiner lists at glidos.net
Mon Feb 23 10:01:34 UTC 2009


Michael T. Dean wrote:
> On 02/22/2009 10:51 AM, Paul Gardiner wrote:
>> Michael T. Dean wrote:
>>> So, use a 1920x540 modeline (that X/NVIDIA drivers think is 
>>> progressive), and it will render properly and send the signal to the 
>>> TV, which it will see as 1080i--assuming the input allows, as 
>>> mentioned by Yeechang.
>> Oh wow, I didn't pick that up before. Does that really work? So
>> presumably you use Bob(x2) to split the interlaces from the 1080i
>> source, which will turn the two fields into 1920x1080 frames.
>> Then scaling takes this to a pair of 1920x540 frames, and these
>> get sent one after another, interpreted by the TV as a single
>> 1080i frame. So with this trick you don't even run into the
>> synchronisation problem? Oh hang on, yes you do: the interlaces
>> could become spatially reversed... no worse than
>> other methods though.
>>
>> But I thought with an interlaced signal one interlace starts
>> horizontally half way across the screen... or is that the
>> case only for Standard Definition? Can this trick be used
>> for 576i and 480i? 
> 
> I'm thinking you play back the video without deinterlacing to the 
> half-height progressive display and all works well.  With a full-height 
> display, it would look like it needs deinterlacing, but with the 
> half-height, you just get each field in turn and then your TV deinterlaces.

I don't think playing it back without deinterlacing could work like
that. If you just play interlaced video to a half-height display,
you'll get scaling, which will merge the two fields of each frame,
losing vertical resolution and causing motion blur... unless there's
something in the system that recognises the setup and special-cases it.
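
To show what I mean, here's a toy sketch (numpy, purely my own
illustration -- not what MythTV's scaler actually does). Averaging
adjacent lines of an interlaced frame mixes two fields captured at
different times, whereas pulling each field out separately keeps
them clean:

import numpy as np

# Toy "interlaced" frame, 8 lines tall: even lines come from field A
# (time t), odd lines from field B (time t + half a frame). A moving
# edge makes the two fields differ.
h, w = 8, 6
field_a = np.zeros((h // 2, w)); field_a[:, :2] = 1.0  # left 2 columns lit
field_b = np.zeros((h // 2, w)); field_b[:, :3] = 1.0  # edge moved right
frame = np.empty((h, w))
frame[0::2] = field_a  # weave the two fields into one frame
frame[1::2] = field_b

# Naive half-height scaling: average each pair of adjacent lines.
# Every output line now mixes two moments in time -> blur on motion.
naive = frame.reshape(h // 2, 2, w).mean(axis=1)

# Field-aware alternative: take every other line. Each half-height
# image is a clean snapshot of one field, which the TV can reassemble.
clean_a, clean_b = frame[0::2], frame[1::2]

print(naive[0])    # [1. 1. 0.5 0. 0. 0.] -- moving edge smeared to 0.5
print(clean_a[0])  # [1. 1. 0. 0. 0. 0.]  -- field A, crisp
print(clean_b[0])  # [1. 1. 1. 0. 0. 0.]  -- field B, crisp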

P.
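
P.S. For concreteness, I take it the modeline being described is the
standard 1080i timing with the vertical figures halved, so that X
treats it as 540p. The numbers below are my own back-of-envelope guess
(1125/2 = 562.5, so the vertical total has to be rounded) and are
untested; the section and Identifier are just illustrative:

Section "Monitor"
    Identifier "TV"
    # 1080i horizontal timing unchanged; vertical timing halved and
    # rounded; note there is deliberately no "Interlace" flag.
    ModeLine "1920x540" 74.25  1920 2008 2052 2200  540 542 547 562  +HSync +VSync
EndSection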


