[mythtv-users] Stupid question - deinterlacing

Paul Gardiner lists at glidos.net
Sun Feb 22 15:51:24 UTC 2009

Michael T. Dean wrote:
> So, use a 1920x540 modeline (that X/NVIDIA drivers think is 
> progressive), and it will render properly and send the signal to the TV, 
> which it will see as 1080i--assuming the input allows, as mentioned by 
> Yeechang.

Oh wow, I didn't pick that up before. Does that really work? So
presumably you use Bob(x2) to split the fields from the 1080i
source, which turns the two fields into 1920x1080 frames.
Scaling then takes these to a pair of 1920x540 frames, and these
get sent one after another, interpreted by the TV as a single
1080i frame. So with this trick you don't even run into the
synchronisation problem? Oh, hang on, yes you do: the fields
could become spatially reversed... though that's no worse than
the other methods.
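To convince myself of the round trip described above, here's a minimal
NumPy sketch (the function names are mine, not MythTV's, and it uses
simple line-doubling for bob and line-dropping for the downscale, where
real scalers would filter):

```python
import numpy as np

def split_fields(frame):
    """Split an interlaced 1080-line frame into its two 540-line fields."""
    return frame[0::2], frame[1::2]

def bob_x2(frame):
    """Bob deinterlace: line-double each field to a full 1080-line frame."""
    top, bottom = split_fields(frame)
    return np.repeat(top, 2, axis=0), np.repeat(bottom, 2, axis=0)

def to_540(frame_1080):
    """Scale a 1080-line progressive frame down to 540 lines
    (naive line-dropping stands in for a proper scaler)."""
    return frame_1080[0::2]

# Tiny stand-in for a 1920x1080 frame (4 pixels wide to keep it small).
frame = np.arange(1080 * 4).reshape(1080, 4)
top, bottom = split_fields(frame)
bobbed_top, bobbed_bottom = bob_x2(frame)
out1, out2 = to_540(bobbed_top), to_540(bobbed_bottom)
```

With these simplistic filters, each 540-line output frame is exactly the
original field (out1 == top, out2 == bottom), which is why the TV can
reassemble them as 1080i. The spatial-reversal worry is then just the
TV pairing out2 with the following frame's out1, swapping field parity.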

But I thought that with an interlaced signal one field starts
horizontally halfway across the screen... or is that the case
only for standard definition? Can this trick be used for 576i
and 480i?
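For reference, the sort of 1920x540 modeline Michael describes might
look like the line below. The timing numbers are purely illustrative
(roughly a standard 1080i mode with the vertical timing halved), not
values I've tested; anyone trying this would need timings their own
TV accepts:

    Modeline "1920x540" 74.25  1920 2008 2052 2200  540 542 547 562 +hsync +vsync

That gives 74.25 MHz / (2200 x 562) =~ 60 frames per second out of X,
which the TV would see as 60 fields per second of 1080i.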


More information about the mythtv-users mailing list