[mythtv-users] Myth scaling/deinterlacing vs Faroudja

Louie Ilievski loudawg at comcast.net
Sat Jun 24 18:05:47 UTC 2006


> In a perfect world, yes.
>
> Unfortunately, your video card gets in the way.  There are very few
> video cards that can be fed an interlaced video stream.  When you
> output interlaced, the video card is actually taking in a progressive
> frame and splitting it into two fields to send to the TV.
>
> I do not know of any "modern" video cards which actually accept an
> interlaced video stream as input.  I heard rumors that Matrox had one,
> at one time.
>
> Now, if I understand things correctly, you can trick the system.
> Myth's BoB deinterlacer sends out data at double rate --- first the
> "top" field, then the "bottom" field.  So, if you set your modeline to
> 1920x540p (60Hz), most TVs will think they are receiving 1920x1080i
> (30Hz), and deinterlace the two "fields".
>
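
(For reference, the modeline I've seen posted for that 1920x540 trick is
something like this; untested here, and it assumes the standard 74.25 MHz
1080i pixel clock, so double-check it against your set's timing:

    ModeLine "1920x540" 74.25 1920 2008 2052 2200 540 542 548 562 +hsync +vsync

That works out to a 33.75 kHz horizontal rate and roughly 60 Hz vertical,
which is why the TV takes the two 540-line "frames" as 1080i fields.)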

I was under the impression, after reading countless posts on this stuff, that 
this is the case when using the TV-out port of the video card, such as 
S-Video.  However, I thought that when using VGA or DVI, interlaced output is 
not an issue.
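
If that's right, then with DVI you could skip the 540p trick and just feed
the card a true interlaced modeline.  The standard CEA 1080i timing would be
something like this (again, a sketch I haven't verified on my own setup):

    ModeLine "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 +hsync +vsync Interlace

where the "Interlace" flag tells X to alternate fields instead of sending
whole progressive frames.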

