[mythtv-users] Myth scaling/deinterlacing vs Faroudja

John P Poet jppoet at gmail.com
Sat Jun 24 17:36:50 UTC 2006


On 6/24/06, Louie Ilievski <loudawg at comcast.net> wrote:
> > My LVM-37W1 (and unlike the LVM-37W3) has the same Genesis Faroudja
> > descaler that the LVM-42W2 does. Get thee hence to
> > <URL:http://www.gossamer-threads.com/lists/mythtv/users/203862#203862>
> > for details on my setup. In my experience, 59.94p+Bob+'OpenGL vertical
> > sync for timing' is slightly, but noticeably, better than
> > 59.94i+Kernel+No OpenGL.
>
> Wow, great write up.  I missed that post while searching the archives.  Out of
> curiosity, why would you use Kernel deint, or any deint for that matter,
> when outputting at 59.94i?  Isn't the idea to let the TV take care of things
> in that mode?

In a perfect world, yes.

Unfortunately, your video card gets in the way.  Very few video
cards can be fed an interlaced video stream.  When you output
interlaced, the video card is actually taking in a progressive
frame and splitting it into two fields to send to the TV.
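As an illustrative sketch (toy Python, not anything the card or MythTV actually runs), the splitting step amounts to pulling the even and odd scanlines of a progressive frame apart into two fields:

```python
# Illustrative sketch: a video card outputting "interlaced" takes a
# progressive frame and splits its scanlines into two fields.

def split_fields(frame):
    """frame is a list of scanlines; return (top_field, bottom_field)."""
    top = frame[0::2]     # even-numbered scanlines -> "top" field
    bottom = frame[1::2]  # odd-numbered scanlines  -> "bottom" field
    return top, bottom

# A toy 4-line "frame":
frame = ["line0", "line1", "line2", "line3"]
top, bottom = split_fields(frame)
print(top)     # ['line0', 'line2']
print(bottom)  # ['line1', 'line3']
```

Note that both fields come from the *same* progressive frame, which is why the TV never sees true interlaced source material this way.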

I do not know of any "modern" video cards which actually accept an
interlaced video stream as input.  I heard rumors that Matrox had one,
at one time.

Now, if I understand things correctly, you can trick the system.
Myth's Bob deinterlacer sends out data at double-rate --- first the
"top" field, then the "bottom" field.  So, if you set your modeline to
1920x540p (60Hz), most TVs will think they are receiving 1920x1080i
(30Hz), and deinterlace the two "fields" themselves.
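The trick can be sketched in toy Python (illustrative only, with made-up helper names): Bob emits one half-height field per 1/60 s, and a TV that believes it is seeing 1080i can weave consecutive field pairs back into full frames:

```python
# Illustrative sketch: Bob sends fields at double rate; a TV that
# thinks it is receiving 1080i weaves pairs of fields back together.

def bob_output(frames):
    """Yield fields at double rate: top field, then bottom field."""
    for frame in frames:
        yield frame[0::2]  # "top" field (even scanlines)
        yield frame[1::2]  # "bottom" field (odd scanlines)

def tv_weave(top, bottom):
    """Interleave two fields back into one full-height frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

frame = ["line0", "line1", "line2", "line3"]
fields = list(bob_output([frame]))          # two 2-line "fields"
rebuilt = tv_weave(fields[0], fields[1])
assert rebuilt == frame  # the weave round-trips losslessly
```

This is why the 1920x540p@60Hz modeline works: each 540-line "frame" the card sends is really one field, and the TV's deinterlacer reassembles the pairs as 1080i.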

John
