[mythtv-users] Myth scaling/deinterlacing vs Faroudja

Yeechang Lee ylee at pobox.com
Sat Jun 24 21:29:26 UTC 2006


Louie Ilievski <loudawg at comcast.net> says:
> Out of curiosity, why would you use Kernel deint, or any deint for
> that matter, when outputting at 59.94i?  Isn't the idea to let the
> TV take care of things in that mode?

In theory, yes, but in practice I found that Kernel+No OpenGL produced
the best video output under 59.94i, second only to the settings I
actually use (under 59.94p).
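
As an aside, for anyone wondering what the Kernel deinterlacer
actually does: roughly speaking, it rebuilds the missing field lines
by running a small vertical filter over the neighboring kept lines,
rather than just averaging two of them. Here's a back-of-the-envelope
Python/NumPy sketch of that general idea; the tap weights, edge
handling, and function name are my own illustration, not MythTV's
actual kerneldeint code:

    import numpy as np

    def kernel_deinterlace(frame, keep_top_field=True):
        # frame: 2-D luma array (height x width), one interlaced frame
        # of 8-bit samples.  Keeps one field and rebuilds each missing
        # line from the four nearest kept lines with a (-1, 9, 9, -1)/16
        # vertical filter -- sharper than a plain two-line average.
        out = frame.astype(np.float32)
        h = frame.shape[0]
        start = 1 if keep_top_field else 0   # parity of the missing lines
        for y in range(start, h, 2):
            # offsets -3,-1,+1,+3 land on kept lines (opposite parity),
            # clamped at the top and bottom edges of the frame
            taps = [min(max(y + d, 0), h - 1) for d in (-3, -1, 1, 3)]
            out[y] = (-out[taps[0]] + 9.0 * out[taps[1]]
                      + 9.0 * out[taps[2]] - out[taps[3]]) / 16.0
        return np.clip(out, 0, 255).astype(frame.dtype)

    # e.g., one 1080i frame of 8-bit luma:
    # frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
    # progressive = kernel_deinterlace(frame)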

Now here comes a small, but potentially significant, caveat that I
should've included originally. My MythTV box's DVI cable is plugged
into DVI1 on the panel. I recall reading somewhere that DVI1 on the
LVM-37W1 is designed to pass video through without deinterlacing,
even when it's fed interlaced video. It's possible, then, that I've
never actually put the panel's deinterlacing circuitry to the test.

Since the couch I'm sitting on is far too comfortable and I don't feel
like getting up to rearrange my setup, I hereby dub thee the Duke of
Deinterlacing and bid thee, when it arrives, to try out the various
DVI inputs on the LVM-42W2 and see if there are any differences.

PS - Do you actually have some true-HD video content to test the panel
with? Have you already equipped yourself with ATSC cards or HD cable
boxes?

-- 
Yeechang Lee <ylee at pobox.com> | +1 650 776 7763 | San Francisco CA US
