[mythtv-users] What do I have to do to get HD working?

John P Poet jppoet at gmail.com
Sun Apr 16 02:16:24 UTC 2006


On 4/15/06, Mark Lehrer <mark at knm.org> wrote:
> On Sat, Apr 15, 2006 at 06:48:51PM -0600, John P Poet wrote:
>
> > With VLC there is a bit of stutter during the close-up of Agassi about
> > 11-12 seconds into the video clip.  Tried it on my AMD 4200 X2 and the
> stutter *might* be slightly less noticeable on that machine, but I am
> > not willing to bet on it.
>
> Cool, which deinterlacing types did you try?  Was the Bob on VLC the
> same quality as Bob on Myth (and therefore clearly superior to Blend
> on VLC)?

BOB on VLC looked about the same as BOB on Myth, but that does not
mean much.  My display is a 1920x1080P (60Hz) panel.  With BOB, if you
look closely, the image appears to jump up and down.

Because of that jump, kerneldeint or the blend deinterlacers look
better in general.  About the only place BOB wins is with horizontally
scrolling text.  VLC's blend looks better than Myth's kerneldeint.
In other words, BOB aside, VLC's deinterlacers are superior to Myth's
when viewing your tennis clip.

My old TV was a 1920x1080i (30Hz) display.  I drove it with a
1920x540P (60Hz) modeline and used Myth's (Doug's) BOB deinterlacer.
That combination looked fantastic because the TV effectively thought
it was being sent a 1080i signal, and its built-in deinterlacer kicked
in and did an excellent job of stitching the even/odd fields back
together.
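A 1920x540P (60Hz) modeline is roughly this shape -- a sketch only,
since the exact numbers have to be tuned to whatever the set will
actually sync to:

  # 1080i pixel clock and horizontal timing, half the vertical lines
  Modeline "1920x540" 74.25  1920 2008 2052 2200  540 542 547 562 +hsync +vsync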

I admit I wish a better deinterlacer were available for Myth on
fixed-pixel displays.  My Samsung 1080P DLP is supposed to have a very
good deinterlacer built in, but I don't know how to feed it the
original fields so that the TV can take care of the deinterlacing
itself -- at least via the VGA input.

nVidia has supposedly fixed their interlaced output in the latest
drivers.  I am unclear, however, on what actually happens along that
path.  If I set X to a 1920x1080i output mode and tell Myth not to
deinterlace, do the fields in the MPEG file get sent to the TV
unaltered, so that the TV's deinterlacer can "do the right thing"?
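For what it's worth, an interlaced mode is declared in xorg.conf with
the Interlace flag, something like the standard 1080i timing below --
whether the nVidia driver actually honors it is exactly what I am
unsure about:

  Modeline "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125 interlace +hsync +vsync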

If Myth is told not to deinterlace the video, but I am driving my
display at 1080P, then I would expect a "split window" effect on the
TV: the image from the first field on the top half of the display, and
the image from the second field on the bottom half.  This does not
happen --- why?  Since it does not happen, I have to assume that the
fields are getting "blended" together somehow, even when deinterlacing
is turned off within Myth...

I use TomsMoComp with Xine.  It "passes" most test DVDs I have, but
still fails the horizontally scrolling text test.  It also fails the
"stadium seating" test -- it does not compensate for the cadence --
but you could argue that that is not the job of a deinterlacing
algorithm.
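In case anyone wants to reproduce this: TomsMoComp is selected through
xine's tvtime post plugin, which should look something like the line
below (the parameter spelling depends on the xine build, so treat it
as a sketch):

  xine --post tvtime:method=TomsMoComp dvd:/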

This whole discussion has switched gears.  We are not really talking
about stuttering anymore, but about deinterlacing algorithms and their
impact on the CPU.

John

