[mythtv-users] Enlighten me about 1080 "interlace"

Joe Barnhart joebarnhart at yahoo.com
Tue Dec 21 18:01:12 UTC 2004

--- Doug Larrick <doug at ties.org> wrote:

Thanks for the informative reply.  Now I feel like I'm
getting somewhere.

> There is no "bob and weave" deinterlace in MythTV. 
> The relevant 
> algorithms are "onefield," which behaves as you say
> -- tosses half the 
> fields; and "bob," which shows each field
> sequentially (one field each 
> 1/60 sec.).
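The difference between the two deinterlacers described above can be sketched in a few lines of Python (a toy illustration, not MythTV's actual implementation). A frame is modeled as a list of scan lines; even lines form the top field, odd lines the bottom field:

```python
# Toy sketch of the two MythTV deinterlacers described above
# (illustrative only -- not MythTV's real code).

def onefield(frame):
    """Keep only the top field; the other half of the fields is tossed."""
    return [frame[i] for i in range(0, len(frame), 2)]

def bob(frame):
    """Split the frame into its two fields and show them sequentially,
    one field every 1/60 sec, each at half the frame's vertical resolution."""
    top = [frame[i] for i in range(0, len(frame), 2)]
    bottom = [frame[i] for i in range(1, len(frame), 2)]
    return [top, bottom]  # displayed back-to-back

# A 4-line toy frame standing in for a 1080-line interlaced frame:
frame = ["line0", "line1", "line2", "line3"]
print(onefield(frame))  # ['line0', 'line2']
print(bob(frame))       # [['line0', 'line2'], ['line1', 'line3']]
```

So onefield yields one half-height image per frame, while bob yields two half-height images per frame, shown 1/60 sec apart.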

Yes, "bob" is what I interpreted as "bob and weave".
So it displays 540p by unraveling the interlaced
frames and showing each field for 1/60 sec.  That's why
I appear to lose half of the resolution of the set.

> It's worth noting once again that if you're using
> XvMC, none of these 
> algorithms are used.

No XvMC here.  I learned my lesson!

> When displayed in a 540p frame, bob deinterlacing
> should look identical 
> to native 1080i, at least on a CRT with relatively
> long-persistence 
> phosphor.

This would only be true if the frame were displaced
one scan line to "fill in" the resolution instead of
overwriting it.  From what I see, the displacement
does not occur, which makes diagonal lines more jaggy
than they should be.
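The displacement in question can be made concrete with a small sketch (again illustrative, not how MythTV or the driver actually renders). Each field's lines belong on alternating rows of the full raster; rendering both fields at the same vertical position overwrites that detail instead of filling it in:

```python
# Hedged sketch of the field displacement described above
# (illustrative only). Top-field lines belong on frame rows
# 0, 2, 4, ... and bottom-field lines on rows 1, 3, 5, ...

def field_to_frame_rows(field_lines, parity):
    """Map a field's lines to the frame rows they occupy.
    parity 0 = top field, parity 1 = bottom field."""
    return {row * 2 + parity: line for row, line in enumerate(field_lines)}

top = field_to_frame_rows(["T0", "T1"], parity=0)
bottom = field_to_frame_rows(["B0", "B1"], parity=1)

# With the one-line displacement applied, interleaving the two fields
# reconstructs the full-resolution raster:
merged = {**top, **bottom}
print([merged[r] for r in sorted(merged)])  # ['T0', 'B0', 'T1', 'B1']
```

Without the per-field offset (both fields mapped with the same parity), the two fields land on the same rows and one simply overwrites the other, which is consistent with the jaggy diagonals observed.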

> I believe that on some sets, native 1080i output
> would look better than 
> the fake 540p that we use.  However, there appears
> to be a bug in 
> nVidia's drivers, where *video* is not displayed at
> the full resolution, 
> but instead undergoes some sort of deinterlacing. 
> Pause some 
> horizontally-panning scene when in 1080i and you
> should see flicker, but 
> you don't.  Normal GUI stuff is fine; it's the Xv
> overlay that's screwed up.

You are exactly correct -- pausing should let me see
the interlaced frames flicker and that does not
happen.  Has anyone mentioned this to nVidia?  Is it
their driver or do you suppose some weakness in
XFree86?  (If this is nVidia's fault, I could walk
over to their building some night and spray-paint
helpful suggestions on their wall.  It's only about a
mile from where I live.) (Just kidding, nVidia.)

> In conclusion, I think that for a 1080i-native set,
> 1920x540 with bob 
> deinterlacing is the best you'll get out of Myth
> right now.

Yes, and while it does look *good* it doesn't quite
measure up to what a hi-def receiver puts out.  And
that's what has me looking to improve things.

Thanks for your cogent answer and your insight.

