[mythtv-users] Enlighten me about 1080 "interlace"

Joe Barnhart joebarnhart at yahoo.com
Mon Dec 20 18:17:02 UTC 2004

I thought I understood this interlace stuff, but Myth
is challenging my concept of interlaced vs.
non-interlaced video.

I have a TV capable of displaying 1920x1080
interlaced.  It's a CRT based RPTV from Pioneer, if
you care.  I have watched terrestrial broadcasts on
HDTV and it looks breathtaking.  The image consists of
two alternating 540-line fields, each lasting 1/60
second, drawn on the even and odd scan lines for 1080
unique lines of information.
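To make the interleave concrete, here's a minimal sketch (not anything from Myth itself, just an illustration with hypothetical names) of how two 540-line fields weave together into one 1080-line picture:

```python
# Sketch: two half-height fields interleave into one interlaced frame.
# Field "even" carries scan lines 0, 2, 4, ...; field "odd" carries
# 1, 3, 5, ...  Together they cover all 1080 unique lines.

def weave_fields(even_field, odd_field):
    """Interleave two half-height fields into a full frame of lines."""
    frame = [None] * (len(even_field) + len(odd_field))
    frame[0::2] = even_field   # even scan lines: 0, 2, 4, ...
    frame[1::2] = odd_field    # odd scan lines: 1, 3, 5, ...
    return frame

even = [f"even-{i}" for i in range(540)]  # field shown at t = 0/60 s
odd = [f"odd-{i}" for i in range(540)]    # field shown at t = 1/60 s

frame = weave_fields(even, odd)
print(len(frame))          # 1080 unique lines
print(frame[0], frame[1])  # even-0 odd-0
```

The key point: the two fields are captured 1/60 second apart, so they are two distinct moments in time, not two halves of one snapshot.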

Now I get my Myth box running and it looks very, very
good.  But not quite as good as my TV with a direct
OTA receiver.  Why not?  I've set everything up as
carefully as possible.  I'm on stock 0.16 Myth, I have
an Intel 3GHz processor running an nVidia 5500 card
with Xv.  The modeline is correct and the card is
doing interlaced out, as evidenced by the flickering
on 1-pixel lines with an xterm on screen.
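For reference, this is the commonly cited 1920x1080 interlaced modeline based on the standard CEA-861 timing (74.25 MHz pixel clock, 2200x1125 total raster); your exact values may differ depending on your display:

```
ModeLine "1920x1080i" 74.250 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync
```

The "Interlace" flag is what makes the card emit alternating fields rather than a full progressive raster.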

Some people on the list whose advice I trust very much
have suggested that 540p and 1080i output are
_identical_ as played by Myth.  This certainly seems
to be true.  I have played with the "deinterlace"
setting, and the "bob and weave" deinterlace (which I
understand duplicates one of the fields and tosses the
other) looks the same as interlaced on my set.

Is this an inherent limitation of Myth?  Is there no
way to recover and show both fields of the 1080i
signal in Myth, or am I missing a key setting?  It
bothers me that my set could be showing me twice the
picture I'm getting now!

