[mythtv-users] Enlighten me about 1080 "interlace"

Brad Templeton brad+myth at templetons.com
Tue Dec 21 06:50:54 UTC 2004


On Tue, Dec 21, 2004 at 03:45:42PM +1100, Phill Edwards wrote:
> > I thought I understood this interlace stuff but Myth
> > is challenging my concept of interlace vs.
> > non-interlace video.
> > 
> > I have a TV capable of displaying 1920x1080
> > interlaced.  It's a CRT based RPTV from Pioneer, if
> > you care.  I have watched terrestrial broadcasts on
> > HDTV and it looks breathtaking.  The image consists of
> > two alternating 540-line fields, each 1/60 second,
> > displayed at even and odd scan lines for 1080 unique
> > lines of information.
> > 
> > Now I get my Myth box running and it looks very, very
> > good.  But not quite as good as my TV with a direct
> > OTA receiver.  Why not?  I've set everything up as
> > carefully as possible.  I'm on stock 0.16 Myth, I have
> > an Intel 3GHz processor running an nVidia 5500 card
> > with Xv.  The modeline is correct and the card is
> > doing interlaced out, as evidenced by the flickering
> > on 1-pixel lines with an xterm on screen.
> 
> I'm no expert on this at all, but I had always assumed the picture
> will never be as good on MythTV as with OTA direct to your TV, since
> the video has to be encoded and decoded. Inevitably some of the
> original picture quality will be lost in that process, I would have
> thought. If it's _almost_ as good, perhaps that's all you can hope for.

No.  In theory there is no reason why Myth's decoder would not be as
good as (or better than) the one in the TV.  If you have DVI, the
results should be identical.

In practice, I am not sure Myth's decoder is as good (I see more MPEG
artifacts).   And your modeline may never be exactly the one the TV
uses internally.   Inside, the TV is just writing directly into its
own frame buffer for the display.    In theory, you could get your DVI
stream to also just be fed into that frame buffer, if you did it
exactly right.
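
For reference, the standard SMPTE timing for 1080i at 60 fields/sec
uses a 74.25 MHz pixel clock with 2200 total pixels per line and 1125
total lines, which as an XFree86 modeline is usually written:

    ModeLine "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync

(That is only a starting point; CRT-based sets in particular often
want the sync and porch numbers nudged to center the picture, which
is exactly the mismatch described above.)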

I'm surprised the TV doesn't figure this out for you with DVI, saying,
"Hmmm, here come 1920 pixels, why don't I do with them just what I
would do with the stuff I decompress myself."  Perhaps some TVs do.

What would give you identical results would be streaming MPEG out over
FireWire, or, if an RF modulator were available, streaming out ATSC or
QAM to a TV ready to receive that.   I think it would be cool if the
pcHDTV card came with an ATSC RF modulator on it as well as a
demodulator.   Like the very first VCRs, sending out their signal on
channel 3, but this time doing _better_ than component video.

Though you would have to write an MPEG-streaming X driver, no small feat.
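
To make the pass-through idea concrete: what matters is that the bits
recorded off the air reach the TV's decoder untouched, with no decode
and no re-encode along the way.  A rough sketch in C, assuming a
made-up /dev/atscmod device node for such an imagined modulator card:

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    /* Sketch only: /dev/atscmod is a hypothetical device node for an
     * imagined pcHDTV-style card with an ATSC modulator on board.
     * The recorded transport stream is shuttled through untouched,
     * so the TV's own decoder sees exactly what the antenna saw. */
    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s recording.ts\n", argv[0]);
            return 1;
        }
        int in = open(argv[1], O_RDONLY);
        int out = open("/dev/atscmod", O_WRONLY);
        if (in < 0 || out < 0) {
            perror("open");
            return 1;
        }
        char buf[188 * 64];   /* MPEG transport packets are 188 bytes */
        ssize_t n;
        while ((n = read(in, buf, sizeof buf)) > 0) {
            if (write(out, buf, n) != n) {
                perror("write");
                return 1;
            }
        }
        close(in);
        close(out);
        return 0;
    }

A real card would of course need ioctls to pick an output channel and
real-time pacing of the writes; the point is only that nothing in the
chain ever decompresses the video.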

