[mythtv-users] HDTV Output/Deinterlacing

Florin Andrei florin at andrei.myip.org
Wed Dec 15 22:11:37 UTC 2004


On Tue, 2004-12-14 at 22:37 -0500, Dave wrote:

>    I've seen a lot of posts from people telling others to turn on the
> deinterlacer, or that playback is better with the deinterlacer on. 
> Given that the modelines I've seen are set to output 720p or 1080i,
> what are we deinterlacing for?
>    Assuming that the TV supports inputs of 720p and 1080i, can't the
> original format just be passed straight to the TV?

Being a newcomer to HDTV myself, I was puzzled by this issue too. I even
asked similar questions on the list, but nothing came back.
After some digging, reading documentation, and asking on HDTV forums, I
think I've started to "see the light".

Although most HDTV sets do accept many different HDTV modes on their
inputs, they typically have only one native format that is actually used
to drive the display. Everything else is converted internally.
Yes, I know it sucks.

Usually, CRT-based sets (including rear-projection CRTs) work in 1080i
natively.
Pixel-based sets (plasma, LCD, DLP), whether they use projection or not,
typically work in 720p natively (at the screen level).
Some very rare and expensive CRTs support both 1080i and 720p natively
(it's unlikely that yours is one of them).
There are exceptions, but that's what the majority does.

So, yes, the set seems to accept many different formats, but it will
typically convert all of them to a single format (either 1080i or 720p)
and always use that format to drive the actual display.

See here for more info:

http://www.hdtvoice.com/voice/forumdisplay.php?f=384

Also read the other forums on the same site when you have time (it's a
very competent community):

http://www.hdtvoice.com/voice/index.php

Therefore, it seems to me that the most logical decision is to configure
your NVidia/ATI card to use the mode that is natively supported by your
HDTV screen. Triple-check and verify with the vendor, etc. so that you
know precisely what the native mode of your HD screen really is (it can
be deceiving). Then set that mode on your NVidia/ATI card and forget
about it.
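
For example, if the native mode turns out to be 720p, the relevant part
of the X config might look roughly like the sketch below. This is only a
sketch, not a known-good setup: the modeline uses the standard 720p60
timings (74.25 MHz pixel clock), but the section/identifier names and
sync ranges are placeholders I made up, and you must verify everything
against your TV's manual and your driver's documentation.

  # Sketch of an XF86Config/xorg.conf fragment for native 720p output.
  # Identifier names are hypothetical; sync ranges are assumptions,
  # check them against the set's specs before using this.
  Section "Monitor"
      Identifier  "HDTV"
      HorizSync   30-70        # assumption; verify with the vendor
      VertRefresh 59-61        # assumption; verify with the vendor
      # 1280x720 @ 60 Hz progressive, standard 74.25 MHz timings
      ModeLine "1280x720" 74.25 1280 1390 1430 1650 720 725 730 750 +HSync +VSync
  EndSection

  Section "Screen"
      Identifier   "Screen0"
      Device       "Videocard0"   # whatever your nvidia/radeon Device section is called
      Monitor      "HDTV"
      DefaultDepth 24
      SubSection "Display"
          Depth 24
          Modes "1280x720"
      EndSubSection
  EndSection

After restarting X, check the server log to make sure the mode was
actually accepted and not silently replaced with something else.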

Not using your HDTV set's native mode on the MythTV output can lead to
situations where the image is converted twice (the TV station sends the
signal in your display's native mode, MythTV converts it to the other
mode, then the HDTV set converts it back to the native mode), which can
degrade the quality quite a bit.
If you do use the native mode, then in the worst case there's one
conversion on the MythTV box; in the best case there's none.

If you don't own an HDTV set, it seems to me that it's simpler to buy a
pixel-based one, then configure your video card for 720p, deinterlace on
your MythTV box (if necessary) and just send everything to the HDTV set
as 720p over the DVI link.
Using 1080i on the MythTV output might work, but it seems like a less
traveled path (interlaced modes are rather unusual for computers, and
I've heard about all kinds of snags related to drivers, etc.), so I'd
rather stay away from that.
Well, if you already own a CRT set, then I guess you have to try and use
1080i.
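
If you do go the 1080i route, a commonly published 1080i modeline looks
like the sketch below (again the standard 74.25 MHz timings). Treat it
as a starting point only: whether your driver honors an interlaced mode
at all is exactly the kind of snag mentioned above.

  # Sketch only: 1920x1080 interlaced at 60 fields/sec, 74.25 MHz clock.
  # Interlaced output support varies a lot between drivers; verify first.
  ModeLine "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync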

Since I don't own (yet) an HD set, the above considerations led me to
decide to ignore CRT-based screens and just purchase something that's
pixel-based.
Probably not plasma (too expensive, plus it's vulnerable to burn-in) but
perhaps LCD or DLP (cheaper, and not vulnerable to burn-in at all...
well, LCD is, if tortured in a very nasty way, like displaying the same
static image for months, 24/7).
In addition, CRT screens are prone to burn-in, so that's one more reason
to avoid them.

-- 
Florin Andrei

http://florin.myip.org/


