[mythtv-users] when to deinterlace?
Cory Papenfuss
papenfuss at juneau.me.vt.edu
Sun Feb 5 14:23:41 UTC 2006
> - What is deinterlacing (in a paragraph or two, and I'm happily referred
> to one of these tomes for more detail)
> --- What's a good way to recognize interlacing artifacts?
> --- Is there any objective way to measure them?
> - Here's when and where you need it w.r.t. MythTV
> --- What kinds of content benefit from it?
> --- Do you need it if you're running an SD CRT TV?
> --- How about HD?
> --- How about SD with LCD?
> - Where/how do you turn on and/or tune de-interlacing?
> --- In MythTV?
> --- In your TV out graphics card?
> --- How do these settings interact, if at all?
> --- Does it interact with other settings, like XvMC?
>
> It sure would be *wonderful* if someone who really understands this
> stuff could write it up -- or perhaps at least generate some beginnings
> of such a document on the wiki page. I'm not really moaning about this,
> and it's not really a surprise to me. But if someone who happens to
> know this subject happens to read this and have a little time to put
> into it, I think it would benefit a large percentage of new MythTV
> adopters immensely.
>
I don't have time to wiki it up, but I will say a few things.
- Interlacing is not necessarily bad. It only *becomes* bad when there
are artifacts generated from it. If you are watching the video on an
inherently progressive-scan device (e.g. computer) and you see interlacing
"artifacts" such as the "mouse-teeth," you are NOT seeing artifacts. You
are seeing what interlacing looks like. Many of the "artifacts" that
people end up "solving" with MythTV's deinterlacers are really due to
improper signal paths/scaling/syncing; the deinterlacing is a band-aid
that attempts to mask those flaws. Also, except in special cases rarely
seen on TV broadcasts (e.g. 3:2 pulldown material, where the original
progressive frames can be recovered exactly), deinterlacing is a
modification of the original signal. The interlacing is *SUPPOSED* to be
there, since that's the way it was taped.
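To make the "mouse-teeth" point (and the earlier question about an
objective measure) concrete, here's a rough numpy sketch. This is my own
illustration, not anything from MythTV's code; the field order and the
line-difference metric are just assumptions for the example:

import numpy as np

def weave(top_field, bottom_field):
    # Weave the two fields into one frame.  This *is* the original
    # interlaced signal; if the scene moved between the two field times,
    # a progressive display shows "mouse-teeth" here.  That is what
    # interlacing looks like, not an artifact.
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field       # even lines from the first field
    frame[1::2] = bottom_field    # odd lines from the second field
    return frame

def bob(field):
    # "Bob" deinterlacing: line-double a single field into a full frame.
    # The comb disappears, but half the lines are now made up --
    # deinterlacing modifies the original signal.
    return np.repeat(field, 2, axis=0)

def comb_metric(frame):
    # One crude objective measure of combing: mean absolute difference
    # between neighbouring lines, which come from different fields.
    f = frame.astype(np.int32)
    return float(np.mean(np.abs(f[1:] - f[:-1])))

Running comb_metric() on a woven frame of a moving scene gives a big
number (the teeth); on a still scene it mostly just measures ordinary
vertical detail, which is part of why a single "artifact" threshold is
hard to pick.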
- Much of the misinformation about deinterlacing and scaling has to do
with tvout on video cards. The picture that comes out the s-vid port on a
VGA card is *NOT* the same signal that comes out the VGA/DVI port. It has
been processed through the tvout portion of the card. It has usually been
horizontally scaled (720->800, over/underscanned), vertically-scaled
(480->600, over/underscanned), temporally-scaled (75->59.94), color-scaled
(Vivid, Xv YUV interpretation, etc), and possibly nonlinearly processed
under the title "deinterlacing" among many other things. That processing
is often proprietary, hidden, undocumented, and unchangeable. Even when
it is an openly accessible chip on the card, there's often little you can
do to adjust things. What you see out the s-vid port is only loosely
related to what you see on the VGA screen.
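Just to put a number on the temporal-scaling step: the framebuffer and the
NTSC encoder run at unrelated rates, so some refreshes never leave the
s-vid port and others go out twice. A toy sketch follows (75 Hz refresh
and nearest-neighbour resampling in time are my assumptions; real encoders
do something proprietary and undocumented):

FB_HZ = 75.0                      # assumed VGA refresh rate
NTSC_FIELD_HZ = 60000.0 / 1001.0  # ~59.94 fields/sec out the s-vid port

def source_refresh_for_field(n):
    # Which framebuffer refresh is closest in time to output field n?
    # (Nearest-neighbour resampling in time; purely illustrative.)
    t = n / NTSC_FIELD_HZ
    return round(t * FB_HZ)

print([source_refresh_for_field(n) for n in range(20)])
# Refreshes 2, 7, 12, ... are never encoded at all, and nothing on the
# VGA/DVI output ever hints at it.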
- Many of the current deinterlacing and quality issues with new, non-CRT
TVs come from legacy cruft engineered into them. LCD/Plasma/DLPs do not
have the same color gamut, gamma, analog-signal, or interlacing
requirements as old-school NTSC CRTs. They have, however, been phonied-up
to *look* like
CRTs as far as the input signal is concerned. All of that processing
(same as above) is proprietary, hidden, undocumented, and mostly
unchangeable. Even when you drive them digitally via DVI ports and such,
they generally apply funky processing to the signal, so it is no longer
WYSIWYG.
None of these issues are fundamental. They are an artifact of the
consumer-quality crap that we are all using to connect the computey-thingy
to the movie-watchy-thingy. Short of designing our own stuff outright,
we'll all be second-guessing what processing is *actually* going on.
</rantmode off>
-Cory
--
*************************************************************************
* Cory Papenfuss *
* Electrical Engineering candidate Ph.D. graduate student *
* Virginia Polytechnic Institute and State University *
*************************************************************************