[mythtv-users] interlaced vs not?

Bruce Markey bjm at lvcm.com
Sat May 1 16:18:03 EDT 2004


Chris Petersen wrote:
>>Wow.  They sure filled you full of it at Computer City.
> 
> 
> Apparently so....  Although I had just interpreted the monitor-type
> interlacing as the non-computer type:
> 
> interlaced: 
>      1. To connect by or as if by lacing together; interweave.
>      2. To intersperse; intermix: interlaced the testimony with
>         half-truths.

Interesting. So the Computer City explanation you were given
was 'interlaced' with half-truths ;-).

> But it still doesn't help me figure out if my source is interlaced or
> not...  

That's easy. NTSC || PAL == interlaced. For HDTV, NNNi (e.g. 1080i)
== interlaced, NNNp (e.g. 720p) == progressive scan (non-interlaced).
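The rule above can be sketched as a tiny helper (a hypothetical illustration, not anything in MythTV; the function name and error handling are my own):

```python
# Hypothetical helper applying the rule above: NTSC/PAL and "NNNi"
# format names are interlaced; "NNNp" names are progressive scan.

def is_interlaced(fmt):
    fmt = fmt.strip().lower()
    if fmt in ("ntsc", "pal"):
        return True                      # analog broadcast: always interlaced
    if fmt.endswith("i") and fmt[:-1].isdigit():
        return True                      # e.g. "1080i"
    if fmt.endswith("p") and fmt[:-1].isdigit():
        return False                     # e.g. "720p"
    raise ValueError("unknown format: %s" % fmt)

print(is_interlaced("1080i"))   # True
print(is_interlaced("720p"))    # False
```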

All television signals in the NTSC format are, by definition,
interlaced. This means that all analog NTSC signals that reach
your capture card are a specific arrangement of timed sync pulses
with half of the scan lines of the raster in a top field and the
other half in the bottom field. If the camera records the two
fields at a different point in time (BTW, not always the case),
and there is motion, when the two fields are combined into a
single frame, the edges of objects in motion will be in two
different places on alternating scan lines.
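That field-weaving effect is easy to demonstrate. Here is a minimal sketch (plain Python, nothing to do with MythTV's internals; the raster size and square position are made up) that captures the two fields of a moving square at different instants and weaves them into one frame:

```python
# Sketch: weaving two fields captured at different times produces
# "comb" artifacts on the edges of objects in motion.

WIDTH, HEIGHT = 16, 6

def raster_with_square(x_left):
    """Hypothetical 1-bit raster containing a 4-pixel-wide square."""
    return [[1 if x_left <= x < x_left + 4 else 0 for x in range(WIDTH)]
            for _ in range(HEIGHT)]

# The top field sees the square at x=4; the bottom field, recorded
# 1/60 s later, sees it at x=8.
top_field = raster_with_square(4)       # even scan lines (0, 2, 4, ...)
bottom_field = raster_with_square(8)    # odd scan lines (1, 3, 5, ...)

# Weave the two fields into a single frame, as a capture card does.
frame = [top_field[y] if y % 2 == 0 else bottom_field[y]
         for y in range(HEIGHT)]

for row in frame:
    print("".join("=" if p else "-" for p in row))
```

Alternating rows show the square's edge in two different places, which is exactly the jagged pattern in the "Square in motion" diagram below.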

The only way the moving picture will look normal is if the
fields are replayed at the exact same sync rate and format with
each scan line on the exact same scan line as the camera recorded.

This will not be the case with an Xv display window inside an X
display that is underscanned inside a TV-out signal to an NTSC
device. The biggest part of the problem is that if you set your
TV-out to 640x480, the signal is a real NTSC 525 lines with ~480
displayable, but the X display area probably runs from scan line
~10 through scan line ~470 so that it will fit inside the screen.
When you play back recorded video, scan line 8 is displayed somewhere
around scan line 17, for example. Because the recording and playback
aren't truly aligned, you would see the jagged edges.

==========       ==========              ----======----
==========           ==========          ----======----
==========       ==========              ----======----
==========           ==========          ----======----
==========       ==========              ----======----
==========           ==========          ----======----


Square         Square in motion   Deinterlaced square in motion

If a square is in motion, the image "Square" doesn't exist
anywhere in the NTSC signal and there is no way to 'fix' it. It
can only be covered up by blurring or blending or lowering the
image resolution in some way. In this example linearblend averages
adjacent scanlines so the rough edges can't be seen. However, this
blurs the overall image and there are two ghosts in each frame
which can be nearly as annoying as the jagged edges.
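A linear-blend style deinterlacer can be sketched as averaging each scan line with its neighbor (this is an illustration of the idea only, not MythTV's actual linearblend filter code):

```python
# Illustrative linear blend: average each scan line with the one
# below it. The comb artifact is smeared away, but so is detail.

def linear_blend(frame):
    """Return a new frame where line y is the mean of lines y and y+1."""
    out = []
    for y in range(len(frame)):
        below = frame[min(y + 1, len(frame) - 1)]  # clamp at bottom edge
        out.append([(a + b) / 2 for a, b in zip(frame[y], below)])
    return out

# Two woven scan lines with edges in different places...
comb = [
    [0, 0, 1, 1, 0, 0],   # top-field line: edge at column 2
    [0, 0, 0, 0, 1, 1],   # bottom-field line: edge at column 4
]
blended = linear_blend(comb)
print(blended[0])   # [0.0, 0.0, 0.5, 0.5, 0.5, 0.5] -- hard edge gone
```

The half-intensity values are the blur and ghosting described above: neither field's edge survives, but the jaggies no longer alternate line to line.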

BTW, all graphics cards that have TV-out to NTSC output an NTSC
compliant interlaced signal. If it were not interlaced, there would
be nothing but garbage on the TV screen. Apparently nVidia drivers
do not support sending SVGA signals synced for interlaced modes
on computer monitors that support interlaced signals. This may have
been misinterpreted as having something to do with interlaced
artifacts on tv-out because nVidia does not "do" interlace. This
is false. The NTSC output from nVidia cards is interlaced, and
driver support for SVGA interlaced modes will have no effect
whatsoever on recorded television playback over tv-out.

--  bjm

