[mythtv] re: Any graphics cards support interlaced TV-out?

Istvan Noszticzius noszti at selway.umt.edu
Sun Jul 13 21:58:28 EDT 2003


wd at pobox.com wrote on July 11 2003:

> I know a lot of users seem to be happy with the nVidia line of
> cards. Does the TV-out on these cards support "true" interlaced
> output? Or is my best bet to get some sort of external VGA -> TV
> converter that will retain the video interlacing.

 I'm not sure about external VGA->TV converters, but the NVidia card that
I have (GeForce4 MX 440) - and I think this is true for all NVidia cards -
does not support interlaced mode under Linux (the driver seems to be
broken). See these threads on NVnews.net's NVidia Linux forum; NVidia
claims they are working on it...
http://www.nvnews.net/vbulletin/showthread.php?threadid=10976
http://www.nvnews.net/vbulletin/showthread.php?threadid=11147
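
 For reference, interlaced output under XFree86 is requested with the
"Interlace" flag on a modeline. Something like the following (illustrative
only - the 13.5 MHz clock and 858x525 totals are standard NTSC, but the
sync positions would need tuning for your particular TV encoder) is what
the driver would have to honor:

```
ModeLine "720x480i" 13.5  720 736 799 858  480 486 492 525  Interlace -HSync -VSync
```

 With the current NVidia driver this kind of modeline reportedly gets
rejected or silently de-interlaced, which is the problem discussed in the
threads above.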

 Rant:

 Maybe Matrox cards have better support. It would be nice to have proper
interlaced output, as I capture with half NTSC resolution (x240) to avoid
the interlace issue. Interestingly, even with this I sometimes get
interlace-like artifacts in the form of a glitch in the video, as if the
wrong frame were inserted into the stream (off by one). I'm not sure
whether this is the encoder's fault or whether the capture card grabs the
"wrong" interlace field every now and then.
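
 The off-by-one symptom would be consistent with field weaving going
wrong somewhere in the chain. A rough sketch (illustrative only, not
MythTV's or the driver's actual code) of how two fields get woven into a
frame; if the capture hands back a field from the wrong moment, the
interleaved scanlines come from different instants and you get exactly
this kind of glitch:

```python
# Illustrative sketch: weave a top (even-line) field and a bottom
# (odd-line) field into one full frame. Each list element stands in
# for a row of pixels.
def weave(top_field, bottom_field):
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)     # even scanline
        frame.append(bottom_row)  # odd scanline
    return frame

# Fields captured at the same instant weave cleanly; a field delivered
# one slot late would pair rows from two different moments instead.
print(weave(["t0-row0", "t0-row2"], ["t0-row1", "t0-row3"]))
```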

 Of course this low resolution does not allow for very good capture
sharpness (although I get more bits this way for high-motion scenes, so at
MPEG4/1800kbps it is actually not that bad). Also, I have the interlace
problem when playing some DVDs (e.g. TV series or DVD extras that are not
progressively encoded) on the same system. I use xine, which has
de-interlacing options, but I would prefer not to use them, as they
sometimes produce quite visible artifacts.
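
 It's easy to see where the artifacts of the simpler de-interlacers come
from: a plain "bob" filter (sketched below in illustrative Python, not
xine's actual implementation) just line-doubles each field, so half the
vertical detail is thrown away on every frame:

```python
# Illustrative sketch of "bob" de-interlacing: build a full frame from
# a single field by duplicating each field line. Fast, no combing, but
# vertical resolution is halved - one source of visible artifacts.
def bob(field):
    frame = []
    for row in field:
        frame.append(row)
        frame.append(row)  # duplicate the field line
    return frame

print(bob(["a", "b"]))  # -> ['a', 'a', 'b', 'b']
```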

 Cheers,
 István





More information about the mythtv-dev mailing list