[mythtv-users] nVidia interlace problem

Paul Gardiner lists at glidos.net
Tue Nov 18 11:46:05 UTC 2008


Alex Butcher wrote:
> On Thu, 13 Nov 2008, Paul Gardiner wrote:
> 
>> I've seen some posts about problems with interlacing on nVidia
>> cards, but I've been unable to pick up exactly what the problem
>> is. Please can someone give a brief explanation? Is it only
>> a problem with TV out, or would it cause problems with the VGA-to-SCART
>> trick? Are there any signs of it being fixed?
> 
> I'm using a homemade VGA-to-RGB-SCART that generates composite sync from VGA
> sync signals. My card is a generic GeForce 4 MX440 model, and thus I'm
> forced to use nVidia's legacy drivers (currently 96.43.07). Normal graphical
> output is properly interlaced using a 720x576 at 50Hz PAL modeline; things like
> the MythTV UI are razor-sharp.
>
> Output to Xvideo (e.g. MythTV playback) appears to have every other scanline
> doubled, so I need to use the linearblend deinterlacer in order to get
> correct shapes (e.g. BBC Four logo is excessively aliased without
> deinterlacing).

I've now tried VGA-to-SCART with an FX5200 under MiniMyth, which
currently uses the 169.12 drivers, and I get exactly the same problem.
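
For reference, the interlaced PAL mode behind these VGA-to-SCART setups is
the standard recipe below; the exact porch timings vary a little between
write-ups and the Identifier is just a placeholder, so treat it as a sketch
rather than exactly what either of us is running:

    Section "Monitor"
        Identifier  "SCART-TV"
        HorizSync   15.0-16.0
        VertRefresh 49.0-51.0
        # 13.875 MHz pixel clock / 888 total pixels = 15.625 kHz line rate;
        # 625 total lines, interlaced = 25 Hz frames, 50 Hz fields (PAL)
        ModeLine "720x576PAL" 13.875 720 744 808 888 576 581 586 625 -HSync -VSync Interlace
    EndSection

The important bits are the 15.625 kHz line rate and the Interlace flag. The
mode itself is clearly fine - the UI is razor-sharp - so it's only playback
through Xv that loses the interlacing.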

> Configuring MythTV to use OpenGL or plain X11 (NO_XV=1 /usr/bin/mythfrontend)
> output for playback results in correct interlaced output (i.e. as sharp as
> the UI), but performance suffers so badly on a P4 2.53GHz machine, even with
> SD material, as to be unusable. mplayer using OpenGL output is usable.

Same here. I set up a playback profile using ffmpeg decoding with OpenGL
video output. It's possibly the best picture quality I've seen out of
MythTV yet, but the CPU can't quite keep up (I think it's a 2.2GHz
socket-939 Athlon 64).
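
If anyone else wants to compare notes, the two quick checks boil down to
something like this (the recording path is only an example):

    # Make mythfrontend skip XVideo so playback falls back to X11/OpenGL output
    NO_XV=1 /usr/bin/mythfrontend

    # mplayer's OpenGL output on the same machine remains watchable
    mplayer -vo gl /path/to/recording.mpg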

This is so frustrating. So what do we do? Does anyone know whether this
was ever fixed in any driver version, and if so which cards/chips that
restricts us to? Or is it best just to avoid nVidia and stick to ATI or
Intel?

Cheers,
	Paul.


