[mythtv-users] nVidia interlace problem

Paul Gardiner lists at glidos.net
Thu Nov 13 15:39:47 UTC 2008


Alex Butcher wrote:
> On Thu, 13 Nov 2008, Paul Gardiner wrote:
> 
>> I've seen some posts about problems with interlacing on nVidia
>> cards, but I've been unable to pick up exactly what the problem
>> is. Please can someone give a brief explanation? Is it only
>> a problem with TV out, or would it cause problems with the VGA-to-SCART
>> trick? Are there any signs of it being fixed?
> 
> I'm using a homemade VGA-to-RGB-SCART cable that generates composite sync from VGA
> sync signals. My card is a generic GeForce 4 MX440 model, and thus I'm
> forced to use nVidia's legacy drivers (currently 96.43.07). Normal graphical
> output is properly interlaced using a 720x576@50Hz PAL modeline; things like
> the MythTV UI are razor-sharp.

Yeah, I know how good it can look. I have an old Radeon 9000 working in
this mode perfectly. It's just too big to fit in a low-profile case (I
wish I could fold it! :-) ).
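
For anyone trying the same trick, a typical interlaced PAL modeline looks
something like this (numbers from memory, so treat them as illustrative
rather than gospel):

  ModeLine "720x576@50i" 13.875 720 744 808 888 576 580 585 625 -hsync -vsync interlace

The 13.875MHz pixel clock over an 888-pixel by 625-line total gives the
15.625kHz line rate and 50Hz field rate that PAL expects.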

> Output to Xvideo (e.g. MythTV playback) appears to have every other scanline
> doubled, so I need to use the linearblend deinterlacer in order to get
> correct shapes (e.g. BBC Four logo is excessively aliased without
> deinterlacing).

Damn! Game over, I guess. I don't quite understand the description: in
what sense doubled? You mentioned the legacy drivers - have you heard
anything about this being cured in later drivers? I was looking at using
an onboard 7050PV, so maybe I'd be able to use later drivers.
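
If I follow what linearblend does - and this is just my rough understanding
of the idea, not MythTV's actual code - it averages each scanline with the
one below it to smear out the combing, roughly:

  /* Sketch of a linear-blend deinterlace on a luma plane.
   * Not MythTV's implementation, just the idea: average each
   * line with the line below to hide inter-field combing. */
  #include <stdint.h>

  void linear_blend(uint8_t *luma, int width, int height)
  {
      for (int y = 0; y < height - 1; y++) {
          uint8_t *cur  = luma + (size_t)y * width;
          uint8_t *next = cur + width;
          for (int x = 0; x < width; x++)
              cur[x] = (uint8_t)((cur[x] + next[x]) / 2);
      }
  }

Which of course throws away the very vertical detail that going interlaced
was meant to preserve.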

That must be so frustrating. So close and yet so far.

Cheers,
	Paul.


