[mythtv-users] VGA -SCART and interlace (2x)

Paul Gardiner lists at glidos.net
Sun Sep 20 17:26:36 UTC 2009


John wrote:
> Thanks for the reply.  What I was doing was moving between autodetect, 
> interlaced and progressive under the menu. The picture looks the same 
> with interlace (x2) and Progressive (no deinterlacer) set. Same effect
> when I choose none as the interlacer under the TV settings menu.
> 
> Looks like I need another graphics card,

I don't think you've quite got the point I was making. The best quality
you can get when playing PAL content to a PAL CRT TV is by using a PAL
video mode and turning off all scaling *and* deinterlacing. Hence "none"
is what you want really. With "none", though, the playback can be
unstable, jumping between two modes: one perfect, one with ghosts
on moving objects. The only purpose of "Interlaced x2" is to make
playback stable while achieving the same image quality as "none".
It never gives better quality than "none"; it just avoids the bad
mode that "none" can fall into.
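For reference, getting a true PAL video mode on X11 usually means defining an interlaced 720x576 modeline rather than letting the driver scale to a progressive mode. A commonly circulated 50 Hz modeline for VGA-to-SCART setups looks like the sketch below; the exact timing numbers are an assumption here and may need tweaking for your particular card and TV:

```
Section "Monitor"
    Identifier  "SCART-TV"
    HorizSync   15.6 - 15.7   # PAL line rate is 15.625 kHz
    VertRefresh 49 - 51       # 50 Hz field rate
    # 720x576 interlaced: 13.875 MHz pixel clock over 888 total
    # pixels gives 15.625 kHz lines; 625 total lines gives 25 Hz
    # frames, i.e. 50 Hz fields.
    ModeLine "720x576i" 13.875  720 744 808 888  576 580 585 625  -HSync -VSync Interlace
EndSection
```

With a mode like this active, each video field maps one-to-one onto a display field, so neither scaling nor deinterlacing should be needed in MythTV.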

If "none" works stably then you don't need "Interlaced x2". Just stick
with "none". How does your image quality compare with watching analogue
directly through the TV? If they are similar in terms of sharpness then
you probably already have the best quality you can get out of SD content.

If on the other hand MythTV playback looks blurred compared with
analogue then I suspect you've run into the nVidia interlaced-mode
problem.

> or a big LCD TV :-).

If you just wish to play back SD and you have a biggish, good-quality CRT
then you might not see any improvement from an LCD, especially in
terms of colour accuracy.

Cheers,
	Paul.


