[mythtv-users] Just let my TV de-interlace

Preston Crow pc-mythtv08a at crowcastle.net
Mon Oct 27 19:52:05 UTC 2008


> > Are there modelines for an nvidia card to do 1080i over DVI?

Everything that I've heard indicates that nVidia simply doesn't do
interlaced output correctly.  You get some weird tearing or other
problems because the output frequency isn't right.  Earlier drivers had
much worse problems.  If, however, you have an FX5200 card (AGP), and you
don't mind being unable to upgrade past kernel 2.6.22.xx and
xorg-server-1.3, then you can run the 8774 nVidia drivers, and
everything works perfectly.
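
For what it's worth, below is a minimal xorg.conf sketch of the sort of
setup I mean.  The 1080i modeline is the standard CEA-861/SMPTE 274M
timing; the nVidia-specific options (ExactModeTimingsDVI, UseEDIDFreqs)
are just the ones commonly suggested for forcing exact timings over DVI,
and whether they actually behave depends on the driver version:

  Section "Monitor"
      Identifier "TV"
      # 1920x1080 interlaced, 74.25 MHz pixel clock, ~60 fields/sec
      ModeLine "1920x1080_60i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync
  EndSection

  Section "Device"
      Identifier "nvidia0"
      Driver     "nvidia"
      # Ask the driver to use the modeline as given instead of EDID-derived timings
      Option     "ExactModeTimingsDVI" "TRUE"
      Option     "UseEDIDFreqs"        "FALSE"
  EndSection

  Section "Screen"
      Identifier   "Screen0"
      Device       "nvidia0"
      Monitor      "TV"
      DefaultDepth 24
      SubSection "Display"
          Modes "1920x1080_60i"
      EndSubSection
  EndSection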

> > This way I could just do away with any CPU wasted on de-interlacing,
> > or would there be a major downside to this that I'm not seeing?
> 
> Most TVs' deinterlacers are going to be inferior to MythTV's own
> methods

But my TV is interlaced, so any deinterlacing will just make the picture
worse (not to mention that my AMD 2500+ will run out of CPU if I
deinterlace).

Are there any cards that work with current drivers to do interlaced
output?  I'm thinking of upgrading my system, and it seems that nVidia
is out.


