[mythtv-users] Just let my TV de-interlace

Tom Dexter digitalaudiorock at gmail.com
Sat Nov 1 15:14:28 UTC 2008


On Mon, Oct 27, 2008 at 3:52 PM, Preston Crow
<pc-mythtv08a at crowcastle.net> wrote:
>> > Are there modelines for an nvidia card to do 1080i over DVI?
>
> Everything that I've heard indicates that nVidia simply doesn't do
> interlaced output correctly.  You get some weird tearing or other
> problems because the output frequency isn't right.  Earlier drivers had
> much worse problems.  If, however, you have an FX5200 card (AGP), and you
> don't mind being unable to upgrade past kernel 2.6.22.xx and
> xorg-server-1.3, then you can run the 8774 nVidia drivers, and
> everything works perfectly.
>
>> > This way I could just do away with any CPU wasted on de-interlacing,
>> > or would there be a major downside to this that I'm not seeing?
>>
>> Most TVs' deinterlacers are going to be inferior to MythTV's own
>> methods
>
> But my TV is interlaced, so any deinterlacing will just make the picture
> worse (not to mention that my AMD 2500+ will run out of CPU if I
> deinterlace).
>
> Are there any cards that work with current drivers to do interlaced
> output?  I'm thinking of upgrading my system, and it seems that nVidia
> is out.
>

I feel your pain.  I have a Hitachi rear-projection CRT and am forced
to deinterlace everything.  nVidia has never acknowledged that this
bug exists and apparently never will, despite endless reports of it on
their forums over the years.  The second there's a good alternative
I'll go with it and never buy another nVidia product again.  They're
responsible for the only significant flaw in a rather expensive HTPC
system, and I'm not happy about it at all.  If there are any
alternatives to nVidia for interlaced output via DVI I'd love to know
about them...but as far as I know there aren't.
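
For what it's worth, the modeline usually quoted for standard SMPTE
1080i60 timing looks like this in the Monitor section of xorg.conf
(this is just the textbook timing with an arbitrary identifier, not
something I've personally gotten current drivers to honour):

  Section "Monitor"
      Identifier "HDTV"
      # 1920x1080 interlaced, 74.25 MHz pixel clock, 60 fields/sec
      ModeLine "1920x1080_60i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync
  EndSection

You'd then list "1920x1080_60i" on the Modes line of your Screen
section.  Whether the driver actually outputs it at the right
frequency is the whole problem.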

There's no way I'd run an ancient driver/kernel combination to get
around it either.  It's a moot point anyway, as my frontend has PCI
Express only, which rules out all but the newer chipsets.  The best
I've been able to do is run with the patch that allows bob (2x)
deinterlacing with 1080i output, which actually looks very good, but
it'd certainly be better to have proper interlaced output in the first
place, not to mention the huge difference in CPU usage.

I believe you're 100% correct about the output frequency.
nvidia-settings always reports 60.05 Hz.  In my case this produces a
picture that appears perfect but seems to drift out of sync at regular
intervals, causing awful tearing.
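
For anyone who wants to check what their own card is actually putting
out, the two quick checks I know of are the RefreshRate attribute in
nvidia-settings and the active mode reported by xrandr (assuming your
driver version exposes them; the exact output varies):

  nvidia-settings -q RefreshRate
  xrandr | grep '\*'

On mine that's where the 60.05 shows up.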

Maybe we should all start hammering the nVidia Linux forum on a daily
basis until they simply can't ignore this anymore.  Hell...if they'd
at least admit it's a bug they won't fix, I'd give them more credit
than I do now.

Tom

