[mythtv-users] [OT] 1080i

Andrew Gallatin gallatin at cs.duke.edu
Tue Mar 1 01:48:45 UTC 2005


I'm looking for some basic advice from others who are running at
1080i.  I've got a direct view CRT HDTV which takes DVI and component
input.  I'm trying to get 1080i working from my myth box (running FC3).

I've tried the following video cards over DVI with the following results:

o Radeon 9200SE: No video at all (even during POST, grub, and early
boot), but I can get DDC info, and the X server thinks it's talking to
the display.  I pulled this out of my desktop, where it normally drives
a Dell 2001FP via DVI.  It's odd that there is no video at all.
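
(For anyone who wants to check the same thing: all I did to confirm DDC
was grep the X log for the EDID block -- the exact message wording will
vary by driver, and the log path is the stock X.org one on FC3:

    grep -i edid /var/log/Xorg.0.log

I believe ddcprobe, from the kudzu package, can dump the same info from
the console, though I'm going from memory on the package name.)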

o Matrox G550: No video before X starts, just like the Radeon.  However,
X works in 1080i.  But xv doesn't work right: a 1080i clip is
displayed in a tiny (640x480?) window in the upper left corner with
strange distortions.  I tried mplayer -vo x11 on a 1080i clip, and it
looks wonderful, but my 3.0GHz P4 is just too slow to keep up at
1080i using plain x11.
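
(My guess is that the G550's Xv port simply has a maximum image size
smaller than 1920x1080, which would explain the shrunken window.  If
anyone wants to compare, this is roughly what I'm doing -- the clip
name is just a placeholder, and the xvinfo field name is from memory:

    xvinfo | grep -i "maximum XvImage size"
    mplayer -vo xv  clip-1080i.ts    # tiny, distorted window
    mplayer -vo x11 clip-1080i.ts    # looks right, but too slow

If xvinfo reports something like 1024x1024, that would settle it.)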

o Nvidia 5200FX: Video present during POST and early boot.

Using the nvidia proprietary drivers, 1080i shows the horrible
interlace bug that others have complained about.  A hand-crafted 720p
mode works, but I'm not at all happy with the quality of 1080i
material displayed at 720p.  It is nowhere near as good as the
integrated tuner, and nowhere near as good as the Matrox.  Plus it
just seems silly for myth to have to de-interlace and scale 1080i down
to 720p on a set whose native mode is 1080i.
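
(For reference, the hand-crafted 720p mode is basically the standard
CEA 720p60 timing; I'm quoting the numbers from memory, so double-check
them before trusting them:

    # 1280x720 @ 60 Hz, 74.25 MHz pixel clock
    ModeLine "1280x720" 74.25  1280 1390 1430 1650  720 725 730 750  +hsync +vsync
)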

I've been running myth using just xv, and it's fine with that, so I'd
be happy to give up xvmc if I thought there was a way to get the
open-source nv driver to do 1080i.  Has anybody tried that?  When I
try the nv driver I can't get anything resembling HD to work.  I
haven't gone as far as rebuilding it from source and peppering the X
server with printfs to see why it rejects modes (bad mode
clock/interlace/doublescan) that the other drivers liked just fine.
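
(In case anyone wants to reproduce this: the 1080i mode I've been
feeding the drivers is the usual SMPTE 274M timing, and the rejections
show up in the X log.  Mode name is arbitrary and the grep pattern is
only approximate:

    # xorg.conf Monitor section: 1920x1080 interlaced, 74.25 MHz pixel clock
    ModeLine "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +hsync +vsync

    # then see why the server dropped it:
    grep -i "not using\|bad mode" /var/log/Xorg.0.log
)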


Alternatively, the thread earlier this month suggests an Nvidia
6600GT should be able to do 1080i on its built-in component out.
But those cards all seem to have fans, and are expensive.

So this leaves me with the Audio Authority 9A60.  Does going out
the VGA port work around the nvidia interlace bug?  I've seen
this implied, but I don't think anybody has actually come out and
said it.   If this works, why are people (from the 6600GT thread)
so eager to ditch their 9A60?

Thank you for your help,

Drew
