[mythtv-users] Struggling with Xwindows DVI to HDTV 1080i

Len Reed crunchyfrog at charter.net
Thu Dec 29 01:16:15 EST 2005


Steve Adeff wrote:
> On Tuesday 27 December 2005 18:38, Len Reed wrote:
> 
>>I've got an nvidia 6200 with DVI out connected to my Mitsubishi HDTV
>>(DVI to HDMI cable).  The TV does 720p and 1080i on HDMI: it's worked
>>from both the cable box and from a DVD player that does upscaling.
>>
>>I can get the TV to recognize that it's getting 1080i input from the
>>computer.  (The info on the screen says so.)  I can't get it to deal
>>with 720p for some reason.  I can get the TV to handle lower resolution
>>SVGA and XGA modes up to 1024x768 fine.
>>
>>With 1080i I get what is close to the twm screen, but there are two
>>problems:
>>1. The screen is greatly overscanned.  Perhaps 20% is not displayed.
>>2. The interlacing is off, or at least that's my guess.  Everything is
>>displayed twice, with one flickering image directly below another.  They
>>are close: the bar at the top of an xterm has its two images overlapping.
>>
>>I've tried every modeline I can find, and have tried two different
>>modeline calculators, but I can't get the two images to converge.  The
>>TV seems to be reporting things correctly to X (59-61 Vsync, reasonable
>>Hsync, etc.)  Telling X to ignore the TV's info doesn't help in any case.
>>
>>It seems like it should be easy enough to play with the vertical
>>blanking interval to fix this, and that I'm close.  But I'm guessing,
>>and I'm not making progress.  Is there a reasonable way to tweak the
>>modeline to iterate toward a solution here?
>>
>>Details:
>>Fedora core 4, x86_64
>>Athlon-64x2 (3800+)
>>nvidia 6200 card
>>latest nvidia X driver, compiled on the machine
>>
>>Thanks,
>>Len
> 
> 
> newer nvidia drivers don't support interlace modes over DVI.

Seriously??  I sure didn't see that in their README.  After my original 
posting, I bought (from somewhere that I can return it) a 6600 card that 
has component video out, and it works fine.  (It exhibited exactly the 
same problem over DVI, though.)  The card with the HDTV encoder is a 
satisfactory solution, if not an ideal one, both technically and in 
cost.  It sure seems stupid to have the card encode analog HDTV only for 
the TV to turn it back into digital for the DLP display, when I should 
be able to do it all over DVI.  Certainly the cable box's 1080i over DVI 
is a bit clearer than its component video.  Is there any way that the 
open source (nv) driver will work at 1920x1080i over DVI, or is it a 
waste of more time to even try?
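
(For what it's worth, the modeline that should match broadcast 1080i is 
the standard SMPTE/CEA timing below: 74.25 MHz pixel clock, 2200 pixels 
total per line, 1125 lines total per frame, which works out to a 33.75 
kHz horizontal rate and 60 fields/sec.  Note that X modelines for 
interlaced modes give the vertical numbers for the whole frame, and the 
"Interlace" flag makes the server halve them per field.  I'm writing 
this from memory, so check the syntax against the xorg.conf man page:

    ModeLine "1920x1080_60i"  74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync

If the driver simply refuses interlaced modes over DVI, no modeline will 
help, but trying the standard timing at least rules the numbers 
themselves out as the culprit.)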
> 
> Overscan won't change what you see for TV (ie. the overscan is the same 
> whether from your computer or your cablebox), so change the gui overscan 
> settings for your TV to fix the gui. if you want to fix tv video, SVN lets 
> you adjust overscan, or find the service menu info for your tv to lessen its 
> overscan.

OK, I haven't laid mythtv into this mix yet, so I'll worry about it 
then.  The component video from the new card at 720p and 1080i exhibits 
only small overscan, about what I'd want.
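
(One thing I still mean to try on the component output: the nvidia 
README describes Device-section options for the TV encoder, including an 
overscan knob.  Again from memory -- the option names should be checked 
against the README for the driver version in use, and the Identifier 
below is just a placeholder of my own:

    Section "Device"
        Identifier "nvidia-6600"
        Driver     "nvidia"
        Option     "TVStandard" "HD1080i"
        Option     "TVOverScan" "0.0"
    EndSection

As I read it, TVOverScan takes a decimal from 0.0 to 1.0 and only adds 
overscan on the card side; it can't remove whatever the TV itself 
applies.)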

While you're listening, Steve, you were complaining about the VIA 
IEEE1394 chipset.  I can't find anything but VIA.  Fry's had a dozen 
cards, all VIA.  Did you have to mail order to get something with the TI 
chip?  Do you have a recommendation?

Thanks again for the help.

Len

