[mythtv-users] DVI connection overscan

Raphael rpooser at gmail.com
Sat Apr 14 18:58:20 UTC 2007


Daniel Agar wrote:
>> Alex Malinovich wrote:
>>     
>>> I have my Myth box hooked up to my TV via a DVI-to-HDMI converter cable.
>>> The video card feeds DVI out, and the TV takes HDMI in. What's strange,
>>> however, is that I still get a fair amount of overscan on the TV. I had
>>> thought that in the case of DVI and newer connections the full signal
>>> resolution is sent to the TV as part of the signal so no overscan is
>>> needed.
>>>
>>> I've overcome this with Myth by manually scaling the window, but I'd
>>> prefer to not have to do that and to instead have the full screen
>>> properly displayed. I had heard that the nvidia-settings utility has
>>> overscan compensation, but I'm pretty sure that that's only when using a
>>> TV-out signal, not a direct DVI signal. (If that setting is still there
>>> with a DVI connection I certainly can't find it, and the interface
>>> really isn't all that complex.)
>>>
>>> So anyone know what's causing the overscanning and how to fix it?
>>>
>>>
>>>       
>> I also have a DVI-to-HDMI cable hooked up to my TV, and at first there
>> was an insane amount of overscan. By tweaking the modeline you can get
>> rid of some of it; I used xvidtune and found a modeline I liked. It
>> still doesn't eliminate the overscan completely - it seems that's either
>> impossible or just too hard for me to find the magic numbers. The other
>> option is to go into your TV's service menus; some sets let you adjust
>> the overscan there. After all, the overscan is the TV's doing, not your
>> computer's. However, I wouldn't recommend futzing with the service menus
>> - you can "brick" your TV if you do something wrong. I just accept the
>> overscan and set the Myth UI to take up 100% of the visible screen.
>> Works great, since all I ever see on that input is Myth anyway.
>>
>> Raphael
>>     
>
> I found the same problem when I first connected my GeForce FX 5200 to my
> Sharp Aquos through DVI->HDMI. I ended up using PowerStrip in Windows on
> another machine with the same video card. I started with the native
> resolution of my screen, then cropped the picture within PowerStrip, and
> it gave me a modeline suitable for use with X. Overall it works pretty
> well, except I've found that certain versions of the nvidia driver seem
> to ignore my custom modeline.
>
>   


I've tried the PowerStrip-in-Windows approach too, but had no luck. I was 
much confounded by the fact that the modeline PowerStrip gave me didn't 
behave the same at all once I put it into xorg.conf on Linux. That was 
when I was using an ATI card, though; I'm not sure whether it would work 
differently with an Nvidia card.
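For anyone trying this route, here's a rough sketch of what a custom 
modeline looks like in xorg.conf. The mode name and timing numbers below 
are illustrative only (a 1280x720 signal with the active area shrunk so 
the picture fits inside an overscanning TV), not values from my setup, and 
the ModeValidation tokens apply only to the proprietary nvidia driver:

```
# Illustrative xorg.conf fragment - mode name and timings are made up
# for this example, not taken from a real PowerStrip dump.
Section "Monitor"
    Identifier "TV"
    # A "shrunk" mode based on standard 720p timing (74.25 MHz clock,
    # 1650x750 total): the active area is reduced to 1216x684 so the
    # visible picture lands inside the TV's overscanned region.
    Modeline "1216x684_60" 74.25 1216 1336 1472 1650 684 687 692 750 +hsync +vsync
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "TV"
    DefaultDepth 24
    SubSection "Display"
        Modes "1216x684_60"
    EndSubSection
    # Some versions of the nvidia driver silently reject modes that
    # fail its EDID/pixel-clock checks, which may be why a custom
    # modeline gets ignored; relaxing validation can help.
    Option "ModeValidation" "NoEdidModes, NoMaxPClkCheck"
EndSection
```

Whether the driver actually honors the mode can be checked in 
/var/log/Xorg.0.log, which records each mode it validates or rejects.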
Raphael
