[mythtv-users] Help with xorg.conf

Michael T. Dean mtdean at thirdcontact.com
Thu Dec 22 16:18:47 EST 2005

Allan Wilson wrote:

> I do have a VGA input on my TV but it will not take up the full screen 
> if you use it.

It will accept a signal over VGA, but because *all* TVs (including 
LCDs/DLPs/SXRDs/...) have been designed to include overscan (i.e. if 
you display 1920x1080 pixels in a 1:1 pixel-for-pixel format, part of 
the image will be displayed "off-screen"), the manufacturer probably 
took this into account in their setup.  Here, I'm guessing your TV is 
like mine: because losing the top/bottom/sides of a computer screen 
(which contain Start bars/Quick-Launch toolbars/Docks/Task 
bars/whatever your WM calls them) makes using a typical computer 
difficult (and annoying), the manufacturer probably designed all VGA 
input to be scaled.

My TV takes any input from the VGA port and scales it to 95% of the 
visible display size.  So, I just tell the TV to scale the image to take 
up the full screen (including overscan since I'm using a Myth box and 
not Windows with the Start bar).  This means I have very close to 1:1 
pixel mapping and full screen/resolution.  The only tradeoff is overscan 
(but of the three--no overscan, 1:1 pixel mapping, and full 
screen--you can only have two, so you have to choose which to give up).
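The overscan arithmetic above is easy to sketch (the 5% figure here is just a common ballpark for overscan, not a measured value for any particular TV):

```python
# Back-of-the-envelope overscan arithmetic (a sketch; the 5% figure is
# a typical ballpark, not a measurement of any specific set).
width, height = 1920, 1080
overscan = 0.05  # fraction of each dimension pushed past the visible edge

hidden_w = round(width * overscan)   # pixels lost horizontally
hidden_h = round(height * overscan)  # pixels lost vertically
visible = (width - hidden_w, height - hidden_h)

print(f"hidden: {hidden_w}x{hidden_h}, visible: {visible[0]}x{visible[1]}")
# -> hidden: 96x54, visible: 1824x1026
```

So at a 5% overscan you lose roughly 96 columns and 54 rows of a 1080p image, which is why a TV that pre-shrinks VGA input to ~95% ends up close to 1:1 again once you tell it to scale back out to the full screen.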

> Referring to what you were talking about, Michael, should I get a cable 
> to convert the VGA to something else?

Nope.  Plain VGA.

> If so what input should I shoot for?

VGA.  ;)

> I do like the idea of using the VGA output anyhow b/c every computer 
> already comes with this.


> Even if I use the VGA out won't I still have to tell it the specs for 
> my Display with a modeline since it is not standard?

If you plug in a VGA cable (either a cheap one or some high-end ones), 
the digital display will be able to transmit EDID information, which X 
can use to choose the appropriate display size.  (Some cables--even some 
high-end cables--block the EDID info.)  Therefore, you can just plug it 
in and it should work--if not, try another VGA cable (if you have one 
handy, but don't buy one just for testing) or specify a modeline.  If 
you're using some fancy Linux distribution, it probably has all the 
tools you need to configure X (i.e. just mark the appropriate checkboxes 
for the resolutions you want and it computes the modelines).
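If you do end up specifying things by hand, the relevant pieces live in xorg.conf.  This is only a sketch of the shape (identifiers are illustrative, and the sync ranges and modeline values must come from your TV's manual or a modeline calculator--do not copy these placeholders):

```
Section "Monitor"
    Identifier  "TV"
    # EDID normally supplies these; override only if autodetection fails.
    # HorizSync   <range from your TV's manual>
    # VertRefresh <range from your TV's manual>
    # A custom modeline can be generated with the gtf(1) utility, e.g.:
    #   gtf 1920 1080 60
    # Modeline "1920x1080_60" <values from gtf or your TV's manual>
EndSection

Section "Screen"
    Identifier   "Screen0"
    Monitor      "TV"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1920x1080_60"
    EndSubSection
EndSection
```

The distro GUI tools mentioned above are just writing out sections like these for you.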

IIRC, my TV tells the computer it will accept a 1280x1024 or an 800x600 
image, but the manual lists several other (normal computer) modes 
(including 640x480 and 1024x768) all at 60Hz refresh, so any "standard" 
modeline for 1280x1024 at 60Hz would work.  However, since I wanted full 
1920x1080, I specified my own modeline (technically, John P. Poet's 
modeline--thanks, John ;).  To find yours, check out that avsforum link 
I gave you.

> Steve, I have seen where some people are using different resolutions 
> for different content, but how do you control this? Also, if you are 
> using Myth on a widescreen display, what do you use for the theme? 
> Thanks for the help; I think I am starting to understand a little better.

IMHO, you don't want to use Xrandr.  With Xrandr, Myth can select the 
best available mode (from those you've configured) for displaying your 
video: Myth switches X to a different video mode and it's up to the TV 
to scale the image--however it decides is 
appropriate.  I am a big believer in being able to control how things 
are scaled (i.e. let Myth recognize a 4:3 video I recorded with my 
PVR-x50 and put black bars on the side while allowing me to change the 
aspect ratio /through Myth/ when appropriate--i.e. letterboxed video in 
4:3 content can be displayed full screen by using 16:9 zoom--instead of 
having to set my TV's scaling mode to get the display I want).  Power to 
the people and all...  :)
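For completeness, if you do go the Xrandr route despite the above, Myth can only switch among modes X already knows about, so every resolution you want has to be listed in the Display subsection (a sketch; each mode name must match a modeline X actually has, whether from EDID or specified by hand):

```
SubSection "Display"
    Depth 24
    # The first mode is the default; Myth/XRandR can switch among the rest.
    Modes "1920x1080_60" "1280x720" "800x600"
EndSubSection
```

Any mode missing from that list simply won't be offered to Myth, no matter what the TV could accept.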

You may be able to argue for using a component/DVI connection to send 
video at the video's native resolution, but doing so means you can't do 
1080p60 (since only the VGA port can accept that), so your GUI (and any 
1920x854 at 24fps progressive high-def movie trailers) will "suffer" (not 
too bad for the GUI--you could use 1280x720 for it, but it will affect 
your ability to do the movie trailers justice).

However, based on what I've seen of the abilities of the ATI Xilleon 
processor (which my TV uses and yours probably does, too), I still think 
you're better off doing your own scaling.  The Xilleon is great at 
serving its intended purpose--real-time decoding of high-def video--but 
leaves much to be desired in the scaling arena...
