[mythtv-users] Going From ATI to NVIDIA TV out to get overscan

Raphael Pooser rpooser at gmail.com
Tue Feb 21 19:01:00 UTC 2006


This one is slightly complicated.  Basically I switched TVs on my 
mythbox.  I have switched TVs three times so far.  First I had an old 
20" GE TV with RF input only, and I had to feed my ATI card's S-Video 
output through an RF modulator.  The picture on that TV was awesome; 
it was better than the TV's own tuner.  The second TV was a Philips 
30" FlatTV LCD screen with VGA input.  That one was painless, since to 
Linux it was the same as hooking up a regular old monitor.

Now I've switched again, to a 27" Sony WEGA CRT TV with S-Video input.  
On this TV the picture is just terrible.  The image is so underscanned 
that I have a black strip at the bottom of the screen where the TV-out 
signal just isn't sending any picture.  When the cable signal goes 
straight into the TV, the image fills the whole screen, of course.  So 
in X I have a black strip at the bottom, maybe an inch thick, and at 
the top of any video I can see the little bit of garbage in the cable 
signal that the TV would normally overscan away.  The problem is that 
this Sony has far less overscan than my old GE, and unlike my LCD it 
has no adjustments to move the image around to compensate.  So the 
next step would be to adjust the overscan in the ATI drivers... but of 
course you can "adjust" overscan in the ATI drivers all you like and 
then watch as it has no effect whatsoever on the screen.
I recently downgraded to the 8.20 drivers for other reasons.  If 
anyone knows whether overscan is actually adjustable in 8.22, or 
whatever the latest drivers are, please let me know.
Or do I need to chuck this old dog and get an NVIDIA card instead, and 
if so, does anyone know which NVIDIA card is best for adjusting 
overscan?  This list seems to like the FX 5200 a lot.
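
From what I've read of the nvidia driver README, overscan on its 
TV-out is supposed to be tunable with a single option in the Device 
section, something like the sketch below.  I haven't tried it myself, 
and I believe it only works with certain TV encoder chips, so take the 
exact names and values as assumptions on my part:

    Section "Device"
        Identifier  "NVIDIA TV-out"
        Driver      "nvidia"
        Option      "TVStandard"  "NTSC-M"
        # Supposedly a value from 0.0 (no overscan) to 1.0 (maximum
        # overscan), and only honored on supported TV encoders --
        # that's my reading of the README, not something I've verified
        Option      "TVOverScan"  "0.7"
    EndSection

If someone with a 5200 can confirm that this actually pushes the 
picture out past the edges of the screen, that would pretty much 
settle it for me.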
Thanks,
Raphael

