[mythtv-users] Re: Resolution, screen size, and other related 'size' question

Michael T. Dean mtdean at thirdcontact.com
Sat May 21 15:54:34 UTC 2005


Chad wrote:

>On 4/28/05, Chad <masterclc at gmail.com> wrote:
>  
>
>>I see posts and FAQ's and specs that display different numbers and I'm
>>wondering what one means over the other.
>>
>>For example:
>>
>>I have a Samsung DLP Monitor that has the following "specs":
>>
>>It's a 61" 16x9 TV.
>>
>>It has a resolution of 1280x720
>>
>>It supports the following formats:  480i, 480p, 720p, 1080i
>>
>>
>>Any generic explanation would be really nice.
>>
>>2 specific questions I have:
>>In the MythTV recording profiles, to 'get the most' out of my TV, what
>>resolution should I use to record?
>>    
>>
For recording HDTV, you get the resolution at which the program is 
broadcast.  Basically, HDTV tuners receive an MPEG-2 stream and that 
stream is saved to disk as-is, so no processing is done on it.  This 
is ideal since it means the quality of the recording is the same as 
the quality of the broadcast.  For a 720p broadcast, you get a 720p 
recording.  For a 1080i broadcast, you get a 1080i recording.  (Note 
that there are actually 18 formats defined in the ATSC specification, 
although the two just mentioned are the most commonly used.  See 
http://www.hdtvprimer.com/ISSUES/what_is_ATSC.html for more info.)
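
For reference (this mapping is just my own summary, not something 
MythTV defines), the formats your TV's spec sheet lists correspond to 
these frame sizes:

    # Common ATSC broadcast formats and their frame sizes.
    # 'i' = interlaced, 'p' = progressive.  The full ATSC spec defines
    # 18 combinations of size, scan type and frame rate; these are the
    # ones you will actually run into.  480-line channels may also be
    # broadcast at 640x480.
    ATSC_FORMATS = {
        "480i":  (704, 480),
        "480p":  (704, 480),
        "720p":  (1280, 720),
        "1080i": (1920, 1080),
    }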

For recording SDTV, you have options.  Since SDTV is an analog format, 
it specifies only a number of lines (480 for NTSC).  Each of these 
lines is analog (i.e. continuous), so there's no inherent horizontal 
resolution.  To make the content digital, your tuner card must 
"sample" each line along its length, picking the color at each sample 
position that most closely reproduces the original signal.  As you 
might suspect, taking only two samples along the horizontal axis would 
give you a very bad picture: reproducing each half of the line with a 
single color gives you 2 pixels per line, which at 480 lines is only 
960 pixels for the whole frame--very bad quality.  However, since the 
line is analog, we can sample it as many times as desired.  The more 
samples, the better the quality--up to a certain point.  Once we hit 
that ceiling, adding more pixels no longer improves picture quality 
(quality plateaus, assuming we continue to increase the bitrate along 
with the resolution).  Therefore, the ideal resolution is the one at 
which you've just hit the ceiling--it provides maximum quality at the 
minimum required bitrate.
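
To put rough numbers on that trade-off, here's a quick back-of-the-
envelope sketch in Python (the 12 bits/pixel figure assumes 4:2:0 
sampling, and the sample counts are just illustrative):

    # Total pixels per digitized NTSC frame is simply
    # (horizontal samples per line) x (visible lines).
    LINES = 480          # visible NTSC lines
    BITS_PER_PIXEL = 12  # 4:2:0 YUV, as used by MPEG-2 capture cards

    for samples_per_line in (2, 352, 480, 640, 704, 720):
        pixels = samples_per_line * LINES
        frame_kb = pixels * BITS_PER_PIXEL / 8 / 1024  # uncompressed frame
        print(f"{samples_per_line:>4} samples/line -> {pixels:>7} pixels/frame"
              f" (~{frame_kb:.0f} KB uncompressed)")

Past your hardware's ceiling, those extra pixels (and the bitrate 
needed to carry them) buy you nothing.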

The ceiling is different for different types of hardware.  There are 
many factors involved in determining this maximum, but basically, you 
hit it when you push your hardware to its limit.  For my PVR-{2,3}50's, 
I can't tell a difference in quality above 480x480 (assuming an 
appropriate bitrate).  However, I record at 720x480 simply because the 
DVD specification supports only 720x480, 704x480 (cropped--i.e. no 
borders/rainbow edges on the video), 352x480, and 352x240 for NTSC 
(the vertical numbers are 576 and 288 for PAL).  Therefore, by 
recording at 720x480, I get the maximum picture quality (which I 
already reach way back at 480x480) and can burn to DVD without 
scaling/re-encoding.  I have to use a slightly higher bitrate, but 
hard drive space is cheap compared to the time/effort required to 
scale the video after recording (even considering how few DVD's I 
make :).
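
If you ever want to sanity-check a capture resolution against those 
DVD-legal frame sizes, something like this works (the lists come 
straight from the sizes above; the function name is made up):

    # DVD-Video legal MPEG-2 frame sizes, keyed by TV system.
    DVD_RESOLUTIONS = {
        "NTSC": {(720, 480), (704, 480), (352, 480), (352, 240)},
        "PAL":  {(720, 576), (704, 576), (352, 576), (352, 288)},
    }

    def dvd_compliant(width, height, system="NTSC"):
        """True if a recording at width x height can go to DVD without
        rescaling (re-encoding may still be needed for bitrate reasons)."""
        return (width, height) in DVD_RESOLUTIONS[system]

    print(dvd_compliant(720, 480))   # True  - burn as-is
    print(dvd_compliant(480, 480))   # False - would have to be rescaled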

>>What resolution should I have in my xorg.conf?
>>    
>>
Ideally, since your TV has a native resolution of 1280x720, you would 
use that resolution.  Then, your TV doesn't have to scale the video and 
you get a pixel-for-pixel representation of the output, giving you the 
highest quality possible.  (Note that although some (older/cheaper) TV's 
have a native resolution of 1280x720, they may not be able to accept 
input at that resolution from a computer.  If that's the case for yours, 
you would probably get the best results using the maximum resolution 
allowed.  However, this is most likely not the case with your Samsung.)
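
For what it's worth, a 1280x720 mode in xorg.conf might look roughly 
like this (a sketch only--the modeline is the standard 720p/60 timing 
and the identifiers are placeholders; check your TV's manual and your 
driver's documentation before relying on it):

    Section "Monitor"
        Identifier  "Samsung-DLP"
        # Standard 1280x720 @ 60 Hz timing, 74.25 MHz pixel clock
        Modeline    "1280x720" 74.25 1280 1390 1430 1650 720 725 730 750 +hsync +vsync
    EndSection

    Section "Screen"
        Identifier  "Screen0"
        Device      "Videocard0"     # must match your existing Device section
        Monitor     "Samsung-DLP"
        DefaultDepth 24
        SubSection "Display"
            Depth   24
            Modes   "1280x720"
        EndSubSection
    EndSection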

>>Related question:
>>I have both VGA and DVI inputs on the TV.  I am wondering if running a
>>vga to vga cable from my videocard to the input will transfer an HD
>>stream, or if I HAVE to get a videocard with a DVI output (not that
>>it's that big of a deal, it'd just be nice to save ~30 bucks ;) ).
>>    
>>
VGA cables can carry significantly higher resolutions than those 
required for HDTV output.  However, VGA cables carry an analog signal.  
Since your computer's video card produces a digital image, it must use a 
digital to analog converter to send that image out the VGA port.  In the 
case of a CRT, this is not a problem because the CRT also uses an analog 
signal, so there's one conversion that must happen anyway.
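
Just to show how much headroom there is, here's some rough arithmetic 
(the 25% blanking overhead and the 350-400 MHz RAMDAC rating are 
ballpark figures, not exact specs):

    # Approximate pixel clock needed for a mode, assuming ~25% of each
    # frame period is blanking (sync/porch) overhead.
    def pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
        return width * height * refresh_hz * blanking / 1e6

    for mode in ((1280, 720, 60), (1920, 1080, 30), (1600, 1200, 85)):
        print(mode, f"~{pixel_clock_mhz(*mode):.0f} MHz")

    # Typical VGA RAMDACs of this era are rated around 350-400 MHz, so
    # even 1080i (1920x1080 at 30 full frames/sec) uses only a fraction
    # of what the analog output can handle.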

LCD monitors and digital TV's, on the other hand, need a digital 
signal.  Therefore, if you use a digital DVI connection (i.e. DVI-D or 
the digital side of DVI-I, which carries both digital and analog 
signals), the video card outputs the image digitally and the display 
uses the exact image created by the video card.  If, instead, you use 
a VGA connection to the LCD or digital TV, your video card does a 
digital to analog conversion, and then the display does an analog to 
digital conversion.  At that point, two conversions have been 
performed on the signal--neither of which was required.

That being said, analog-to-digital and digital-to-analog conversions 
have been around for a long time, so the converters available today are 
typically very high quality (even the cheap ones).  Therefore, it's 
quite possible that the image produced using a digital DVI connection 
will be indistinguishable from that produced using an analog connection 
(especially for relatively short cable runs).  In fact, for this 
reason, some LCD/TV manufacturers--who need the ADC to support VGA 
connections anyway--decided to save money by using DVI-A (analog) or 
the analog side of a DVI-I connection, giving them a single internal 
path for the signal whether it arrives via the VGA or the DVI input.  
The DVI-A signal is basically the same as the VGA signal--you can 
actually convert from one to the other using nothing but a passive 
cable (for $20 or so)--whereas converting from DVI-D to VGA requires a 
digital to analog converter (DAC) like this one for $282 
( http://www.networktechinc.com/dvi-vga.html ).

Therefore, if your TV is using the analog side of the DVI input, there's 
absolutely no benefit to using DVI instead of VGA.  However, if your TV 
is using the digital side of the DVI input, there may be a benefit to 
using DVI (although that benefit may simply be the placebo effect... ;).

When you boil it all down, the best answer is, "If it looks good to 
you, it's good."

Oh, and one last bit of info.  Since the DVI-D connection is digital, 
any relatively new TV with a DVI-D connection almost definitely has 
HDCP (High-bandwidth Digital Content Protection) built into that 
connection.  
(But, since VGA connections are analog, they are not (yet) required to 
be protected with HDCP or other DRM.)  While it probably doesn't matter 
now, I'll leave it to you to decide whether that's good or bad for you 
in the future... ;)

>>Thanks!!
>>    
>>
>Sorry, don't know how the original got lost ;)
>  
>
Since I'm the one who gave you a hard time about the lost question, I 
figured I should at least take a stab at answering...  :)

>Thanks for the replies!
>
Hope it helps.

Mike

