[mythtv-users] Re: Re: Best video capture resolution for output to TV?
Bruce Markey
bjm at lvcm.com
Fri May 16 17:03:48 EDT 2003
Ray Olszewski wrote:
...
> But his reply got me wondering ... to what extent is the benefit of
> increased horizontal dot density limited by encoding parameters? For
> example, if the encoding quality is set at, say, 10 Mb/minute, I'd
> expect that increasing horizontal dot density will at some point hit the
> limit of encoding quality (or maybe CPU speed, but let's put this part
> aside for now). In such a case (if the CPU permits it), would increasing
> encoding quality to, say, 20 Mb/minute allow increased horizontal dot
> density to show improved quality? At what point does the resolution of
> the NTSC signal *itself* impose a limit?
Good point (if I understood you correctly =). The bit rate
also has an impact on how much detail is preserved during
compression. In testing I found that, given a medium resolution
and a medium bit rate, raising the bit rate improved the picture
quality more than raising the resolution did for the same
target file size.
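To see why, it helps to look at the average bit budget per pixel: at a fixed file size, every extra pixel of capture resolution dilutes the bits available to encode each one. A minimal sketch, assuming a 10 MB/minute target, NTSC's 29.97 fps, and a few common capture widths (none of these numbers are from the original post, they're just illustrative):

```python
# Rough bits-per-pixel budget at a fixed file-size target.
# Assumptions (illustrative, not from the post): 10 MB/minute,
# NTSC 29.97 frames/sec, common MythTV capture resolutions.

def bits_per_pixel(mb_per_minute, width, height, fps=29.97):
    """Average encoded bits available per pixel per frame."""
    bits_per_sec = mb_per_minute * 8 * 1024 * 1024 / 60.0
    return bits_per_sec / (width * height * fps)

for w, h in [(352, 480), (480, 480), (640, 480), (720, 480)]:
    print(f"{w}x{h}: {bits_per_pixel(10, w, h):.3f} bits/pixel")
```

At the same file size, 720x480 gets roughly half the bits per pixel that 352x480 does, so past some point the extra pixels are starved and the encoder's quantization noise eats the detail the higher resolution was supposed to capture.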
> At what point does the resolution of the NTSC signal
> *itself* impose a limit?
You lost me on this question. However, you may want to
walk up to your TV and look at it from a few inches away.
You'll see that the picture isn't nearly as good as our
eyes are fooled into believing when we watch from a distance.
There are limits to the acuity of the eye, so broadcast
equipment and TVs are designed to a tolerance where flaws
aren't apparent from a viewing distance of about seven
times the screen height away.
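That rule of thumb lines up with the usual ~1 arcminute figure for human visual acuity. A quick sanity check, assuming roughly 480 visible NTSC scan lines (the exact line count and acuity figure are my assumptions, not from the post):

```python
import math

# Back-of-the-envelope check of the "seven picture heights" rule.
# Assumptions: ~480 visible NTSC scan lines, eye acuity ~1 arcminute.

def scanline_arcmin(visible_lines=480, distance_in_heights=7):
    """Angle subtended by one scan line, in arcminutes."""
    radians = math.atan(1.0 / (visible_lines * distance_in_heights))
    return math.degrees(radians) * 60

print(f"{scanline_arcmin():.2f} arcmin per scan line")
```

At seven picture heights each scan line subtends right around one arcminute, i.e. at the threshold of what the eye can resolve, which is exactly why the line structure (and most encoding flaws) vanish at a normal viewing distance but jump out from a few inches away.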
-- bjm