[mythtv-users] Resolution

Jim Paris jim at jtan.com
Wed Dec 3 10:59:48 EST 2003


> Untrue. Bitrate is rather meaningless without resolution. The bitrate 
> represents the amount of data available to encode the pixel information. For 
> any given resolution and bitrate you can calculate the number of bits per 
> pixel.

Similarly, you can calculate "bits per square mm on your TV", and it's
the same whether you're doing 1/2 D1 or 720x480 captures.  The
"resolution" that matters is what's displayed in the end, not what's
encoded in the file.  You're still providing 2500kbps of information
over a fixed area, and that's not going to change with capture size.
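
To put rough numbers on it -- the 2500kbps is your figure; the 29.97fps
NTSC frame rate and ignoring audio/container overhead are my
simplifications -- here's a quick Python sketch:

    bitrate = 2500000                # bits per second of video
    fps = 29.97
    bits_per_frame = bitrate / fps   # ~83,400 bits hit the screen per frame

    for width, height in [(352, 480), (720, 480)]:
        bpp = bits_per_frame / (width * height)
        print("%dx%d: %.2f bits per encoded pixel" % (width, height, bpp))

    # The bits-per-encoded-pixel number changes, but the same ~83 kbits
    # land on the same physical screen area every frame, regardless of
    # the capture size.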

> The image is roughly equivalent in quality to D1 at twice the
> bitrate. Since any number of pixels greater than 1/2 D1 will be
> effectively "wasted" on a TV display

Let's assume for a moment the TV is pixel-based at 1/2 D1, and you're
displaying D1.  If you just discarded every other horizontal pixel,
then yes, your end result is fewer bits per displayed pixel.  But
presumably you're doing a smooth rescale that combines each pair of
pixels, giving you the exact same number of bits per displayed pixel.
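
Here's a toy illustration of that difference (numpy, and just halving
the width rather than doing a real 720->352 rescale; the array names
are only for the example):

    import numpy as np

    frame = np.random.randint(0, 256, size=(480, 720), dtype=np.uint8)

    # Discarding: only half the captured pixels ever reach the display.
    dropped = frame[:, ::2]

    # Smooth rescale: every captured pixel contributes to some output
    # pixel, so none of the bits spent encoding them are thrown away.
    averaged = (frame[:, ::2].astype(np.uint16) + frame[:, 1::2]) // 2
    averaged = averaged.astype(np.uint8)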

> Put another way, my TV can't show me 720x480; it simply doesn't have that
> many lines of resolution.

Mine does. :) And so I can actually see the blurring caused by
capturing at lower resolutions.  It's the exact same appearance as if
I had captured at a higher resolution and then just applied a
smoothing filter.  In your case, your TV has a lower bandwidth and
effectively does that smoothing in all cases.

> There is no "extra scaling" introduced. When I play my recorded video on my 
> TV, it's at the stream's native resolution.

Depends on how you're doing your video output.  I know a lot of people
are using the TV-out on their graphics card, so it needs to go out at,
say, 640x480, and will look something like this:

Capture at 352x480 -> MPEG -> PC scales to 640x480 -> TV "scales" to 352x480

Where it could just as well be

Capture at 640x480 -> MPEG -> TV "scales" to 352x480

(where "tv scales" = "applies a lowpass, reducing effective resolution")

> The signal quality of most analog cable TV sources in the US is
> about VHS quality or slightly better, commonly about 280 lines. So,
> 352x480 is actually overkill for my cable source, which is about
> 352x280.

I get digital cable here, so it's a little better for me.  I do agree
that capturing at too high a resolution for your signal is likely to
pick up noise and therefore reduce the effective data rate that the
MPEG devotes to the correct signal, although something like quickdnr
would fix that.

BTW, "lines" typically refers to horizontal resolution, so I think you
mean effectively "280x480"; your cable source is certainly still
pushing through valid NTSC. :)

> It's certainly a waste to use more than half my bitrate to encode at
> 720x480!

I'm still unconvinced -- you're not wasting these pixels unless you're
throwing them away.  If you pushed them all to the TV, they would
still all contribute to the image (even though your TV will apply a
lowpass filter).

> Gee. That was long-winded. Sorry. I hope it's clear enough.

Your argument is clear but I respectfully disagree. :)

I do agree that there are _some_ differences between capture
resolutions.  For example, when talking "bits per pixel" with regard
to (uncompressed) information content, there's going to be a
difference depending on whether the information is devoted to
resolution or to color.  Say I have two images, one 720x480 at 4bpp
and one 360x240 at 16bpp.  If either is displayed on a 360x240
display, you'll see the same thing.  But on a 720x480 display, the
720x480x4bpp image will look sharper with less color detail, while
the 360x240x16bpp image will look smoothed and have more color
detail.  Of course, I could always smooth the 720x480x4bpp image and
get the same result.
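
Spelling out the arithmetic in that example, the two images carry
exactly the same number of raw bits:

    hi_res_lo_color = 720 * 480 * 4    # 1,382,400 bits
    lo_res_hi_color = 360 * 240 * 16   # 1,382,400 bits
    assert hi_res_lo_color == lo_res_hi_color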

-jim

