[mythtv-users] OT: 4K TV. But why (yet)?

Michael T. Dean mtdean at thirdcontact.com
Fri Dec 4 13:13:02 UTC 2015


On 12/03/2015 02:23 PM, Eric Sharkey wrote:
> On Thu, Dec 3, 2015 at 8:40 AM, Michael T. Dean wrote:
>> FWIW, there's more information in a 1920x1080 pixel image than can be
>> displayed with a 1920x1080 pixel display--it actually takes an output
>> display of almost 2x the sample width and 2x the sample height to render all
>> the information provided by an image.
> Sorry, no.  The information isn't in the image.  When you upscale an
> image to a higher resolution, you can employ all the fancy edge
> detection and other enhancements you want, but you're extrapolating
> and guessing when you do that.

Who said anything about upscaling an image to a higher resolution or 
about edge detection or enhancements?  I was talking about image 
processing.  I simply said that a 1920x1080 display cannot represent all 
the information contained in a 1920x1080 image.  What you're forgetting 
is that the 1920x1080 image is nothing more than a bunch of samples of a 
real-world scene.  Those samples approximate the scene, and can be used 
to reconstruct the scene.  Displaying them in a 1:1 mapping 
"reconstructs" the image, not the scene.  In short, the real world isn't 
digital.
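
To make that concrete, here's a rough 1-D sketch of the idea (Python 
with numpy; the signal and numbers are just made up for illustration). 
A band-limited signal sampled above its Nyquist rate is completely 
determined by its samples, but showing the sample values 1:1 misses 
detail--like the true peak--that falls between sample points; 
evaluating the reconstruction on a finer output grid recovers it.

import numpy as np

fs = 10.0                          # sample rate (samples per unit time)
t = np.arange(0, 2, 1 / fs)        # the 20 sample instants
f0 = 4.3                           # signal frequency, below Nyquist (5.0)
x = np.cos(2 * np.pi * f0 * t + 0.3)   # the stored samples

def sinc_reconstruct(t_out, t_samp, x_samp, fs):
    # Whittaker-Shannon interpolation: the signal the samples imply
    return np.array([np.sum(x_samp * np.sinc(fs * (tf - t_samp)))
                     for tf in t_out])

t_fine = np.arange(0, 2, 1 / (8 * fs))   # a much denser output grid
x_fine = sinc_reconstruct(t_fine, t, x, fs)

print("largest stored sample :", x.max())       # ~0.95; peaks fall between samples
print("largest reconstructed :", x_fine.max())  # much closer to the true 1.0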

An ideal digital image is a 2D rectilinear array of samples, where each 
sample is an infinitely-small point sample of a scene at a given 
infinitely-small moment of time.  Images from real cameras, however, 
are captured with finite-sized sensor elements of limited intensity, 
spatial, and temporal
resolution (and, yes, temporal resolution comes into play when taking 
still images--not just video).  Image processing can be used to 
reconstruct the original scene described by that physically-limited 
camera's samples.
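
As a rough illustration of the finite-sensor point (again Python/numpy, 
with a toy 16x16 "camera" standing in for a real sensor): each stored 
pixel value is an average of the scene over the pixel's footprint (and 
over the exposure time), not an ideal point sample, so the detail is 
band-limited before any display ever sees it.

import numpy as np

rng = np.random.default_rng(0)
oversample = 8                    # "scene points" per camera pixel, per axis
h, w = 16, 16                     # a toy sensor, not 1920x1080
scene = rng.random((h * oversample, w * oversample))   # stand-in for the scene

# Each finite-sized sensor element integrates the scene over its
# footprint; modeled here as a plain average of the 8x8 patch of
# scene points behind each pixel.
samples = scene.reshape(h, oversample, w, oversample).mean(axis=(1, 3))

print(scene.shape, "->", samples.shape)    # (128, 128) -> (16, 16)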

>    What you're looking at is basically
> equivalent to an artist's rendering.

Funny, but you have that backwards.  An artist's rendering at 1920x1080 
can be displayed at 1920x1080 and there's no more information to be 
found (however, note that this is not necessarily true of a 
computer-rendered image of a computer-modeled scene).

>    It may look good, but it's not
> information from the original image.  1080p upscaled to 4K is not real
> 4K.

For the most part, I agree.  The part you cut out from my original post 
said that no hardware/software can actually extract all the information 
about the original scene for display--though they do get some--and they 
use cheats and bad algorithms in their upscaling, which may actually make 
things look worse.

But, hey, feel free to believe what the marketers tell you, pay more for 
those 4k sources (and re-purchase all your Blu-Rays in 4k), and enjoy 
your not-an-artist's-rendering world if you like.  I will admit that 
doing so will spare you having to put up with the bad scaling performed 
by some TV image processors.

Anyway, I'm going to go watch a DVD on my CRT--since you've told me it 
will look best at 720x480 resolution, and I don't have any 720x480 LCDs, 
but my old CRT will actually adjust its scan to output 720x540, so 
I can have full screen width--rather than watching in a small window on 
my 1920x1080 LCD.
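
(For anyone wondering about the 720x540 number: DVD video is stored as 
720x480 non-square-pixel samples meant to fill a 4:3 screen, so the 
square-pixel equivalent at full width is roughly 720 / (4/3) = 540 
lines--ignoring the small active-picture-width details.)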

Mike

