[mythtv-users] OT: Why 1080p?

Ed Gatzke ed.gatzke.groups at gmail.com
Fri Nov 10 22:47:50 UTC 2006


> You would have scaling issues there too, because scaling a 720p signal
> to 2160p would have to create pixels "out of nowhere", which is only
> just asking for noise. Sadly, there is no perfect solution other than
> a display that can output at whatever framerate it is given and for
> all sources to be of the same pixel resolution. I vote 1080p for
> everything ;-)
>
>
You could always just double or triple the size of each pixel, so you still
get a pristine 720p or 1080p image, just with bigger pixels each made up of
four or nine smaller pixels.
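A quick sketch of that idea, pixel doubling/tripling with no interpolation (a toy example on a list-of-lists "image", not a real video pipeline):

```python
def integer_upscale(image, factor):
    """Nearest-neighbor integer upscale: each source pixel becomes a
    factor x factor block of identical pixels, so no pixel values are
    invented "out of nowhere" -- no interpolation, no added noise."""
    out = []
    for row in image:
        # Repeat every pixel `factor` times horizontally...
        scaled_row = [px for px in row for _ in range(factor)]
        # ...then repeat the whole row `factor` times vertically.
        out.extend([scaled_row[:] for _ in range(factor)])
    return out

# 720p (1280x720) -> 2160p (3840x2160) is an exact 3x integer scale.
tiny = [[1, 2],
        [3, 4]]
print(integer_upscale(tiny, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```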

Of course, that would be jaggy at some level, so you could use the extra
resolution to smooth out the jaggies.  I think that is what anti-aliasing
does on your GPU, except there I think they render the geometry at the high
resolution and downsample to the display resolution.  For a giant screen
running at quad 1080p, this would mean taking a lower-res signal and
upscaling it to a higher-res display.
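The downsampling half of that (supersampling-style anti-aliasing) can be sketched as a simple box filter: average each block of high-res pixels down to one output pixel. This is a toy illustration of the idea, not how a GPU actually implements MSAA/SSAA:

```python
def box_downsample(image, factor):
    """Box-filter downsample: average each factor x factor block of
    pixels into one output pixel.  This is the crude version of what
    supersampling anti-aliasing does after rendering at a higher
    resolution than the display."""
    height, width = len(image), len(image[0])
    out = []
    for y in range(0, height, factor):
        row = []
        for x in range(0, width, factor):
            block = [image[y + dy][x + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) / len(block))  # average the block
        out.append(row)
    return out

# A hard jaggy edge (0 vs 4) averages to a softer in-between value.
print(box_downsample([[0, 0], [4, 4]], 2))
# [[2.0]]
```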

I think the only place you will really see problems upscaling is with crazy
test images, like geometric shapes and line test patterns.  With normal
content you should not see much of a problem when scaling.  Maybe it blurs
some things, but I doubt you would notice it much in motion.

Nobody has even mentioned sub-pixel rendering yet...  Technically your
1920x1080p display is actually (3*1920)x1080p due to the RGB sub-pixels.
You could smooth out the picture there too, given enough GPU power.

I read somewhere that the resolution discernible to the human eye
would be around 4000x2000, which would be quad 1080p or so...  One day.