[mythtv-users] OT: Why 1080p?

Rob Baumstark rbaumstark at gmail.com
Fri Nov 10 02:09:11 UTC 2006


On 11/9/06, Chris Ribe <chrisribe at gmail.com> wrote:
> > > 1080p displays are the only ones that can display both 720p and 1080i
> > > at their native resolutions.   Granted, I don't know of anyone who
> > > uses their 1080p set to watch 720p content w/ black bars on 4 sides,
> > > but it can be done if you don't like upscaling.
> >
> > And let's explicitly un-say it, since that's a load of crap.  1080p
> > displays are the ones that have a display with a native resolution of
> > 1920x1080 pixels.
> >
>
> And it is perfectly possible to display 720p content at its native
> resolution on such a display.  Hook your myth box up to your 1080p
> set, navigate to Utilities/Setup -> Setup -> Appearance and set the
> GUI width to 1280 and the GUI height to 720.
>
> Now, play back your favorite 720p content and enjoy it at its native resolution.

I didn't say a 1080p display can't show a 720p signal at its native
resolution - obviously it can, with black bars all around.  What I did
say was that 720p has nothing to do with the definition of a 1080p
display, which is the claim in your first e-mail that I replied to.
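
For what it's worth, the geometry of that "black bars on 4 sides" case
is easy to sanity-check.  Here's a throwaway Python sketch (my own
numbers and names, nothing MythTV-specific) of where a 1280x720 window
lands when mapped 1:1 onto a 1920x1080 panel:

    # Minimal sketch (not MythTV code): center a 1280x720 video 1:1 on a 1920x1080 panel.
    panel_w, panel_h = 1920, 1080
    video_w, video_h = 1280, 720

    x_off = (panel_w - video_w) // 2   # 320-pixel bars left and right
    y_off = (panel_h - video_h) // 2   # 180-pixel bars top and bottom

    print(f"video origin: ({x_off}, {y_off}); "
          f"bars: {x_off}px left/right, {y_off}px top/bottom")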

> > 480p  = 720x480   =  345600 pixels per frame, @60Hz =  20736000 pixels/sec = ~20 Mpixels/sec
> > 720p  = 1280x720  =  921600 pixels per frame, @60Hz =  55296000 pixels/sec = ~55 Mpixels/sec
> > 1080i = 1920x540  = 1036800 pixels per field, @60Hz =  62208000 pixels/sec = ~62 Mpixels/sec
> > 1080p = 1920x1080 = 2073600 pixels per frame, @60Hz = 124416000 pixels/sec = ~124 Mpixels/sec
>
> Bitrate is not a definitive measure of video quality.  Information
> that cannot be perceived by the human eye/brain doesn't add to
> quality.
>
> Consider a single pixel refreshed at a rate of 4 GHz, for a data rate
> of 4000 Mpixels/sec.  Clearly, this resolution is superior to even
> 1080p60.  Yet, I find staring down the end of a fiber optic cable less
> immersive than even a YouTube video.

Again - I didn't say this was the best way to measure video quality.
It is, however, a good chart for all the people who think that 1080i
= 540p and are using that assumption to back up their "720p is better
than 1080i" claims.  Those are also not bitrate measurements but
resolution measurements.  Just as digital cameras are compared by
their megapixel count (not a good comparison, but easy and
convenient), TVs can be compared by their megapixels per second, like
a digital camera over time.  It's not a great comparison either, for
exactly the reason you listed, but it is easy and convenient, and far
more accurate than your single-pixel example implies - anyone can
pull extreme examples out of their ass; it only proves they have an
imagination.
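
If anyone wants to check the arithmetic, here's a quick Python sketch
(purely illustrative, my own naming) that reproduces the
pixels-per-second figures above.  Note that 1080i is counted as 60
fields of 540 lines per second, which is exactly why it is not the same
thing as 540p:

    # Reproduce the pixel-throughput table above (illustrative only).
    # Each entry: (active width, lines per frame or field, frames or fields per second)
    formats = {
        "480p60":  (720,  480,  60),
        "720p60":  (1280, 720,  60),
        "1080i60": (1920, 540,  60),   # 60 fields/sec, 540 lines per field
        "1080p60": (1920, 1080, 60),
    }

    for name, (width, lines, rate) in formats.items():
        pps = width * lines * rate
        print(f"{name:8s} {width}x{lines} x {rate}/s = {pps:>11,} pixels/sec "
              f"(~{pps / 1e6:.1f} Mpixels/sec)")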

