[mythtv-users] 720p harder to decode than 1080i?
Preston Crow
pc-mythtv06c at crowcastle.net
Mon Nov 20 17:02:22 UTC 2006
On Mon, 2006-11-20 at 11:41 -0500, Dan Wilga wrote:
> At 8:55 AM -0500 11/20/06, Mark Chang wrote:
> >I am assuming 720p is harder to decode than 1080i? I ask because it
> >turns out that my recently-built HDTV Myth box doesn't have enough
> >horsepower to play back some OTA recorded streams.
>
> Because of the interlacing, the data stream for 1080i consists of
> only 540 lines/frame, whereas 720p is truly 720 lines/frame. So 1080i
> actually contains less detail per frame than 720p. That would explain
> the difference in file size and processing requirements.
You have it backwards:
1080i is 1920x540 pixels per field (one field every 60th of a second)
720p is 1280x720 pixels per frame (one frame every 60th of a second)
So each 60th of a second, 1080i delivers 1036800 pixels while 720p delivers
921600. That means 1080i is actually more pixels per 60th of a second, not
fewer.
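If you want to sanity-check the arithmetic, a couple of lines of Python do
it (nothing here is MythTV-specific; the resolutions and the 60-per-second
field/frame rate are just the ATSC numbers above):

    # Pixels delivered in each 60th of a second.
    # 1080i sends one 1920x540 field in that interval;
    # 720p sends one full 1280x720 frame.
    pixels_1080i_field = 1920 * 540   # = 1036800
    pixels_720p_frame = 1280 * 720    # = 921600

    print("1080i pixels per 1/60 s: %d" % pixels_1080i_field)
    print("720p  pixels per 1/60 s: %d" % pixels_720p_frame)
    print("ratio 1080i/720p: %.3f"
          % (float(pixels_1080i_field) / pixels_720p_frame))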
Of course, that's all irrelevant, because what matters is the bit rate,
which depends on how lossy the station decides its compression should be.
It can also differ from show to show on the same channel.
So there are no hard and fast rules when you're dealing with
compression. There are typical bitrates that different networks are
known to use, and that's the best you can do.
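Since the bit rate is the number that actually matters, one rough way to see
what a given recording is doing is file size divided by running time. A
minimal sketch (the path and the duration below are made up; plug in a real
recording and its actual length):

    import os

    # Hypothetical recording; substitute your own file and duration.
    path = "/var/lib/mythtv/recordings/1234_20061120190000.mpg"
    duration_seconds = 30 * 60  # a half-hour show

    size_bits = os.path.getsize(path) * 8
    avg_mbps = size_bits / float(duration_seconds) / 1e6

    print("average bit rate: %.1f Mbit/s" % avg_mbps)

Run it against a few recordings from different channels and you'll see the
spread described above.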