[mythtv-users] 720p harder to decode than 1080i?

Mark Lehrer mark at knm.org
Mon Nov 20 17:27:55 UTC 2006


>> I am assuming 720p is harder to decode than 1080i? I ask because it
>> turns out that my recently-built HDTV Myth box doesn't have enough
>> horsepower to play back some OTA recorded streams.

> Because of the interlacing, the data stream for 1080i carries only
> 540 lines per field, whereas 720p is truly 720 lines per frame. So 1080i
> actually contains less vertical detail per picture than 720p. That would
> explain the difference in file size and processing requirements.

Unless, of course, you are deinterlacing with Bob 2x, which I'd bet a
majority of HDTV users should be doing; there aren't many truly
interlaced displays being sold anymore. Bob 2x turns the 60 fields/s
into 60 full-height frames/s, so the post-deinterlace workload ends up
well above 720p (rough numbers below).
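To put back-of-the-envelope numbers on it (a sketch in Python; the
60 Hz frame/field rates are the usual ATSC values, assumed here):

    # Rough pixel throughput: what the decode/render path must push per second.
    def pixels_per_second(width, height, rate):
        return width * height * rate

    p720  = pixels_per_second(1280, 720, 60)    # 720p: 60 full frames/s
    i1080 = pixels_per_second(1920, 540, 60)    # 1080i: 60 half-height fields/s
    bob2x = pixels_per_second(1920, 1080, 60)   # 1080i bobbed to 60 full frames/s

    print(f"720p60:          {p720 / 1e6:6.1f} Mpix/s")   # ~55.3
    print(f"1080i60:         {i1080 / 1e6:6.1f} Mpix/s")  # ~62.2
    print(f"1080i + Bob 2x:  {bob2x / 1e6:6.1f} Mpix/s")  # ~124.4

So after Bob 2x the display-side work is roughly double either raw
stream, which is exactly where a single core tends to fall behind.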

The simple rule is: if you are doing hi-def, you really should go
with a dual-core processor, because both X and MythTV are quite busy
(a quick way to check where the cycles go is sketched below). Maybe
you can squeak by on a single core if you are only watching 24 fps
content, but these days it just isn't worth it.
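If you want to see whether it's X or the frontend eating the CPU
during playback, something like this sketch works (it assumes the
third-party psutil package; the process names 'Xorg' and
'mythfrontend' are typical but may differ on your system):

    import time
    import psutil  # third-party package; assumed available

    # Watch the two processes that do the heavy lifting during playback.
    NAMES = ('Xorg', 'mythfrontend')  # adjust to match your system
    procs = [p for p in psutil.process_iter(['name'])
             if p.info['name'] in NAMES]

    for p in procs:
        p.cpu_percent(None)   # first call primes the counter, returns 0
    time.sleep(1.0)           # sample over one second of playback

    for p in procs:
        print(f"{p.info['name']:14s} {p.cpu_percent(None):5.1f}% CPU")

Values near 100% mean a full core is saturated (psutil can report
over 100% for multithreaded processes), which is the sign you've run
out of single-core headroom.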

Mark
