[mythtv-users] 720p harder to decode than 1080i?

Steven Adeff adeffs.mythtv at gmail.com
Mon Nov 20 23:32:43 UTC 2006


On 11/20/06, Dan Wilga <mythtv-users2 at dwilga-linux1.amherst.edu> wrote:
> At 8:55 AM -0500 11/20/06, Mark Chang wrote:
> >I am assuming 720p is harder to decode than 1080i? I ask because it
> >turns out that my recently-built HDTV Myth box doesn't have enough
> >horsepower to play back some OTA recorded streams.
>
> Because of the interlacing, the data stream for 1080i consists of
> only 540 lines/frame, whereas 720p is truly 720 lines/frame. So 1080i
> actually contains less detail per frame than 720p. That would explain
> the difference in file size and processing requirements.

you'd think, with the number of times this comes up on this list and
gets debunked as bogus, people would stop saying it.

1080i IS NOT 540P!!!
1080i is 1920x1080: each frame is split into two 540-line fields, but
together the fields carry all 1080 lines, so it does not contain less
detail than 720p.
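For what it's worth, a back-of-the-envelope comparison makes the point.
This little sketch assumes the usual ATSC OTA rates (720p at 60 full
frames/s, 1080i at 30 full frames/s, i.e. 60 fields/s); actual decode
cost also depends on bitrate, GOP structure, and your deinterlacer:

```python
# Rough raw pixel throughput for typical ATSC OTA broadcast formats.
# Assumed rates: 720p60 (progressive) vs 1080i30 (60 fields/s).
pixels_720p60 = 1280 * 720 * 60    # 60 full progressive frames per second
pixels_1080i30 = 1920 * 1080 * 30  # 30 full frames, delivered as 60 fields

print(pixels_720p60)   # 55296000 pixels/s
print(pixels_1080i30)  # 62208000 pixels/s
```

So per second of video, 1080i actually moves more raw pixels than
720p60; any difference in playback load comes from things like
deinterlacing overhead or stream bitrate, not from 1080i somehow
"really" being 540p.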

-- 
Steve
Before you ask, read the FAQ!
http://www.mythtv.org/wiki/index.php/Frequently_Asked_Questions
then search the Wiki, and this list,
http://www.gossamer-threads.com/lists/mythtv/
Mailinglist etiquette -
http://www.mythtv.org/wiki/index.php/Mailing_List_etiquette
