[mythtv-users] 720p harder to decode than 1080i?

Brad DerManouelian myth at dermanouelian.com
Mon Nov 20 17:43:12 UTC 2006


On Nov 20, 2006, at 9:27 AM, Mark Lehrer wrote:

>
>>> Am I right in assuming 720p is harder to decode than 1080i? I ask
>>> because my recently-built HDTV Myth box turns out not to have enough
>>> horsepower to play back some OTA recorded streams.
>
>> Because of the interlacing, the data stream for 1080i consists of
>> only 540 lines per field, whereas 720p is a true 720 lines per frame.
>> So each 1080i field has fewer lines to decode than a 720p frame. That
>> would explain the difference in file size and processing requirements.
>
> Unless of course you are deinterlacing with Bob 2x, which I'd bet the
> majority of HDTV users should be doing, since there aren't many truly
> interlaced displays being sold anymore.
>
> The simple rule is: if you are doing hi-def, you really should go
> with a dual-core processor, because both X and MythTV stay quite busy.
> Maybe you can squeak by with a single core if you are watching 24fps
> content, but these days it just isn't worth it.
>
> Mark
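
To put some rough numbers on the 540-lines and Bob 2x points above, here
is a quick back-of-the-envelope sketch (plain Python, nothing from MythTV
itself; the 60 Hz field/frame rates are the usual ATSC assumptions):

# Back-of-the-envelope pixel rates for typical ATSC formats.
# Assumes 720p at 60 frames/s and 1080i at 60 fields/s (30 full frames/s).

def mpixels_per_second(width, height, rate_hz):
    """Millions of pixels per second for a given raster and refresh rate."""
    return width * height * rate_hz / 1e6

# 720p: every frame is a full 1280x720 progressive picture.
print("720p60 playback:      %.1f Mpixel/s" % mpixels_per_second(1280, 720, 60))

# 1080i without deinterlacing: 60 fields/s, each field only 540 lines tall.
print("1080i decoded fields: %.1f Mpixel/s" % mpixels_per_second(1920, 540, 60))

# Bob 2x builds a full-height output frame from every field, so the
# frontend ends up scaling and pushing 1920x1080 at 60 Hz.
print("1080i with Bob 2x:    %.1f Mpixel/s" % mpixels_per_second(1920, 1080, 60))

The last line is the point: once Bob 2x is in play, the frontend is
handling full 1920x1080 frames at 60 Hz, which is far more raw pixel
traffic than 720p60, so the "interlaced means less work" intuition stops
holding.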

That's good advice. I went from being mostly happy with an AMD 3200+ to
being completely thrilled with an AMD X2 4400+. I don't use XvMC anymore
because of the gray OSD, the bouncing image, and so on. I use the
Standard decoder with Linear deinterlacing. Watching HD content brings my
X and mythfrontend processes to about 100-105% CPU, but the picture is
unbeatable.
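
For what it's worth, the Linear deinterlacer is cheap on the CPU because
it is essentially just vertical averaging. Here is a toy sketch of the
general idea (my own illustration in Python, not MythTV's actual filter
code, and the row/column layout is assumed):

def linear_deinterlace(frame, keep_even=True):
    # frame is a list of rows, each row a list of pixel intensities.
    # Keep the rows belonging to one field and rebuild the other field's
    # rows by averaging their vertical neighbours.
    out = [row[:] for row in frame]
    start = 1 if keep_even else 0              # rows to reconstruct
    for y in range(start, len(frame), 2):
        above = frame[y - 1] if y > 0 else frame[y + 1]
        below = frame[y + 1] if y + 1 < len(frame) else frame[y - 1]
        out[y] = [(a + b) // 2 for a, b in zip(above, below)]
    return out

One cheap pass over each frame is why it barely registers next to Bob 2x,
which has to produce twice as many output frames per second.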
