[mythtv-users] 720p harder to decode than 1080i?

Marco Nelissen marcone at xs4all.nl
Mon Nov 20 16:33:23 UTC 2006


>On 11/20/06, Mark Chang <mark.chang at gmail.com> wrote:
>> I am assuming 720p is harder to decode than 1080i? I ask because it
>> turns out that my recently-built HDTV Myth box doesn't have enough
>> horsepower to play back some OTA recorded streams.
>>
>> In particular, CBS seems to broadcast in a higher resolution than
>> other channels. The 1 hour recording of "NUMB3RS" takes 7+GB, while
>> the same length of "Desperate Housewives" takes only like 5GB. Am I
>> assuming right that different channels broadcast at different rates?
>>
>> Anyways, I tried introducing XvMC (Ubuntu Edgy + nvidia), but it
>> really only made things worse. Anyone get XvMC working in this
>> configuration (above + AMD64)? How do I verify that XvMC is "on"?
>>
>> Thanks.
>
>while they do broadcast at different rates, what you're seeing is the
>file size difference between 1080i and 720p. CBS is 1080i, ABC is
>720p.
>
>1080i is "harder" to decode than 720p as it requires more processing
>for its larger frame size, 1920x1080 vs 1280x720.

It's been my experience that CPU usage for decoding depends mostly on
the compressed bitrate, not the nominal resolution. That seems logical
when you think about it, since the bitrate directly determines how much
data the processor has to decode every second.
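
As a rough sanity check on those numbers, here's a quick
back-of-the-envelope calculation (a small Python sketch; the file sizes
and the one-hour duration are the ones quoted above, and the helper
name is just for illustration):

def avg_bitrate_mbps(size_gb, seconds):
    """Average stream bitrate in megabits per second."""
    return size_gb * 8 * 1000 / seconds   # 1 GB is roughly 8000 megabits

hour = 3600.0
print("NUMB3RS (CBS, 1080i):             %.1f Mbit/s" % avg_bitrate_mbps(7.0, hour))
print("Desperate Housewives (ABC, 720p): %.1f Mbit/s" % avg_bitrate_mbps(5.0, hour))

That works out to roughly 15-16 Mbit/s for the CBS recording versus
about 11 Mbit/s for the ABC one (both under the ~19.4 Mbit/s ATSC
limit), which is a bigger difference than the resolutions alone would
suggest.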

