[mythtv-users] Question about encoding/decoding

Ray Olszewski ray at comarre.com
Thu Nov 27 13:28:57 EST 2003


At 10:01 AM 11/27/2003 -0800, Douglas Phillipson wrote:
>When switching the encoding to MPEG4 my CPU goes to 90% during 
>capture.  But when playing the CPU is 2%.  I have an AMD 2200+ CPU.  Is 
>there really that much difference in CPU effort between encode and 
>decode?  Or am I doing something wrong?  Is the encoder perhaps not taking 
>advantage of the hardware encoding features of my AMD?

Depending on the specifics of your circumstances, there *can* be that much 
difference between encoding and decoding. A relatively noisy source signal, 
for example, will consume more CPU than a relatively clean one, because the 
codec sees the noise as information to be captured.  Similarly, a 
letterboxed movie, where much of the screen is an unvarying black, will, 
all else equal, encode with less CPU use than its fullscreen cousin.
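To make the point concrete, here is a rough sketch -- not MythTV's MPEG-4
encoder, just Python's stock zlib compressor, so treat it strictly as an
analogy -- showing that a frame full of noise both compresses worse and takes
longer to compress than a uniform one:

    import os
    import time
    import zlib

    # Two synthetic "frames" of the same size: one mostly uniform (think of
    # the black bars in a letterboxed movie), one full of random noise.
    clean_frame = bytes(640 * 480)          # all zero bytes
    noisy_frame = os.urandom(640 * 480)     # pseudo-random bytes

    for label, frame in (("clean", clean_frame), ("noisy", noisy_frame)):
        start = time.perf_counter()
        packed = zlib.compress(frame, 6)
        elapsed = (time.perf_counter() - start) * 1000
        print(f"{label}: {len(frame)} -> {len(packed)} bytes, {elapsed:.1f} ms")

You should see the uniform frame shrink to a handful of bytes almost
instantly, while the noisy one barely compresses at all and takes longer. A
lossy video codec is not zlib, but it pays a similar kind of price for noise.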

But no matter the details of the source, encoding is always much more 
demanding than the corresponding decoding. For example ... I'm not actually 
running Myth here at the moment, but when I did, I often saw encoding (on a 
2.7 GHz Celeron) up around 90%, while playback was well below 5%. Your 
numbers are a bit more extreme than mine, but not outside the range of 
plausibility.
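
If you want to watch the two cases yourself rather than eyeballing top, here
is a small sketch that samples per-process CPU. It assumes the third-party
psutil package is installed (plain top or ps will tell you the same thing) and
that your capture and playback processes carry the usual names mythbackend and
mythfrontend:

    import time
    import psutil   # third-party: pip install psutil

    def sample_cpu(name_fragment, samples=5, interval=2.0):
        """Print CPU usage for processes whose name contains name_fragment."""
        procs = [p for p in psutil.process_iter(['name'])
                 if name_fragment in (p.info['name'] or '')]
        if not procs:
            print("no process matching", name_fragment)
            return
        for p in procs:
            p.cpu_percent(None)          # first call just primes the counter
        for _ in range(samples):
            time.sleep(interval)
            for p in procs:
                print(f"{p.info['name']} (pid {p.pid}): "
                      f"{p.cpu_percent(None):5.1f}%")

    # Run this once while the backend is capturing, and again during playback:
    sample_cpu("mythbackend")
    # sample_cpu("mythfrontend")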

I don't use AMD processors, so I cannot comment on that part of your question.




