[mythtv-users] Mythfrontend idle cpu consumption help

Paul Gardiner lists at glidos.net
Fri Nov 13 09:05:04 UTC 2009


Brian J. Murrell wrote:
> On Thu, 2009-11-12 at 10:02 +0000, Paul Gardiner wrote: 
>> Yeah, I use Standard (I think it is), which is ffmpeg for decode, and
>> XV for blit. Then "Interlaced (2x)" is available.
> 
> Right.  So I experimented a bit.
> 
> I tried "Interlaced (2x)" which as you say, requires ffmpeg for decode.
> That resulted in about 60% CPU for SD playback.
> 
> I then tried "None" interlacing with XvMC decoding and CPU usage was
> half the above, at 30% for the exact same playback.
> 
> Without actually being able to try the combination of XvMC and
> "Interlaced (2x)" (as they must be incompatible I guess) it's hard to
> know which is having the dramatic CPU impact difference, but I tend to
> think XvMC vs. ffmpeg is most responsible.

That'll be the difference between the decode being done by the graphics
card with XvMC and being done on the CPU with ffmpeg. And I can quite
believe that on slower CPUs you might find XvMC more responsive.
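
If you want to put numbers on that rather than eyeballing top, something
like the following averages mythfrontend's CPU share over a window while
you play the same recording under each profile. Just a sketch, assuming
the psutil Python package is installed; feed it the PID from
"pidof mythfrontend" and run it once per profile:

#!/usr/bin/env python
"""Average a process's CPU usage over a sample window, so two playback
profiles can be compared on the same recording."""
import sys
import psutil  # assumption: the psutil package is installed

def average_cpu(pid, samples=30, interval=1.0):
    proc = psutil.Process(pid)
    # cpu_percent() blocks for 'interval' seconds and returns the
    # percentage of one core used during that window
    readings = [proc.cpu_percent(interval=interval) for _ in range(samples)]
    return sum(readings) / len(readings)

if __name__ == '__main__':
    pid = int(sys.argv[1])  # e.g. the output of "pidof mythfrontend"
    print("average CPU over the sample window: %.1f%%" % average_cpu(pid))

Run it once with the Standard/ffmpeg profile and once with XvMC on the
same recording, and the two averages should mirror the 60% vs 30%
difference you saw.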

> So I will leave things at XvMC and None for a while and see if I notice
> any artifacts.

Good plan. If you can get away with XvMC and None then that's definitely
the best thing to do; it's just that it's difficult to explain how you're
getting away with it if you're watching interlaced content (as I said
before, artifacts are usually particularly visible during rolling credits).

For me the jump to a really high-quality picture only happened when I got
all the settings right at once. Until then it still looked pretty good,
and it was only because I also had older DVRs to compare against that I
knew I wasn't getting perfection.

Besides scaling, one other thing that could hide combing (hence letting
you get away with deinterlacer None), but would also soften the picture,
is having TvDeflicker set to something other than 0 in the TV-out settings.
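
If you want to check what TvDeflicker is currently set to without digging
through the frontend menus, the per-host settings end up in the mythconverg
database. A sketch only: it assumes the stock settings table layout, the
default mythtv/mythtv credentials, and that the row is named 'TVDeflicker';
adjust all of that for your install:

#!/usr/bin/env python
import socket
import MySQLdb  # assumption: the python-mysqldb package is installed

# Assumed defaults: database on localhost with the usual mythtv/mythtv login
db = MySQLdb.connect(host='localhost', user='mythtv',
                     passwd='mythtv', db='mythconverg')
cur = db.cursor()
# Per-host frontend settings live in the settings table, keyed by
# setting name and hostname (row name below is an assumption)
cur.execute("SELECT data FROM settings WHERE value = %s AND hostname = %s",
            ('TVDeflicker', socket.gethostname()))
row = cur.fetchone()
print('TVDeflicker is %s' % (row[0] if row else 'not set for this host'))
db.close()

Changing the value through the frontend setup screens is still the safer
route; this is just for seeing what's already there.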

Paul.


