[mythtv-users] Mythfrontend idle cpu consumption help

Johnny jarpublic at gmail.com
Wed Nov 11 16:43:15 UTC 2009


> And deinterlacing your input content.  Presumably the original content
> is interlaced if you are deinterlacing and considering it is probably
> being captured from a source that is destined for an interlaced output
> device (i.e. an NTSC broadcast).

Again, this is off topic; maybe it deserves its own thread. VDPAU Advanced 2x
gives the best-looking picture I have ever gotten on my TV, by far. It
looks better than the cable feed viewed directly. That doesn't quite make
sense to me, but I am very happy with it.

Now back to the CPU usage. What is the right approach to understanding
this, or to giving the devs something informative and actionable? I
have tried strace, as was suggested, but that just shows system calls,
and I don't know how helpful that is. I get this over and over again:

clock_gettime(CLOCK_MONOTONIC, {3769, 150476635}) = 0
read(8, 0x8b3bd00, 4096)                = -1 EAGAIN (Resource temporarily unavailable)
gettimeofday({1257956725, 614018}, NULL) = 0
clock_gettime(CLOCK_MONOTONIC, {3769, 150674908}) = 0
read(8, 0x8b3bd00, 4096)                = -1 EAGAIN (Resource temporarily unavailable)
read(17, 0x8b72f60, 4096)               = -1 EAGAIN (Resource temporarily unavailable)
gettimeofday({1257956725, 614218}, NULL) = 0
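
If I read that right, it looks like an event loop repeatedly checking the
clock and polling non-blocking file descriptors that have nothing queued.
Here is a tiny C sketch of that pattern (just my guess at what the loop
looks like, not actual MythTV code; the pipe and the usleep() are
placeholders) that would produce the same signature under strace:

    /* Polling loop that strace would show as repeated clock_gettime(),
     * gettimeofday(), and read() calls failing with EAGAIN. */
    #include <errno.h>
    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/time.h>
    #include <time.h>
    #include <unistd.h>

    int main(void)
    {
        int fds[2];
        char buf[4096];

        if (pipe(fds) < 0)
            return 1;
        /* Non-blocking: read() returns -1/EAGAIN when nothing is queued,
         * which is what the trace shows on fds 8 and 17. */
        fcntl(fds[0], F_SETFL, O_NONBLOCK);

        for (int i = 0; i < 5; i++) {
            struct timespec ts;
            struct timeval tv;

            clock_gettime(CLOCK_MONOTONIC, &ts);   /* appears in the trace */
            if (read(fds[0], buf, sizeof(buf)) < 0 && errno == EAGAIN)
                printf("read: EAGAIN (nothing to read yet)\n");
            gettimeofday(&tv, NULL);               /* also in the trace */
            usleep(1000);  /* a real loop might block in select()/poll() here */
        }

        close(fds[0]);
        close(fds[1]);
        return 0;
    }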

And when I look at the summary counts (strace -c), it looks like most of
the CPU time is spent in those failing read() calls. However, I haven't
done this kind of debugging before, and I don't want to waste time if
this isn't even relevant.
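
For reference, the summary came from attaching to the running frontend
with something roughly like this (left running for a minute or so, then
interrupted with Ctrl-C so strace prints the per-syscall totals):

    strace -c -p $(pidof mythfrontend)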

