[mythtv-users] XvMC question

Yeechang Lee ylee at pobox.com
Tue Jul 29 12:13:38 UTC 2008


Please don't top-post.

Now, onto the show:

Scott Kidder <kidder.scott at gmail.com> says:
> Using XvMC made for an interesting challenge: why do something in
> software when it can be done much more efficiently in hardware?

I suppose this is in many ways the counterpart to the software RAID
versus hardware RAID debate that occasionally pops up here. I am
firmly on the software side of that debate, too, and for the same
reason: The supposedly better-optimized dedicated hardware is, in
practice, generally inferior to putting the more-powerful
general-purpose CPU to work, in terms of both performance and
functionality.[1]

> Yes, I see about 80% CPU usage when playing 1080i content.  But
> power consumption is also significantly higher for the machine:
> around 160 watts, compared to 110 watts when using XvMC.

Very fair point, but . . .

> Also, lowering the load on the machine introduces the possibility of
> running some other services on the machine so that it isn't strictly
> a MythTV box.

. . . Again, in my case, I see lower CPU usage without XvMC. That
surprised me, too, and it's a nice little bonus on top of the inherent
advantages of a color OSD and video and audio that don't stutter. I
haven't yet used the Kill-a-Watt I bought a while back to see how this
translates into power usage, but I really should sometime.
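In the meantime, your own numbers make for an easy back-of-the-envelope
calculation. Here's a quick Python sketch; the 24/7 duty cycle and the
electricity rate are my assumptions, so substitute your own:

    # Annual cost of the 50-watt delta (160 W vs. 110 W) on an
    # always-on box. The rate is an assumed US$0.10/kWh; plug in
    # your utility's actual figure.
    watts_without_xvmc = 160
    watts_with_xvmc = 110
    hours_per_year = 24 * 365
    kwh_per_year = (watts_without_xvmc - watts_with_xvmc) \
        * hours_per_year / 1000.0
    rate = 0.10  # assumed US$ per kWh
    print("%.0f kWh/yr, about $%.2f/yr"
          % (kwh_per_year, kwh_per_year * rate))
    # -> 438 kWh/yr, about $43.80/yr

Not nothing, but to my mind not enough to give up a color OSD over,
either.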

> I noticed that you have separate frontend & backend machines.

Yes and no; the frontend is also a slave backend and does
commflagging.

> I'm using a single machine for frontend and backend.  Playback of
> 1080i content gets murdered when commercial flagging begins.

My frontend box was also my sole backend for a year, and the only time
I noticed playback impact--again, never using XvMC except out of
curiosity--was when I transcoded. A single commflag job never impacted
playback.

I can't explain the disparity between our experiences on similar
hardware. I use PCI Express, but that shouldn't matter given that
you're using an AGP card.

[1] This would be different were Nvidia's hardware-based H.264
decoding features available on Linux, and it likely will be once the
open-source drivers for Intel's video chipsets achieve similar
functionality. Either would handily solve the issues I raise at
<URL:http://www.gossamer-threads.com/lists/mythtv/users/315909#315909>.

-- 
Frontend:		P4 3.0GHz, 1.5TB software RAID 5 array
Backend:		Quad-core Xeon 1.6GHz, 6.6TB sw RAID 6
Video inputs:		Four high-definition over FireWire/OTA
Accessories:		47" 1080p LCD, 5.1 digital, and MX-600
