[mythtv-users] Experiences with XvMC

Scott catfather at donpoo.net
Sun Jul 31 13:16:14 EDT 2005

Having been on the list for some time, I was familiar with the nvidia +
xvmc combination and wanted to try it out. I picked up an eVGA
GeForce FX 5200 PCI adapter and stuck it into my SiS-based P4
Northwood 2.4GHz Pundit system. (Note: not the Pundit-R.) Setup was
done under Gentoo 2005.0 using all stable packages with kernel
2.6.12-r4. I wanted to put down some notes for others who might be
considering the same.

My goal was to evaluate xvmc + nvidia playback quality for future
HDTV use. For playback I used xine-lib 1.1.0 and xine-ui 0.99.4 along
with a few 1920x1080i and 1280x720p clips. I also used some standard
DVD video to compare Xv with XvMC processing. My viewing focused on
deinterlacing and on artifacts that might be introduced by
post-processing. For a display I used DVI out at 1280x720 to my
Panasonic PT-L500U projector.

Setup of nvidia + xvmc on Gentoo is well documented and went off
without a hitch using nvidia-kernel 1.0.7667. Overall I was
impressed with the results. Using XvMC I was able to easily play back
1080i clips at about 35% CPU use, which is impressive considering the
low-end PCI bus on the Pundit system board and the low-end GPU on the
FX 5200.
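For anyone repeating this on Gentoo, the setup boils down to a few
commands. This is a minimal sketch from memory; the package names are
the stock Gentoo ones of that era, and the XvMC backend library name
is the one documented in NVIDIA's driver README, so double-check both
against your installed versions:

```shell
# Install the NVIDIA kernel module and the GL/XvMC userspace libraries
# (stock Gentoo 2005.0 package names; adjust for your portage tree)
emerge nvidia-kernel nvidia-glx

# Point XvMC clients at the NVIDIA backend library
# (library name taken from the NVIDIA driver README; verify it exists
# under /usr/lib before writing this file)
echo "libXvMCNVIDIA_dynamic.so.1" > /etc/X11/XvMCConfig

# Load the kernel module before starting X
modprobe nvidia
```

With that in place, xine's xvmc video output driver should find the
backend at startup.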

I ran into two issues that, for me, are critical.

When using xvmc with xine, none of the OSD overlays would work. This
meant no on-screen information, subtitles, or closed captions during
playback. I didn't investigate this too much; however, I think it
could be worked around. One possibility seems to be the xine xxmc
video output driver.
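As a sketch of that workaround: xxmc is the hybrid output driver that
ships with xine-lib 1.1.0 and can fall back to Xv code paths. Whether
it actually restores the OSD is an assumption I haven't verified, and
the clip path below is just a placeholder:

```shell
# Try the hybrid xxmc driver, which may keep the OSD working
# (untested assumption; clip path is a placeholder)
xine -V xxmc /path/to/clip.ts

# For comparison, the pure XvMC driver that loses the OSD
xine -V xvmc /path/to/clip.ts
```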

XvMC appears to do very simple one-field deinterlacing in hardware.
Xine 1.1.0 has experimental XvMC bob-deinterlace support, which I
enabled and saw some improvement. I compared various DVD (480i
content) scenes played via Xv with xine's deinterlace post-processing
against the same scenes played via XvMC with hardware deinterlacing.
It was pretty clear to me that Xv with deinterlace post-processing
produced a much better picture than XvMC using hardware
deinterlacing.
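The comparison above can be reproduced from the command line. The
tvtime post plugin is xine's software deinterlacer; the specific
method name below is just one of its options, picked as an example
rather than the one I used:

```shell
# Xv output with software deinterlacing via the tvtime post plugin
# (method=LinearBlend is an example; other methods are available)
xine -V xv --post tvtime:method=LinearBlend dvd:/

# XvMC output relying on the hardware one-field/bob deinterlacer
xine -V xvmc dvd:/
```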

Based on my testing, I think nvidia + xvmc could be acceptable when
dealing with HDTV streams, but it should be avoided for standard
NTSC 480i content, or wherever else it isn't absolutely needed. The
best solution, from an image-quality standpoint, would be to avoid
XvMC and use a powerful system to handle the HDTV content decoding
and playback. Obviously that solution brings with it a host of other
problems dealing with heat and noise.

Right now I'm unsure which direction I'll go. I know I will be
scrapping my Pundit and setting up a full FE/BE in a nice HTPC case
like the ones Silverstone makes. The cheap solution, for me, is to
get a Socket 478 ATX board that supports my existing P4, memory, and
other hardware, then add a quality nvidia AGP adapter. Considering
the prices of nice-looking HTPC cases and silent power supplies, this
might be the way to go as an interim step.

Scott <catfather at donpoo.net>
AIM: BlueCame1
