Hardware-based MPEG Decoding (was Re: [mythtv-users] Intel vs AMD for Myth - dual core....)

Michael T. Dean mtdean at thirdcontact.com
Fri Jun 24 22:54:43 UTC 2005

Brian Foddy wrote:

> On Fri, 24 Jun 2005, Gregorio Gervasio, Jr. wrote:
>>>>>>> Brian Foddy writes: 
>> b> On Thu, 23 Jun 2005, Mudit Wahal wrote:
>>>> The HD recording is already in mpeg2. If you get a decent nvidia
>>>> graphics card with component output (gt6200, gt6600, gt6800), then
>>>> most of the hardware processing can be offloaded to the graphics card. 
>> b> Is this correct?  I've been using 5200/5600 series cards for a couple
>> b> years using XvMC.  It works and helps some, but I would hardly call
>> b> it offloaded.  Do the 6200 etc. cards have more mpeg2 processing
>> b> in Linux than their predecessors?  And how is it activated?
>> b> With the XvMC lib/api or is there something new?
>>        The newer cards have what nVidia calls PureVideo technology:
>> http://www.nvidia.com/page/purevideo.html
>> which looks interesting but there are no drivers nor libraries for
>> Linux.  Maybe someday ... 
> Same old story, I think for several generations NVIDIA cards have had
> pretty decent mpeg2 playback capabilities that were never available
> in the Linux drivers.

Disclaimers:  I'm not a graphics or video expert.  This information is 
based on my current understanding of the state of the technology pieced 
together over several years of watching the industry, so I may be wrong 
about some/most/all of the points below, but I'm posting this in hopes 
of getting closer to the whole truth.  If I'm talking nonsense, please 
let me know because my goal is to understand...  :)

Most relatively modern GPUs--including the GeForce 4 and Radeon 8xxx series 
(and probably many before that)--provide hardware support for "motion 
compensation" and the "inverse Discrete Cosine Transform" (iDCT) (and 
possibly the forward "Discrete Cosine Transform" (DCT)).  Motion 
compensation and the DCT, respectively, provide a means of removing 
temporal and spatial redundancy from video streams and are key components 
of MPEG compression.  (The iDCT simply reverses the DCT applied during 
encoding.)
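To make the DCT/iDCT relationship concrete, here's a minimal pure-Python 
sketch (not MythTV or driver code) of the 1-D DCT-II and its inverse on 
an 8-sample row--MPEG works on 8x8 blocks, but one row is enough to show 
that the iDCT exactly undoes the DCT:

```python
import math

N = 8  # MPEG uses 8x8 blocks; a single 1-D row shows the idea

def dct(x):
    """Forward DCT-II: spatial samples -> frequency coefficients."""
    out = []
    for k in range(N):
        c = math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
        out.append(c * sum(x[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
                           for n in range(N)))
    return out

def idct(X):
    """Inverse DCT (DCT-III): frequency coefficients -> spatial samples."""
    out = []
    for n in range(N):
        s = 0.0
        for k in range(N):
            c = math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
            s += c * X[k] * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
        out.append(s)
    return out

row = [52, 55, 61, 66, 70, 61, 64, 73]   # e.g., one row of luma samples
coeffs = dct(row)
recovered = idct(coeffs)
# the round trip recovers the original samples (within float error)
assert all(abs(a - b) < 1e-9 for a, b in zip(row, recovered))
```

Most of the energy ends up in the low-frequency coefficients, which is 
what the encoder's quantization step exploits; the decoder (or the GPU, 
when iDCT is offloaded) just has to run the inverse transform over every 
block of every frame--hence the appeal of doing it in hardware.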

The X Window System provides support for hardware-assisted motion 
compensation and iDCT via the X-Video Motion Compensation (XvMC) API.  
Therefore, using a card, driver, and application that all support XvMC 
allows you to offload a sizable chunk of math from the CPU to the video 
card, but there's still quite a bit more to MPEG decoding than just these 
transforms.
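XvMC itself is a C API tied to the X server, but the other half of the 
math it offloads--motion compensation--is easy to sketch.  Here's a toy 
pure-Python illustration (the function and frame names are mine, purely 
for illustration) of a decoder predicting a block from the previous frame 
via a motion vector and then adding the residual that came out of the 
iDCT stage:

```python
# Toy 1-D "frame": real MC works on 2-D macroblocks, but the principle
# is the same: copy a block from the reference frame at the position
# given by the motion vector, then add the decoded residual.
reference = [10, 20, 30, 40, 50, 60, 70, 80]

def motion_compensate(ref, mv, start, size, residual):
    """Predict a block from `ref` shifted by motion vector `mv`,
    then correct it with the residual from the iDCT stage."""
    predicted = ref[start + mv : start + mv + size]
    return [p + r for p, r in zip(predicted, residual)]

residual = [1, -2, 0, 3]                     # output of the iDCT stage
block = motion_compensate(reference, 2, 0, 4, residual)
assert block == [31, 38, 50, 63]
```

Doing this copy-and-add for every macroblock of every frame is exactly 
the kind of regular, parallel arithmetic a GPU handles well, which is why 
MC and iDCT were the first stages vendors moved into hardware.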

The NVIDIA NV40 core (i.e., the GeForce 6800) was the first to provide 
"hardware-based" MPEG-1/2/4 encoding/decoding as well as WMV9 and H.264 
decoding (H.264 is one format used by both Blu-ray and HD-DVD).  However, 
the decoding is actually implemented using the "programmable video 
processor" and the GPU's shader pipeline (basically, the NV40 was the 
first GPU whose shader programs could hold enough instructions to 
implement an algorithm this complex).  I would assume similar decoding 
could be performed on the current generation of ATI GPUs, but I've never 
seen anything definitive (although I know the soon-to-be-released 
next-gen ATI GPU can do shader-based decoding, as mentioned below).

The ATI R520 is expected to contain silicon dedicated to H.264 decoding 
(which, IMHO, will be the first true "hardware-based" decoding in a 
general-purpose GPU) instead of requiring apps to use its programmable 
shaders (although an app could still use the shaders to do the same job 
less efficiently).  However, even if ATI's Linux drivers expose its 
hardware H.264 decoding, TTBOMK there is no standard X extension that 
could serve as an API, so it's likely to receive lukewarm support from 
Linux programmers.

If a card provides "hardware decoding" using programmable shaders, it's 
the application's responsibility to "program" the shaders.  The OpenGL 
2.0 specification provides support for programmable shaders through the 
aptly named OpenGL Shading Language (GLSL), but OpenGL 2.0 support is 
still very new:  NVIDIA's 1.0-7664 release was the first to add it 
( http://www.nvidia.com/object/linux_display_ia32_1.0-7664.html ).  
Therefore, it will take some time for applications to begin supporting 
video decoding in shaders.  (If anyone wants to buy me a 7800GTX, I'd 
love to add (try my hand at adding?) GLSL MPEG decoding to Myth.  :)

So, basically, when you see that a GPU has hardware MPEG decoding 
support, there are a lot of shades of gray in the picture...  Many 
vendors have been promoting the MC/iDCT capabilities as "hardware MPEG 
decoding" for years.  Marketing:  telling the truth without telling the 
whole truth.
