[mythtv-users] Nvidia CUDA / HD decoding

Blammo blammo.doh at gmail.com
Sun Feb 18 09:33:17 UTC 2007

So now that a lot more 1080p demo content is available, and I've had a
chance to try it out on various CPUs, I've seen just how high the
requirements for realtime playback of H.264/1080p content are.

In case you haven't seen the news, press releases, etc., Nvidia has
come out with a method of compiling code that will run on either the GPU,
the CPU, or both, depending on what's available, to accelerate certain
compute-intensive tasks.

(Press release here : http://www.nvidia.com/object/IO_39918.html )

Right now the requirements are fairly narrow. Quote: "CUDA requires
8-Series graphics card and a specific driver for this beta."

However, there's been talk of opening it up to previous-generation cards,
etc., which got me thinking... the single most CPU-intensive thing
a Myth box faces right now is HD (H.264, etc.) software
decoding... wouldn't it be nifty if you could offload some of that to
the GPU? Yes, memory bandwidth is working hard during playback, but
without some sort of decode offloading, the GPU should have headroom left.
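To give a flavor of the kind of offload CUDA makes possible, here's a
hypothetical sketch -- this is not MythTV or ffmpeg code, just an
illustrative example assuming an elementwise decode step (something
like inverse quantization of transform coefficients) mapped one thread
per coefficient:

```
// Hypothetical CUDA sketch: offload an elementwise decode-style step
// (inverse quantization of transform coefficients) to the GPU.
// Illustrative only -- names and the dequant step are assumptions,
// not anything from MythTV's actual decode path.
#include <cstdio>
#include <cuda_runtime.h>

// Scale each coefficient by the quantizer step; one thread per element.
__global__ void dequant(const short *in, short *out, int qscale, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i] * qscale;
}

int main()
{
    const int n = 1920 * 1080;   // roughly one frame's worth of coefficients
    short *h_in = new short[n], *h_out = new short[n];
    for (int i = 0; i < n; ++i) h_in[i] = (short)(i % 128);

    short *d_in, *d_out;
    cudaMalloc(&d_in,  n * sizeof(short));
    cudaMalloc(&d_out, n * sizeof(short));
    cudaMemcpy(d_in, h_in, n * sizeof(short), cudaMemcpyHostToDevice);

    // 256 threads per block, enough blocks to cover all n elements.
    dequant<<<(n + 255) / 256, 256>>>(d_in, d_out, 2, n);
    cudaMemcpy(h_out, d_out, n * sizeof(short), cudaMemcpyDeviceToHost);

    printf("out[100] = %d\n", h_out[100]);

    cudaFree(d_in); cudaFree(d_out);
    delete[] h_in; delete[] h_out;
    return 0;
}
```

The real win would be moving the heavy stages (motion compensation,
deblocking) off the CPU like this, while the CPU keeps doing the
serial bitstream parsing it's better suited for.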

Short of doing something like that, are there any other optimizations
that come to mind to make that type of playback easier? XvMC can't do
anything for H.264, for example, as far as I can tell...

For what it's worth, I can play back some of the most intensive 1080p
content I've found (several of the BBC demos) on an Athlon 64 X2
@ 2.9GHz, at about 80% CPU. I'd imagine that's about 50% more
horsepower than most people are packing in their frontends right
now... certainly more than my main frontend.


More information about the mythtv-users mailing list