[mythtv] CUDA Code for nvidia cards

Raymond Wagner raymond at wagnerrp.com
Wed Oct 29 03:30:49 UTC 2008

Daniel Kristjansson wrote:
> On Tue, 2008-10-28 at 16:41 -0400, R. G. Newbury wrote:
>> I've just been reading a paean of praise in Linux Journal for the CUDA 
>> code which apparently is the neatest thing since sliced bread, and grows 
>>   hair on your *head* (since most of us are old enough to consider hair 
>> -there- as a ...waste). Apparently, CUDA unleashes the power of the GPU 
>> to speed up video/ray tracing/modelling etc. etc.
>> I assume, since the article does not explain, that this involves writing 
>>   code in a particular manner that it is diverted from the CPU to the 
>> GPU for calculation purposes (and presumably returned pre-digested for 
>> output).
>> Does anyone have any idea how this actually works, and any idea whether 
>> CUDA style code could possibly be of any use in mythtv?
> It's really nothing new, I wrote some fancy rendering code using that
> stuff in the late 1990's, but of course back then it only worked on
> 5xxx nVidia cards and the pseudo C often failed and had to be replaced
> with asm. The problem is not that the hardware and drivers aren't there
> to do the decoding, it's that no one has both the motivation and
> time to sit down and write a GPL mpeg-4 avc decoder that runs on the
> GPU. It is not a small project, and you don't see much performance
> payback until you've completed almost the entire task.
> -- Daniel
There is a 2008 Google Summer of Code project, built on Gallium3D, with 
the intent of developing an XvMC driver that runs on the programmable 
shaders (rather than CUDA).  Currently it is capable of handling motion 
compensation on GF6200 cards.
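
For anyone wondering what the offload model described above looks like in 
practice, here is a minimal CUDA sketch (not MythTV code, just an 
illustration; the kernel and variable names are made up for the example). 
Code marked __global__ is compiled for the GPU and run across thousands of 
threads, while the host explicitly copies data to device memory, launches 
the kernel, and copies the "pre-digested" result back:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Kernel: runs on the GPU, one thread per array element.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    // Host buffers.
    float *x = (float *)malloc(bytes);
    float *y = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Device buffers: input data must be copied into GPU memory.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, x, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, y, bytes, cudaMemcpyHostToDevice);

    // Launch n threads in blocks of 256 -- the "diversion" to the GPU.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    // Copy the result back to the host for output.
    cudaMemcpy(y, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", y[0]);

    cudaFree(dx); cudaFree(dy);
    free(x); free(y);
    return 0;
}
```

An embarrassingly parallel loop like this maps well to the hardware; the 
difficulty Daniel describes is that an MPEG-4 AVC decoder is full of 
serial, branchy stages (entropy decoding in particular) that don't.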
