[mythtv] thoughts: using XvMC as a generic mpeg2 decoder?

Daniel Kristjansson danielk at cuymedia.net
Tue Apr 24 18:58:26 UTC 2007


On Tue, 2007-04-24 at 11:15 -0700, Ben Osheroff wrote:
> The renderer on the nvidia/ati card never actually renders a frame into
> system memory,
AFAIK, yes.

>  but you can write filters in openGL that will run
> directly on the video card's GPU/memory?
Yes. This isn't as fast as rendering the XvMC surface directly,
but my 6200 can handle 1080i, so long as I don't enable
the Composite extension in Xorg.
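As a rough illustration of the kind of filter being discussed, here is a minimal GLSL fragment shader sketch that adjusts the brightness of a video frame entirely on the GPU. This is not MythTV's actual filter code; the `frame` and `brightness` uniform names are assumptions, and the renderer is assumed to have bound the decoded frame as a texture.

```glsl
// Hypothetical example: a per-pixel brightness filter run on the GPU.
uniform sampler2D frame;   // decoded video frame (assumed bound by the renderer)
uniform float brightness;  // filter strength, e.g. 1.2 for a 20% boost

void main(void)
{
    vec4 c = texture2D(frame, gl_TexCoord[0].st);
    gl_FragColor = vec4(c.rgb * brightness, c.a);
}
```

The frame data never has to round-trip through system memory; the shader reads and writes video card memory directly, which is why this approach stays fast enough for HD content.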

BTW This is currently broken in mythtv-vid. I haven't tracked
down the problem yet, but it was working before I merged in
the non-XvMC OpenGL renderer a couple months back.

-- Daniel
