[mythtv-users] Looking for new DVB-S+CAM based backend hardware recommendations

Phil Jordan lists at philjordan.eu
Wed Aug 29 22:50:37 UTC 2007


Michael T. Dean wrote:
> On 08/29/2007 03:32 PM, Phil Jordan wrote:
>> Seriously though, I do think it's very feasible even without special
>> driver support as long as you've got access to pixel+vertex shaders and
>> render-to-texture via OpenGL. Which we do, of course, for
>> nvidia/fglrx/intel.
> 
> Right.  And, this is the way to go (even if NVIDIA releases some Linux
> "PureVideo" API) because it's standards-based.

Agreed. It'd be years (if ever) before NVIDIA, AMD, and Intel agreed on
an API anyway. I've not worked with XvMC myself, but as far as I can
tell it hasn't made much progress in recent years, apart from
fragmented, non-standard extensions.

It's bad enough having to live with nvidia/fglrx binary drivers as it
is, and this sort of thing doesn't exactly have the market inertia of
OpenGL to keep everything standardised.

>>  The main work would be integrating with
>> MythTV/ffmpeg and tuning it to be fast enough on lower-end GPUs and the
>> various quirky drivers.
> 
> Also, lower-end GPU's are not likely to support complex-enough shader
> programs (basically, you'd probably need something like a GF6800 or
> better, though I wouldn't be surprised if the 6800 is even too
> limited-

Well, the fillrate and memory bandwidth of even older mid-range to
high-end chips are quite remarkable. I'd be surprised if the later
stages of the decoding process -- deblocking, inter-frame prediction,
followed by YUV->RGB conversion, deinterlacing, and scaling -- couldn't
be done on a Shader Model 2.0 card. The harder stages could simply be
disabled on cards that can't handle them; my guess is that offloading
just those last few stages onto the GPU would free up the CPU
substantially.
That said, I've never implemented an H.264 deblocking filter myself, in
Shader Model 2.0 or otherwise, nor have I profiled ffmpeg's H.264
decoder, so I may be wildly wrong here. :)
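
Just to make the idea concrete, here's a rough sketch of the kind of
fragment shader I have in mind for the YUV->RGB step: a trivial,
Shader Model 2.0-level GLSL program that samples three planar textures
(one each for the Y, U, and V planes -- an assumption about how the
decoded frame would be uploaded) and applies the BT.601 conversion per
pixel. Completely untested, and the texture layout and coefficients
would obviously need tuning against the real pipeline:

  /* Hypothetical GLSL fragment shader for the YUV->RGB stage; assumes
   * the decoded frame is uploaded as three separate luminance textures
   * (Y, U, V), with U/V at half resolution as in YV12, and uses BT.601
   * coefficients with the usual 16-235 luma range expansion. */
  static const char *yuv_to_rgb_frag =
      "uniform sampler2D tex_y;\n"
      "uniform sampler2D tex_u;\n"
      "uniform sampler2D tex_v;\n"
      "void main(void)\n"
      "{\n"
      "    float y = texture2D(tex_y, gl_TexCoord[0].st).r;\n"
      "    float u = texture2D(tex_u, gl_TexCoord[0].st).r - 0.5;\n"
      "    float v = texture2D(tex_v, gl_TexCoord[0].st).r - 0.5;\n"
      "    y = 1.164 * (y - 0.0625); /* expand 16-235 luma range */\n"
      "    gl_FragColor = vec4(y + 1.596 * v,\n"
      "                        y - 0.391 * u - 0.813 * v,\n"
      "                        y + 2.018 * u,\n"
      "                        1.0);\n"
      "}\n";

You'd render a full-screen quad with that into a texture (via an FBO or
pbuffer), and deinterlacing and scaling could either be folded into the
same pass or chained as further render-to-texture passes.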

In any case, I'm probably drifting too far off topic for mythtv-users
now, and certainly way off topic from this thread's original purpose.
Now I just need to find the time to implement all of this stuff...

>-after all, it took NVIDIA until after they released the 7800 to
> demonstrate that it could be done on a 6800, so they may have taken some
> "shortcuts" when decoding with that GPU).

Well, if I remember correctly, from a 3D rendering/OpenGL point of view
the only feature difference between a GeForce 7 and a GeForce 6 is
anti-aliasing of transparent pixels, aside from the obvious speed
improvements. It's quite possible that they changed some silicon that is
invisible and inaccessible at the OpenGL level. (It wouldn't surprise me
if they'd fixed some severe hardware bug; working with game console 3D
hardware at a much lower level has shown me how worry-free life with
OpenGL is in comparison. :) )

~phil

