[mythtv-users] OpenGL render problems with radeon and 0.24.1

Mark Kendall mark.kendall at gmail.com
Sun Jun 12 11:52:39 UTC 2011


On 10 June 2011 23:49, dargllun <dargllun at googlemail.com> wrote:
> On 10.06.2011 03:22, Mark Kendall wrote:
>> is not yet needed by default). On my master backend, using the very
>> latest radeon version, everything MythTV-wise that I could test
>> worked as expected. I then forced it to use the latest builds from the
>> xorg-edgers ppa and enabled Gallium - and again, no obvious issues.
>>
>> So all told, while I was expecting a day of grief, I ended up being
>> pleasantly surprised:)
> I also gave it another go last night, and I have to confirm that the
> rendering is *very* much corrupted at this point. I should say,
> however, that I'm using the latest & greatest xorg-edgers packages.
> There were a great many r600g commits during the last week, which may
> have broken some things.
>
> Can you describe what exactly "pleasantly surprised" means?

It worked as I expected. No visual corruption in the main UI or when
playing video. Though both glxgears and nexuiz were pretty broken :)

> Did you have stable real-time rendering? Did you use any
> deinterlacers? Leaving the massive corruption aside for a moment,
> which may have other causes, I seem to get good performance without
> deinterlacing, but poor performance when using kernel(2x). (I'm using
> that one with Xv, where it gives great results.)

You have to bear in mind that the single biggest bottleneck for OpenGL
deinterlacing performance is texture sampling.

When no deinterlacing is taking place, there is one texture sample per
pixel (we pack the video frame data in software before sending it to
the gpu, to avoid separate samples for the luma and the two chroma
planes).
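To illustrate, here is a minimal fragment shader sketch of the
one-sample-per-pixel case. It assumes the cpu has already packed Y, U
and V for each pixel into the r, g and b channels of a single texture;
MythTV's actual packing and shaders differ, and the sampler name,
offsets and conversion matrix below are illustrative:

    // One texture fetch per output pixel, then a YUV -> RGB conversion.
    uniform sampler2D s_video;   // hypothetical packed YUV texture

    void main(void)
    {
        vec3 yuv = texture2D(s_video, gl_TexCoord[0].st).rgb; // 1 sample
        yuv -= vec3(0.0625, 0.5, 0.5);           // remove video offsets
        // BT.601 YUV -> RGB (columns hold the Y, U and V weights)
        mat3 m = mat3(1.164,  1.164, 1.164,
                      0.0,   -0.392, 2.017,
                      1.596, -0.813, 0.0);
        gl_FragColor = vec4(m * yuv, 1.0);
    }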

When you are using the OpenGL kernel deinterlacer, there are 8 texture
samples for each pixel in the frame. Depending on your gpu, your gpu
memory type and speed, the video resolution etc., this may simply be
too slow. You may not be using the cpu to perform the deinterlacing,
but the gpu still has only a finite amount of time to complete the
process before the frame needs to be displayed.
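For comparison, a sketch of what such a kernel deinterlacer shader
looks like - eight fetches per pixel instead of one (the line offsets
and kernel weights here are illustrative, not MythTV's exact kernel):

    uniform sampler2D s_video;
    uniform float line_height;   // 1.0 / texture height

    void main(void)
    {
        vec2 t  = gl_TexCoord[0].st;
        vec2 dy = vec2(0.0, line_height);

        // 8 fetches: neighbouring lines from both fields, blended
        // with a fixed vertical kernel (weights sum to 1.0)
        vec4 sum = -0.0625 * texture2D(s_video, t - 4.0 * dy)
                 +  0.125  * texture2D(s_video, t - 3.0 * dy)
                 +  0.25   * texture2D(s_video, t - 2.0 * dy)
                 +  0.1875 * texture2D(s_video, t - dy)
                 +  0.1875 * texture2D(s_video, t + dy)
                 +  0.25   * texture2D(s_video, t + 2.0 * dy)
                 +  0.125  * texture2D(s_video, t + 3.0 * dy)
                 -  0.0625 * texture2D(s_video, t + 4.0 * dy);

        gl_FragColor = clamp(sum, 0.0, 1.0);
    }

At 1920x1080 that is roughly 2 million pixels, so 8 samples per pixel
works out at around 16.6 million texture fetches per deinterlaced
frame - and with kernel(2x) producing 50 or 60 output frames per
second, on the order of a billion fetches per second.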

If the software kernel deint works for you, then just use that with
OpenGL (unlike vdpau, you can use a cpu-based deinterlacer with OpenGL
rendering). As already mentioned, there is a performance gap between
straight XVideo and OpenGL rendering, which may limit your choice of
software deinterlacer.

regards

Mark
