[mythtv-users] Playback On Intel

David Edwards david at more.fool.me.uk
Fri May 16 09:50:33 UTC 2014


On 15 May 2014 00:19, Jean-Yves Avenard <jyavenard at gmail.com> wrote:
> On 12 May 2014 01:10, David Edwards <david at more.fool.me.uk> wrote:
>> A pale cloud of decoding artefacts
>> slowly builds in shadow areas over a few seconds, then abruptly drops
>> back to black. This repeats every 10 seconds or so.

> I had this a while back with some nvidia drivers, and it's a known
> issue with the OpenGL deinterlacer.

> You should try the OpenGL Normal which uses the KERNEL 2X HW
> deinterlacer.

I only had time for a quick play last night, but sadly I confirmed
that it's not the deinterlacer causing the problem. The content I've
noticed it on most is a 24p H.264 .mkv generated using Handbrake
(High Quality Profile) from a Blu-ray.
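
For reference, this is roughly how I'd dump the relevant stream
parameters with ffprobe (the filename is a placeholder):

    ffprobe -hide_banner -select_streams v:0 \
        -show_entries stream=codec_name,profile,level,pix_fmt,avg_frame_rate \
        input.mkv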

As an experiment, I created a new playback profile with the
deinterlacer set to None and the output set to openglvaapi + opengl2.
I then tried switching between the Standard decoder and the VAAPI
decoder. The problem only appeared with Standard. Increasing the
number of CPUs available to the decoder made no difference. I then
tried turning off the Deblocking Filter. This may be a red herring,
but with the Deblocking Filter turned off on the Standard decoder, it
looked like the same problem but far, far worse, i.e. the same effect
but affecting the whole picture, not just the shadows.
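
If it would help to take MythTV out of the equation, I could try the
same comparison with plain ffplay, since the Standard decoder is
ffmpeg-based. Something like this should approximate the deblocking
on/off test (the filename is a placeholder, and I'm assuming
-skip_loop_filter corresponds to MythTV's Deblocking Filter setting):

    # software decode with the loop (deblocking) filter on (the default)
    ffplay input.mkv

    # software decode with the loop filter disabled
    ffplay -skip_loop_filter all input.mkv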

This weekend I will try to post a 30 second clip. What else could I
test to narrow down the problem?
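
For cutting the clip, my plan is to use stream copy so the original
bitstream survives unchanged. Something like this (timestamps and
filenames are placeholders; with -c copy the cut lands on the nearest
keyframe):

    ffmpeg -ss 00:10:00 -i input.mkv -t 30 -c copy clip.mkv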

Thanks,

David

