[mythtv-users] Pixelation of reds using Intel/OpenGL

David Edwards david at more.fool.me.uk
Mon Dec 2 13:23:50 UTC 2013


Hi,

I recently replaced the Ion-based motherboard in my FE due to an H/W
failure. I am now experimenting with using the Intel HD 4600 integrated
graphics on an i5-4570S.

I found the Bob deinterlacer supported by the VAAPI driver unwatchable (I
am in the UK and the kids watch a lot of interlaced SD content), so until
the VAAPI driver supports a better deinterlacer (which I believe it will
very soon), I thought I could just use the CPU to do everything, since it
has enough oomph.

So, I've configured playback to use the OpenGL High Quality profile with
the Greedy High Motion deinterlacer. I selected OpenGL because it seemed to
be the only profile that scaled the OSD etc. consistently. It seems to do
a good job on HD content.

However, with this playback profile, I've noticed that on SD content, areas
of saturated red look badly pixelated. I know that SD content is being
scaled up, but this is definitely more than just a scaling issue. Other
colours and less saturated reds do not appear to be affected, nor does HD
content.
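
My only theory so far is chroma subsampling: saturated red lives almost
entirely in the Cr plane, which for 4:2:0 content has only half the
resolution in each direction, so crude chroma upsampling tends to show up
first in strong reds. Here is a rough Python sketch of the effect (purely
an illustration of the idea, not anything MythTV or the Intel driver
actually does):

# Rough sketch: why 4:2:0 chroma subsampling hits saturated red hardest.
# Purely illustrative; not the code path MythTV or the Intel driver uses.

def rgb_to_ycbcr(r, g, b):
    # BT.601 full-range RGB -> Y'CbCr
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402    * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772    * (cb - 128)
    return tuple(max(0, min(255, round(v))) for v in (r, g, b))

# A scanline with a hard edge: saturated red next to black.
scanline = [(255, 0, 0)] * 3 + [(0, 0, 0)] * 5

ycc = [rgb_to_ycbcr(*px) for px in scanline]

# Horizontal 2:1 chroma subsampling (average each pair of pixels), then
# nearest-neighbour upsampling: each chroma sample is reused for two pixels.
sub_cb = [(ycc[i][1] + ycc[i + 1][1]) / 2 for i in range(0, len(ycc), 2)]
sub_cr = [(ycc[i][2] + ycc[i + 1][2]) / 2 for i in range(0, len(ycc), 2)]

rebuilt = [ycbcr_to_rgb(ycc[i][0], sub_cb[i // 2], sub_cr[i // 2])
           for i in range(len(ycc))]

print("original:", scanline)
print("rebuilt :", rebuilt)

In that toy example the luma edge survives the round trip, but the pixel
pair straddling the red/black boundary shares one averaged chroma sample,
so it comes back as a blocky, off-colour step; scaled up from SD, that
sort of thing would be very visible. I may be completely wrong about the
cause, though.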

Anyone got an idea what the problem is, or how to fix it?

David
