[mythtv-users] Pixelation of reds using Intel/OpenGL

David Litchman david.litchman at gmail.com
Mon Dec 2 21:28:03 UTC 2013


On 12/2/2013 8:23 AM, David Edwards wrote:
> Hi,
>
> I recently replaced the Ion-based motherboard in my FE due to an H/W 
> failure. I am now experimenting with using Intel HD4600 integrated 
> graphics on an i5 4570S.
>
> I found the Bob deinterlacer supported by the VAAPI driver unwatchable 
> (I am in the UK and the kids watch a lot of interlaced SD content), so 
> until the VAAPI driver supports a better deinterlacer (I think it will 
> do very soon), I thought I could just use the CPU to do everything 
> since it has enough oomph.
>
> So, I've configured playback to use the OpenGL High Quality profile, 
> using the Greedy High Motion deinterlacer. I selected OpenGL because 
> it seemed to be the only profile that scaled the OSD, etc, 
> consistently. It seems to do a good job on HD content.
>
> However, with this playback profile, I've noticed that on SD content, 
> areas of saturated red look badly pixelated. I know that SD content is 
> being scaled up, but this is definitely more than just a scaling 
> issue. Other colours and less saturated reds do not appear to be 
> affected, and nor does HD content.
>
> Anyone got an idea what the problem is, or how to fix it?
>

I'm kind of surprised no one has replied yet.  I'm certainly not the 
most knowledgeable hereabouts, but I do know what's going on with your 
video.  It's called chroma subsampling; Googling that term will bring up 
plenty of results with in-depth explanations.  As it's an artifact of 
how the video is encoded, I don't know what, if anything, can be done to 
reduce the effect.
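
The short version: broadcast video is encoded 4:2:0, so the colour 
(Cb/Cr) planes are stored at half the resolution of the brightness 
(luma) plane in each dimension, and saturated red is hit hardest 
because almost all of its signal lives in the Cr plane.  Here's a rough 
Python/numpy sketch of the effect, using a made-up 8x8 test pattern and 
the full-range BT.601 matrix just for illustration:

import numpy as np

# Hypothetical test pattern: a hard vertical edge between pure red
# and black, 8x8 pixels.
rgb = np.zeros((8, 8, 3), dtype=np.float32)
rgb[:, :3, 0] = 255.0          # left three columns are pure red

# RGB -> YCbCr (full-range BT.601 constants, as used for SD content)
R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
Y  = 0.299 * R + 0.587 * G + 0.114 * B
Cb = 128.0 - 0.168736 * R - 0.331264 * G + 0.5 * B
Cr = 128.0 + 0.5 * R - 0.418688 * G - 0.081312 * B

def chroma_420(c):
    # 4:2:0: average each 2x2 block, then stretch back to full size
    # (nearest-neighbour, the crudest reconstruction a decoder can do)
    half = c.reshape(4, 2, 4, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(half, 2, axis=0), 2, axis=1)

print("Y  row:   ", Y[0, :6])               # luma edge survives intact
print("Cr before:", Cr[0, :6])              # sharp colour edge
print("Cr after: ", chroma_420(Cr)[0, :6])  # smeared over a 2px block

Pure red has a luma of only ~76 out of 255, so nearly all the picture 
information at a red edge sits in Cr, and that's exactly the plane that 
just lost three quarters of its samples.  Scale an SD frame up to a 
1080p panel and each smeared chroma block covers an area 5-6 pixels 
wide, which is why it reads as pixelation rather than ordinary scaling 
softness.  HD content is also 4:2:0, but its chroma blocks are much 
smaller relative to the screen, so the effect is far less visible.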


