[mythtv] Kernel 2X HW-GL Deinterlacing

David Engel david at istwok.net
Fri Nov 23 21:53:22 UTC 2018

This issue was discussed on IRC some weeks ago but never resolved.  I
thought I'd bring it up here in hopes of resolving it.

On the Nvidia Shield, the kernel 2X HW-GL deinterlacer looks terrible
by default.  However, after explicitly setting the scan type to
"interlaced (reversed)", it clears up very nicely.  I don't see this
same behavior on Linux.  In fact, I don't see any difference among
the "detect", "interlaced (normal)", and "interlaced (reversed)" scan
types on Linux, but my vision isn't the best.

Can anyone else confirm my findings, particularly on Linux?  More
importantly, can anyone explain the observed behavior?  I know one
difference is the Shield uses OpenGL ES and Linux uses full OpenGL.
Does the Raspberry Pi use OpenGL ES, and if so, how does kernel 2X
HW-GL perform on it?

This is of interest because, in theory, kernel deinterlacing should be
better than linear blend deinterlacing.  Getting it working correctly
without having to set the scan type manually every time would be a
nice improvement.
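For anyone unfamiliar with the distinction, the qualitative difference
between the two methods can be sketched on a single combed frame.  This
is a minimal Python illustration, not MythTV's code; the 4-tap kernel
weights below are my own illustrative choice, not the actual kerneldeint
coefficients:

```python
# Toy comparison of linear-blend vs. kernel deinterlacing on one frame.
# A frame is a list of rows of pixel values; even rows belong to the
# field we keep, odd rows to the stale field we rebuild.

def _clamp_even(idx, height):
    """Clamp a row index to the nearest valid kept-field (even) row."""
    top = height - 1 if (height - 1) % 2 == 0 else height - 2
    return min(max(idx, 0), top)

def linear_blend(frame):
    """Rebuild each odd row as the average of its two even neighbours."""
    h = len(frame)
    out = [row[:] for row in frame]
    for y in range(1, h, 2):
        above = frame[y - 1]
        below = frame[y + 1] if y + 1 < h else frame[y - 1]
        out[y] = [(a + b) // 2 for a, b in zip(above, below)]
    return out

def kernel_deint(frame):
    """Rebuild each odd row with a 4-tap vertical kernel (-1, 5, 5, -1)/8
    over kept-field rows; the wider support preserves vertical detail
    better than a plain two-tap average.  Weights are illustrative only."""
    h = len(frame)
    out = [row[:] for row in frame]
    for y in range(1, h, 2):
        taps = [_clamp_even(y + d, h) for d in (-3, -1, 1, 3)]
        r0, r1, r2, r3 = (frame[t] for t in taps)
        out[y] = [(-a + 5 * b + 5 * c - d) // 8
                  for a, b, c, d in zip(r0, r1, r2, r3)]
    return out

# A combed frame: the kept field is flat grey (8), the stale field black (0).
frame = [[8, 8] if y % 2 == 0 else [0, 0] for y in range(6)]
print(linear_blend(frame)[1])  # [8, 8] -- both methods restore flat areas
print(kernel_deint(frame)[3])  # [8, 8]
```

The point of the wider kernel is that on flat regions it degenerates to
the same result as a blend, but on gradients and fine detail it
interpolates rather than smearing, which is why kernel deinterlacing
should, in theory, come out ahead.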

David Engel
david at istwok.net
