[mythtv] Deinterlacer settings

Peter Bennett pb.mythtv at gmail.com
Sun Feb 17 17:56:59 UTC 2019



On 2/16/19 5:26 AM, Mark Kendall wrote:
> I'd like to propose a change to how deinterlacers are configured as
> the current setup has become unwieldy to the extent that it is almost
> unmaintainable.
>
> Proposal:
>
> - remove existing specific, named deinterlacer selections for main and fallback.
>
> - replace with something along the lines of:
> - new settings:
>          "Deinterlacer quality (normal)": None, Fast, Medium, High, Advanced
>                  "Prefer GPU deinterlacers": Yes/No
>                  "Prefer GPU driver deinterlacers": Yes/No
>          "Deinterlacer quality (double rate):" None, Fast, Medium, High, Advanced
>                  "Prefer GPU deinterlacers": Yes/No
>                  "Prefer GPU driver deinterlacers": Yes/No
>
> I believe that gives the same flexibility as the current settings
> without tying the code to specific named deinterlacers. Remember these
> are per-profile settings.
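>
> Under the hood those settings could be stored as a simple enum plus a
> couple of flags rather than strings. A rough sketch (names purely
> illustrative, not actual code):
>
>     // Per-profile deinterlacing preferences
>     enum DeintQuality
>     {
>         DEINT_NONE,
>         DEINT_FAST,
>         DEINT_MEDIUM,
>         DEINT_HIGH,
>         DEINT_ADVANCED
>     };
>
>     struct DeintPreference
>     {
>         DeintQuality quality;  // None/Fast/Medium/High/Advanced
>         bool preferGPU;        // prefer shader based over CPU based
>         bool preferDriver;     // prefer VDPAU/VAAPI etc over all others
>     };
>
>     struct DeintSettings
>     {
>         DeintPreference normal;      // single rate
>         DeintPreference doubleRate;  // double rate
>     };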
>
> Selecting 'Prefer GPU deinterlacers' would obviously select shader-based
> deints over CPU-based ones, and 'Prefer GPU driver deinterlacers' would
> prefer VDPAU/VAAPI etc over all others. The code can then make some
> informed decisions on what deinterlacer to use and at what stage in
> the decode to presentation process, as well as falling back where
> needed.
>
> With this setup, under the hood, the deinterlacer selections for Fast,
> Medium, High and Advanced would look something like:
>
> CPU - Onefield, Linear blend, Kernel, Yadif
> OpenGL/D3D - Onefield, Linear blend, Kernel, new motion adaptive shader
> VAAPI - Bob, Weave, Motion adaptive, Motion compensated
> VDPAU - Bob, Temporal, Spatial, ???
> OpenMAX - linedouble, fast, advanced, ??
>
> Where onefield and bob are interchangeable as 1x and 2x versions.
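>
> The quality level could then simply index into a per-renderer table
> instead of matching strings. Again only a sketch, reusing the enum
> above - table contents as per the lists, names illustrative:
>
>     // Illustrative per-renderer tables, ordered Fast..Advanced
>     static const char* kCPUDeints[]    = { "onefield", "linearblend",
>                                            "kernel", "yadif" };
>     static const char* kOpenGLDeints[] = { "onefield", "linearblend",
>                                            "kernel", "motionadaptive" };
>     static const char* kVAAPIDeints[]  = { "bob", "weave",
>                                            "motionadaptive",
>                                            "motioncompensated" };
>
>     const char* Select(const char** table, DeintQuality quality)
>     {
>         if (quality == DEINT_NONE)
>             return nullptr;          // no deinterlacing requested
>         return table[quality - 1];   // Fast maps to index 0, etc.
>     }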
>
> Background
>
> We now have a range of deinterlacer options, not just in terms of
> CPU/GPU/driver based but also in terms of at what stage deinterlacing
> occurs.
>
> As an example, with VAAPI decode only (VAAPI2), deinterlacing could
> occur in the decoder using either CPU or VAAPI based deinterlacers or
> at playback using the CPU or GLSL shaders (you could even use VAAPI
> again at this stage). With the new VAAPI zero copy code in the render
> branch, the current setup cannot cope with a VAAPI profile that
> expects to use VAAPI deinterlacer names while OpenGLVideo is actually
> presented with raw video frames that (currently) need to pass through
> the GLSL shaders.
>
> The current code is inflexible - especially with its use of strings to
> explicitly identify each deinterlacer - and has started to break. A
> much simpler and more flexible approach would be to use a simple
> flag/enum that encapsulates the user preferences.
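>
> Selection then becomes a straightforward walk down the preferences
> with fallback. Something like the following, building on the sketches
> above (HaveDriverDeint()/HaveShaderDeint() are made-up capability
> checks, not existing functions):
>
>     // Sketch: honour each flag where the capability exists,
>     // otherwise fall back towards the CPU path
>     const char* ChooseDeint(const DeintPreference& pref)
>     {
>         if (pref.quality == DEINT_NONE)
>             return nullptr;
>         if (pref.preferDriver && HaveDriverDeint())
>             return Select(kVAAPIDeints, pref.quality);
>         if (pref.preferGPU && HaveShaderDeint())
>             return Select(kOpenGLDeints, pref.quality);
>         return Select(kCPUDeints, pref.quality);  // CPU always available
>     }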
>
> Thoughts welcome!
>
> Regards
> Mark
>
Hi Mark

Question - What are GPU deinterlacers vs GPU driver deinterlacers?

It sounds good. Just a few things to bear in mind that you may not be 
aware of -

There are deinterlacers that work with the decoder in the case of
VAAPI2 and NVDEC (i.e. the video is deinterlaced before we get it from
the decoder).

In the case of OpenMAX the deinterlacer can work with the decoder or the
renderer; currently it is with the renderer.

In the case of mediacodec on the NVidia Shield, the video is automatically
deinterlaced and frame doubled in the decoder without us having any
control over it. Mediacodec on the Fire Stick presents video that is still
interlaced but tells us it is progressive; again, we have no control
over it.
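
So whichever scheme you pick probably needs a per-frame escape hatch as
well. Something along these lines (field names are made up, not our
actual VideoFrame):

    // If the decoder (VAAPI2, NVDEC, mediacodec on the Shield) has
    // already deinterlaced the frame, every later stage must skip its
    // own deinterlacing regardless of the user preference.
    bool NeedsDeinterlacing(const VideoFrame* frame)
    {
        return frame->interlaced && !frame->deinterlacedByDecoder;
    }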

Peter

