[mythtv] Deinterlacer settings
David Engel
david at istwok.net
Wed Feb 20 15:35:56 UTC 2019
On Wed, Feb 20, 2019 at 09:23:06AM +0000, Mark Kendall wrote:
> On Tue, 19 Feb 2019 at 15:59, David Engel <david at istwok.net> wrote:
> > Mark, there is an issue with our current mediacodec/opengl
> > implementation. With interlaced content, one of the first few frames
> > after a skip (including skips for ff/rew) gets corrupted and includes
> > image data from two different frames. This problem reportedly doesn't
> > occur when using Android's native Surface format for rendering. Are
> > you aware of that issue, and have you done anything about it? I've
> > been meaning to try the render branch to see if it still exists.
I watched a half-hour interlaced program last night with mediacodec
and didn't notice the problem. It's possible the problem was fixed
with a Shield update. I'll keep looking for it.
> I've not seen this but will have a look. That sounds like a reference
> frame issue with the active deinterlacer. From our perspective, that
> would only happen if using the kernel deinterlacer. Thinking about it,
> we don't reset the reference frames after a seek (which we do with
> VDPAU) - so there is something to look at there. If it is the
> mediacodec deinterlacer doing the same, then I'm not sure how we deal
> with that. Perhaps we need to flush the decoder properly (though this
> should already be happening).
Our deinterlacing should not be in play at all. The frames are
already deinterlaced when we get them back from mediacodec/ffmpeg. As
I understand it, the problem occurs when we convert the returned,
Android surface to an opengl texture (pardon my likely inexact
terminology). Applications which render the surface directly without
converting it to opengl don't exhibit the problem.
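For reference, the stale-reference failure mode Mark describes can be shown with a toy sketch (not MythTV code, just an illustration): a weave-style deinterlacer that keeps the previous frame as its reference produces exactly the "image data from two different frames" corruption if that reference isn't reset after a seek.

```python
# Toy sketch (not MythTV code) of a weave-style deinterlacer that keeps
# the previous frame as its reference.  Frames are modelled as lists of
# scan lines: even lines come from the current frame, odd lines from
# the reference -- so a stale reference after a seek yields an output
# built from two different frames.

class WeaveDeinterlacer:
    def __init__(self):
        self.ref = None  # previous frame, i.e. the reference frame

    def process(self, frame):
        # With no reference yet (or after reset()), fall back to
        # weaving the frame with itself.
        ref = frame if self.ref is None else self.ref
        out = [frame[i] if i % 2 == 0 else ref[i]
               for i in range(len(frame))]
        self.ref = frame
        return out

    def reset(self):
        # What a seek should do: drop the stale reference frame.
        self.ref = None

d = WeaveDeinterlacer()
d.process(['A'] * 4)        # frame just before the seek
bad = d.process(['B'] * 4)  # first frame after a seek, no reset
# bad == ['B', 'A', 'B', 'A'] -- lines from two different frames

d.reset()                   # resetting references on seek...
good = d.process(['B'] * 4)
# good == ['B', 'B', 'B', 'B'] -- ...avoids the corruption
```

This is essentially what resetting the reference frames after a seek (as is done for VDPAU) guards against.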
> > Also, have you done anything else in the mediacodec area yet to
> > avoid extra copying?
>
> I've been looking at this but am still largely in the dark. The FFmpeg
> mediacodec code accepts a single 'surface' as an option when it is
> configured (which we obviously do not currently use). There is no
> documentation on how to use it. Looking at the kodi (or maybe mpv?)
> code - there is one implementation that uses this configuration - but
> the 'surface' they supply is the window id and the implementation is
> called 'embedded' - so I can only assume that this somehow renders
> directly to the screen. Not ideal, but it could still be workable if
> we can still render the OSD etc. on top. Other implementations do not use
> FFmpeg. Based on a comment in the libqtav code, there is clearly the
> possibility of integrating mediacodec directly into OpenGL - but that
> part of the code is closed source and not available :( and it is
> unclear whether FFmpeg is used.
Aman Gupta <aman at tmm1.net> has been very helpful to us in the past
regarding mediacodec and ffmpeg. He's the main mediacodec guy at
ffmpeg. He's also one of the authors of the Channels app for Android.
He should be able to answer any questions like this that you have.
David
--
David Engel
david at istwok.net