[mythtv] Editor step forward by keyframe needs 2 keyclicks with nvdec in UK SD recordings
Mark Kendall
mark.kendall at gmail.com
Thu Feb 20 09:50:08 UTC 2020
On Wed, 19 Feb 2020 at 17:17, John Pilkington <johnpilk222 at gmail.com> wrote:
> Thanks for restoring the edit marker :-)
Heh - I would have done it ages ago but didn't realise Jonatan had
spotted the issue:)
> Minor observations:
>
> 'Standard' playback of SD DVB-T mpeg2video is fine but nearing the limit
> of my 2.6 GHz core2duo with 2x CPU yadif.
Assuming this is one of your machines with the opengl problem, you're
hitting a double whammy of cpu load on that box.
Firstly, OpenGL is emulated in software. I still have no better idea
of what is happening - but have you considered either compiling from
source yourself or tying a different distro? - Ubuntu would be the
obvious choice. The former would eliminate the build as an issue and
the latter any oddities with your distro (Scientific Linux iirc -
which I think is now deceased?)
Secondly, the current CPU deinterlacer selection doesn't offer a
middle ground performance wise. The bob/onefield filter is very fast,
but very 'bobby', whereas both of the libavfilter options (yadif and
bwdif) are high quality but very CPU intensive. libavfilter doesn't
offer anything else that I can see that meets our needs - those are
currently the only deinterlacers that can be run at double rate.
I'm thinking of either trying to 'fudge' a medium quality deinterlacer
with what is available in libavfilter and libswscale - or just writing
a new one...
> nVdec with h264 is still giving decoding artefacts with static pictures,
> and with mpeg2video the smaller steps in the editor are often not as
> expected. It looks as if the most recent commits are for vaapi.
I spent some time testing editing, single frame seeks etc yesterday on
various machines. As far as I can tell, with the latest fix to the
VAAPI deinterlacers, the behaviour is consistent across software
decode with any deinterlacer (CPU and GLSL), VDPAU and VAAPI.
NVDEC is the problem. There is a small issue, as you noted, of the
first forward seek not being seen on screen, which should be fixable
and is a result of the changes I made to handle stream changes. The
bigger issue is that when NVDEC deinterlacing is enabled, short
backwards seeks are totally innacurate. This is an issue with how we
handle the increased framerate seen when NVDEC does the deinterlacing.
I'm considering switching to use the cuda-yadif deinterlacing filter -
which will eliminate the framerate issue in most cases and give us
full control of deinterlacing. Currently we can only enable NVDEC when
the decoder is opened.
> On my later i3 box, with an earlier Myth build and an nVidia GT 710
> card (low-end but uses the current driver) I got noticeably better
> playback of HD content using standard decoding. It shows during
> panning. But that was with 4 cores at ~3 GHz. I don't know what the
> SoCs are doing...
A GT 710 may be low end by modern standards - but should be more than
capable of handling not only VDPAU (and NVDEC) decoding, but the GLSL
filters should be fine. No need to use CPU deinterlacing at all - or
even CPU decoding. One of my test setups is a 10 year old GT210 paired
with a 12 year old CPU - no playback issues with any UK broadcast
material.
Regards
Mark
More information about the mythtv-dev
mailing list