[mythtv] AVSync2 Refinements

Mark Kendall mark.kendall at gmail.com
Wed Dec 18 08:57:59 UTC 2019

On Tue, 17 Dec 2019 at 01:09, Tim Pletcher <pletchtd at gmail.com> wrote:
> For me, the 0.4 gain with a 0.6 filter coefficient causes jumps in video playback on my low-end Apollo Lake frontends using VAAPI. Without sufficient low-pass filtering, the adjustments respond too strongly to the fairly regular large spikes in the sync measurement when deinterlacing is enabled (https://imgur.com/YpmvIG7).
> Adjustments of more than roughly 5 ms with any regularity do not give smooth video playback, and this is particularly noticeable when watching programs with lots of motion, such as sports. I provided an example plot earlier in this thread with a gain of 0.4 and a filter coefficient of 0.9, and those settings give satisfactory playback on these Apollo Lake machines.

I've pushed an update to use 0.4 and 0.9.

I'm not seeing any issues with regular video playback on Intel
(VAAPI/software), NVIDIA (NVDEC/VDPAU/software) or the Pi4.

There are some issues - and I'll caveat these by saying I'm not sure
whether they result from the recent avsync changes, from the
long-standing avsync2 code, or from other changes I've made recently.

Firstly - playback of audio-only files seems a little broken,
particularly MHEG/DVB radio and seeking. This may be related to a
small fix I added for MHEG-only streams - I need to check.

Secondly - video-only streams sometimes struggle to settle down at the
start of playback, though I'm starting to wonder whether this is
purely a Raspberry Pi issue.

Thirdly - there is some very strange behaviour on OSX under certain
conditions: sometimes the video continually speeds up and slows down.
Piotr reported this the other day and I can reproduce it on my (very)
old MacBook. I only see it when using the built-in display with
certain interlaced videos and double-rate deinterlacing. If I connect
an external display, everything is fine. Given the limitations of the
MacBook, I can't test heavily before I max it out and frames start to
drop naturally.

The oddity with OSX internal displays is that they do not have a fixed
refresh rate. Furthermore, vsync always falls back to a busy wait, as
there is no DRM, so the rendering code will not block in the same way
when displaying a frame. I presume the display will cap any refresh
rate at its overall maximum (presumably 60Hz), but otherwise the code
doesn't appear to wait for OpenGL vsync (yet there is no tearing). I
can't get my Christmas-limited brain power around what might be
happening. I should add that CPU load is low, and OSX has no OpenGL
performance monitors, so I cannot check the GPU.
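To illustrate what I mean by "falls back to busy wait": without DRM the
code has to spin on the clock until the next predicted refresh instead of
blocking in the driver. A rough sketch of that pattern (not the actual
MythTV code - the function name and the 2 ms spin margin are invented):

```cpp
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

// Busy-wait vsync fallback: sleep for most of the interval to keep CPU
// load down, then spin for the last couple of milliseconds for accuracy,
// and advance the predicted refresh time for the next frame.
void WaitForNextRefresh(Clock::time_point& nextRefresh,
                        std::chrono::microseconds refreshInterval)
{
    auto spinStart = nextRefresh - std::chrono::milliseconds(2);
    if (Clock::now() < spinStart)
        std::this_thread::sleep_until(spinStart);
    while (Clock::now() < nextRefresh)
        ; // busy wait until the predicted refresh time
    nextRefresh += refreshInterval;
}
```

Note this never actually synchronises with the display's scanout - it only
paces frames against the clock, which is consistent with not seeing the
render block, and may interact badly with a display whose refresh rate is
not fixed.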

Anyone have any ideas?


More information about the mythtv-dev mailing list