[mythtv] Jitter with time stretch

David Engel david at istwok.net
Tue Jul 17 15:39:40 UTC 2018


On Tue, Jul 17, 2018 at 11:05:53AM -0400, Peter Bennett wrote:
> On 07/14/2018 09:52 PM, David Engel wrote:
> > Did you try this with 1080i?  720p worked fine for me.  With 1080i,
> > however, playback doesn't actually speed up for speeds greater than
> > 1x.  It stays at 1x even though MythTV reports it being faster.  It
> > does slow down for speeds less than 1x, though.  Strange.
> I did some timing on NULL audio and I discovered that with 1080i content,
> the method VideoOutputOpenGL::ProcessFrame takes from 13 - 15 milliseconds.
> Also sometimes VideoOutputOpenGL::PrepareFrame takes 5 milliseconds. At 60
fps, one frame needs to be displayed every 16.6 milliseconds, so OpenGL, the
way we are using it, cannot display more than 60 fps of 1920x1080 images. At
> 60fps it is running at maximum speed. Without audio there is nothing that
> will drop frames to force it to catch up. The time stretch works with
> software decoding and NULL audio, presumably because that is rendered at 30
> fps not 60.
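The frame-budget arithmetic in the quoted timings can be sketched as follows (a
minimal illustration using the numbers from the post; the function names here
are made up for the example and are not MythTV APIs):

```python
# Sketch of the frame-timing budget: if each frame costs process_ms (plus an
# occasional prepare_ms) to render, the display rate has a hard ceiling.
def max_achievable_fps(process_ms, prepare_ms=0.0):
    """Upper bound on display rate given per-frame render cost in ms."""
    return 1000.0 / (process_ms + prepare_ms)

def can_stretch(base_fps, stretch, process_ms, prepare_ms=0.0):
    """True if the render path can keep up with base_fps * stretch."""
    return base_fps * stretch <= max_achievable_fps(process_ms, prepare_ms)

# 1080i at 60 fields/s: ProcessFrame alone takes 13-15 ms, a ceiling of
# roughly 67-77 fps; add PrepareFrame's occasional 5 ms and it drops below 60.
print(max_achievable_fps(14.0))          # ~71 fps with ProcessFrame only
print(can_stretch(60, 1.5, 14.0, 5.0))   # False: 90 fps needed, only ~53 possible
print(can_stretch(30, 1.5, 14.0, 5.0))   # True: 30 fps content has headroom for 1.5x
```

This matches the observation that 1080i at 60 fps cannot go faster than 1x,
while 30 fps software-decoded content still has room for time stretch.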

Okay.  That's a very strange coincidence, though.  Surface rendering
should remove some of that overhead.

David
-- 
David Engel
david at istwok.net
