[mythtv] 20180705 Android Observations
david at istwok.net
Sat Jul 7 01:13:17 UTC 2018
On Fri, Jul 06, 2018 at 04:15:29PM -0400, Peter Bennett wrote:
> On 07/06/2018 03:09 PM, David Engel wrote:
> > I always use a news program with scrolling ticker for this testing.
> > Where is the number of buffers defined? I'll try a couple of things
> > (# of buffers and 4k vs. 1080p TV) to see if I can get 1080i to be as
> > smooth as 720p.
> Frame Buffers -
> libmythtv/videoout_opengl.cpp - VideoOutputOpenGL::CreateBuffers
> The normal values - vbuffers.Init(31, true, 1, 12, 4, 2);
> In my patch - vbuffers.Init(4, true, 1, 2, 2, 1);
> For the meaning of the parameters - videobuffers.cpp - VideoBuffers::Init
> see the comments above that method.
Thanks. I will do some testing with various values this weekend.
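For anyone sizing those pools, the memory cost is easy to estimate from standard YUV420 arithmetic (12 bits per pixel). A minimal sketch; the function names are mine for illustration, not MythTV's:

```cpp
#include <cstddef>

// Approximate bytes for one YUV420 (I420/NV12) frame: 12 bits per pixel.
constexpr std::size_t FrameBytesYUV420(std::size_t width, std::size_t height)
{
    return width * height * 3 / 2;
}

// Total footprint of a pool of decode buffers at a given frame size.
constexpr std::size_t PoolBytes(std::size_t numBuffers,
                                std::size_t width, std::size_t height)
{
    return numBuffers * FrameBytesYUV420(width, height);
}

// 31 buffers of 4K video come to roughly 385 MB, while 4 buffers stay
// under 50 MB -- which illustrates why the default count hurts on Android.
```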
> This is a temporary fix. Changing it here may not be the eventual solution
> because this changes it for every place OpenGL is used. Some conditional
> logic may be needed to use different settings for different cases.
> The default value of 31 buffers probably never considered the possibility of
> 4K video, and also was probably set up 10 or 15 years ago for lower powered
> machines than are currently used. The fact that it works reasonably well
> with one eighth of the number of buffers is interesting.
> I suspect that the reason for your jerkiness is that with deinterlacing the
> 1080i video becomes 60 fps. Perhaps some logic somewhere that increases the
> number of buffers depending on the fps may be useful.
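The fps-based idea could be as simple as the heuristic below. This is only a sketch; the thresholds, the one-sixth-of-a-second target, and the function name are my assumptions, not anything in the MythTV tree:

```cpp
#include <algorithm>

// Hypothetical heuristic: give 60 fps material (e.g. deinterlaced 1080i)
// more decode buffers than 24-30 fps content, bounded by a floor and
// ceiling. Targets roughly one-sixth of a second of video.
int BuffersForFrameRate(double fps, int minBuffers = 4, int maxBuffers = 12)
{
    int wanted = static_cast<int>(fps / 6.0 + 0.5);  // round to nearest
    return std::clamp(wanted, minBuffers, maxBuffers);
}
```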
> Also significant - Frontend setup - Video - Playback - General Playback -
> Audio Read Ahead. This controls buffers on the input side and will make a
> difference, especially since changing the number of Frame buffers affects
> the video pipeline. Try setting this higher to see if it helps. I have it
> set at 300 ms on the shield. Increasing this has less memory impact than
> increasing the frame buffers because this is compressed data that is being
Increasing this to 300 helped quite a bit on my 1080p TV, though any
timestretching above 1x made it jerky again. It helped somewhat on my
4K TV, which fluctuated between short periods of smoothness and
jerkiness.
> > You should try more testing. I had several cases where a 1 hour
> > ff/rew passed without issue. The problem is intermittent and
> > sometimes takes multiple attempts before it happens.
> Could you describe more fully what you see. You just said "playback
> timeouts" during FF.
Easy. Playback hangs and is unresponsive for about 40 seconds.
Eventually, the watch recordings screen reappears with a popup stating
that "Video frame buffering failed too many times."
> > > I am wondering what to do about the buffers. There are currently hardcoded
> > > numbers for each video output method (OpenGL, VDPAU, etc.). It is fine on
> > > most linux systems to grab 300MB for buffers, but android tends to kill the
> > > application if it thinks it is using too much memory. I think it will need
> > > some way of dynamically setting the number of buffers based on things like
> > > the amount of available memory, framerate, picture size, operating system.
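A dynamic policy along those lines might budget a fixed fraction of available memory for the pool. This is a sketch under stated assumptions: the 1/8 fraction, the bounds, and the function name are all illustrative, not an existing MythTV API:

```cpp
#include <algorithm>
#include <cstddef>

// Hypothetical sizing: spend at most 1/8 of currently available memory
// on decode buffers, with a floor so playback can still start and a
// ceiling matching the historical default of 31 buffers.
std::size_t BuffersForMemory(std::size_t availableBytes,
                             std::size_t frameBytes,
                             std::size_t minBuffers = 4,
                             std::size_t maxBuffers = 31)
{
    std::size_t budget = availableBytes / 8;   // memory we allow ourselves
    std::size_t fit    = budget / frameBytes;  // frames that fit in budget
    return std::clamp(fit, minBuffers, maxBuffers);
}
```

With 1 GiB free and ~12.4 MB per 4K YUV420 frame, this yields 10 buffers; with 2 GiB free and 1080p frames it hits the 31-buffer ceiling.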
> > Most Linux systems probably have more memory and/or swap space to
> > better deal with low memory issues.
> Swap space - not good. As an experiment I tried playing a 4K video on
> Raspberry Pi. It immediately started swapping, the hard drive was going
> crazy and everything froze. Swapping tends to kill response times.
That wasn't a suggestion to use swap. It was merely an explanation of
why desktop or HTPC Linux can cope better when RAM is tight. In those
cases, almost anything that is not MythTV can be swapped out to make
room for MythTV when it runs. Granted, Android can suspend apps, but I
don't know the details of what it actually does.
> > > Another thing is there seems to be too much copying of frames from one place
> > > in memory to another. For multi-megabyte frames that can take a significant
> > > amount of time on a low end cpu.
> > I expect using Surface for rendering instead of OpenGL will fix that.
> I need to figure out how to do the surface stuff. It sounds like it requires
> creating a new video output method along the lines of VDPAU. (A lot of
My understanding is that VDPAU has custom code to render the OSD. If
that's correct, a Surface video output method should be simpler. I
believe the only new code would be to render just the video surfaces.
The OSD would still be rendered with OpenGL.
> I am hoping the OpenGL can give us a good start. It may be possible to
> reduce the amount of copying of frames that is being done. I will
> investigate that.
OpenGL rendering was definitely the best first step, as it required
little, if any, new rendering code. There's a reason, though, that all
of the other players have Surface rendering as their first and default
choice: the output of the decoder can be used directly, without any
additional copying or conversion.
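To put numbers on why that extra copy hurts: the arithmetic below (function name mine) shows the memory bandwidth consumed by a single redundant frame copy per displayed frame for YUV420 video.

```cpp
#include <cstddef>

// Bytes per second moved by one extra copy per displayed frame of
// YUV420 video. Illustrates why zero-copy Surface output matters
// for 4K playback on a low-end CPU.
constexpr std::size_t CopyBytesPerSecond(std::size_t width,
                                         std::size_t height,
                                         std::size_t fps)
{
    return (width * height * 3 / 2) * fps;
}

// 4K at 60 fps is ~746 MB/s per copy; 1080p at 60 fps is ~187 MB/s.
```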