[mythtv-users] Playback On Intel

Stephen Worthington stephen_agent at jsw.gen.nz
Mon May 12 01:53:56 UTC 2014


On Sun, 11 May 2014 16:10:59 +0100, you wrote:

>A while ago, I ditched my ION-based front end and replaced it with an
>Intel i5-4570S with HD4600 graphics in a passively cooled Streacom
>case. It's nice and cool, silent, and I thought the CPU should be
>plenty fast enough to do software decoding and deinterlacing if VAAPI
>doesn't work out.
>
>I'm running MythTV 0.27 on Mythbuntu 14.04, which I believe pretty
>much has the latest 2014Q1 Intel graphics stack.
>
>I'm watching on a brand new Panasonic plasma which is entirely capable
>of displaying a good picture, connected via HDMI.
>
>Unfortunately, I'm having all sorts of issues with playback quality.
>I've tried the High Quality, OpenGL High Quality and VAAPI Normal
>playback profiles, but they all have different problems.
>
>Decoding - Using High Quality and OpenGL High Quality I get problems
>in dark areas of the picture. A pale cloud of decoding artefacts
>slowly builds in shadow areas over a few seconds, then abruptly drops
>back to black. This repeats every 10 seconds or so. This does not
>happen when using VAAPI to decode.
>
>Judder - On SD MPEG-2 broadcasts there is a noticeable judder every
>second or so, particularly on horizontal pans. On HD H.264 broadcasts
>the judder is there, but less noticeable. I am in the UK, so these are
>25fps interlaced. On ripped DVDs (deinterlaced with Handbrake where
>applicable) at 25fps and Blu-rays at 23.976 fps playback is generally
>smooth, but there is an occasional jump. This happens on all playback
>profiles. The refresh rate on the TV appears to be correctly switching
>based on the content.
>
>Deinterlacing - High Quality (Linear Blend) is unwatchable, VAAPI
>Normal (Bob 2x) is almost okay on HD material, but poor on SD. OpenGL
>High Quality (Greedy High Motion 2x) is good.
>
>Consequently, none of the playback profiles are perfect and I am
>starting to hate watching TV.
>
>Some comments and thoughts:
>
>1. The decoding issue seems to me to be a bug in ffmpeg. Does anyone
>have any thoughts on this?
>2. The worst of the judder can *sometimes* be fixed by running
>"DISPLAY=:0.0 xrandr" from an SSH session - ie, with no params -
>during playback. The picture jerks once and then settles to be smooth.
>It is also fixed sometimes by returning to the playback settings menu
>and then starting playback again.
>3. Motion Adaptive deinterlacing was added to the Intel graphics stack
>in 2013Q2 (see https://01.org/linuxgraphics/downloads/2013/2013q2-intel-graphics-stack-release),
>but does not appear as an option in the playback profile settings.
>4. Refresh rate switching doesn't select non-integer rates even though
>they can be selected manually using "xrandr" from the command line.
>
>It seems to me that some of these problems may well be bugs; if anyone
>can suggest what I can do to aid them getting investigated or fixed,
>please let me know.
>
>Is no-one else having this sort of trouble? I'm starting to think that
>I need to return to Nvidia... but VAAPI and the Intel driver have had
>so much work done on them recently that it seems *so* close to being
>usable.
>
>David
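
For reference, the xrandr workarounds in points 2 and 4 above look
something like this (the output name HDMI1 is just an example - run
plain "xrandr" on its own first to see what your connector is
actually called):

  # no arguments: list outputs and the modes/refresh rates they offer
  DISPLAY=:0.0 xrandr

  # manually force a non-integer refresh rate, assuming the TV
  # advertises a matching mode
  DISPLAY=:0.0 xrandr --output HDMI1 --mode 1920x1080 --rate 23.976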

I had various problems with the Intel driver output on my MSI GT70
laptop, which has an i7-3610QM CPU.  With a fast i7 CPU it should not
have any problems at all with pure software decoding, let alone with
hardware acceleration enabled, but it did.  I had to do two things to
get it to work.  The first was to use the drivers from 01.org instead
of the Mythbuntu 12.04 drivers.  That problem is likely to have gone
away with 14.04, but I am pretty sure it still exists with 12.04, and
now 01.org does not support 12.04.  So for anyone else out there
reading this, the way to get good drivers now is to upgrade 12.04
using the LTS hardware enablement stack:

  https://wiki.ubuntu.com/Kernel/LTSEnablementStack
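
That pulls in a backported kernel and X stack.  On 12.04 it is
something like the following for the current (saucy) enablement
stack - the exact package names for each point release are listed on
that wiki page:

  sudo apt-get install --install-recommends linux-generic-lts-saucy \
      xserver-xorg-lts-saucy libgl1-mesa-glx-lts-saucy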

The second thing I needed to do was to tell the decoder to use more
than one CPU (the "Max CPUs" setting in each playback profile).  It
is a long time since I last set up Mythfrontend from scratch, but if
I remember correctly, in all of the profiles (High Quality, VAAPI and
VDPAU anyway), the default setting is only 1 CPU, which does not seem
to be enough.  You need to go into each of the profiles and set it to
at least 2 CPUs, and probably to as many CPUs as you have for best
decoding.
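
To see how many logical CPUs you have available to give it, something
like:

  # number of logical CPUs (cores x hyper-threads) the kernel sees
  nproc

The i5-4570S is a 4-core part without hyper-threading, so there is no
point setting it higher than 4 on that machine.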

