[mythtv] 10 times better H.265/HEVC decoding performance with WinTV compared to MythTV. Why?

Bjoern Voigt bjoernv at arcor.de
Wed Nov 1 22:53:59 UTC 2017


Hello Devs,

as you may know, Germany is moving from the old DVB-T standard to the
new DVB-T2 HD standard for terrestrial TV. I am one of the users who has
followed MythTV development in this area. Big cities like Berlin have
been using the new standard for six months.

Watching unencrypted DVB-T2 HD TV is possible with MythTV 0.28 and 29,
albeit with some bugs of varying importance to users, e.g.:

  * HEVC/H.265 parser missing for recordings
    https://code.mythtv.org/trac/ticket/12993
  * encrypted channels can only be decoded with the official Freenet TV
    USB device (software is available only for Windows and Mac;
    currently no hardware acceleration) and with a per-device Freenet TV
    subscription
  * VDPAU HEVC/H.265 decode support broken if deinterlacing is on
    https://code.mythtv.org/trac/ticket/12992
  * em28xx DVB-T2 USB card can't be unplugged and fails to hibernate/resume
    https://bugzilla.kernel.org/show_bug.cgi?id=194171

But the main issue is that most users do not have machines powerful
enough to decode H.265/HEVC streams. (The users of the official Freenet
USB device especially complain about this.)

That's why I did some research on HEVC decoding performance on my
desktop machine. The results are surprising.

Setup:

  * 4-core Intel Core i5-750 CPU (2.67 GHz)
  * 16 GB RAM
  * openSUSE Tumbleweed x86_64 rolling release distribution
  * Nvidia GeForce GTX 750 (Maxwell architecture, H.264 hardware
    decoder, NO H.265 hardware decoder)
  * USB TV device: Hauppauge WinTV soloHD
  * Drivers: Linux: Nvidia driver 384.90; Windows: Nvidia driver 388.00

I tried different decoding options and watched the CPU load (with the
Windows Task Manager and with "top" on Linux).
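For the Linux numbers I simply watched "top" by eye; a minimal one-shot sampler over /proc/stat (a hypothetical sketch, not the exact method I used) gives comparable whole-system figures:

```shell
#!/bin/sh
# Rough one-second CPU utilisation sample, similar to glancing at "top".
# Reads the aggregate "cpu" line of /proc/stat twice and prints the busy share.
read -r _ u1 n1 s1 i1 _ < /proc/stat
sleep 1
read -r _ u2 n2 s2 i2 _ < /proc/stat
busy=$(( (u2 + n2 + s2) - (u1 + n1 + s1) ))   # user + nice + system delta
total=$(( busy + (i2 - i1) ))                  # busy + idle delta
echo "CPU busy: $(( 100 * busy / total )) %"
```

(For per-process numbers one would filter on the player's PID instead of using the aggregate line.)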

Here are the results (best results on top):

 1. WinTV 8.5, MainConcept HEVC Video Decoder (recommended by
    Hauppauge for DVB-T2 HD): ~ 8.6 % CPU
 2. WinTV 8.5, LAV Filters 0.70 with DXVA2 (native) HEVC hardware
    acceleration: ~ 9.6 % CPU
 3. WinTV 8.5, LAV Filters 0.70 with Nvidia CUVID HEVC hardware
    acceleration: ~ 41.1 % CPU
 4. Kodi frontend with MythTV 29 backend, VDPAU hardware acceleration
    (but fallback to OpenGL): ~ 47 % CPU
 5. MythTV 29, OpenGL frontend: ~ 93 % CPU

If you are interested, I can upload the screenshots and text logs.

My questions:

Why is the HEVC decoding performance of the MythTV frontend so poor
(compared with WinTV, but also compared with Kodi)?

Is there something we can do to increase the MythTV frontend performance?

(I already tried to enable some Nvidia-related hardware acceleration
options in MythTV. Unfortunately, all of the tested options cause
compile problems with the current CUDA release 9.0. I tested
--enable-cuda, --enable-cuvid, --enable-nvenc and --enable-libnpp.
FFmpeg from Git works with CUDA 9.0, but bringing the FFmpeg Git
integration in MythTV 29 up to date is not something I can do in a few
hours.)
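For reference, the flags above were tried roughly like this (a sketch of my invocation, not a recommended build recipe; each of these variants currently fails to compile for me against CUDA 9.0):

```shell
# Hypothetical sketch: MythTV 29 configure with Nvidia acceleration options.
# Each of the --enable-* flags below triggered compile errors with CUDA 9.0.
./configure --enable-cuda --enable-cuvid --enable-nvenc --enable-libnpp
make -j4
```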

Are there any plans to optionally support commercial decoders like the
MainConcept HEVC Video Decoder? Or is it possible to adopt some
patent-free ideas from such decoders? They offer a Linux SDK, but of
course the SDK is not free. See
https://www.mainconcept.com/eu/products/for-developers/video/hevch265.html

Would it be a good idea to upgrade the Nvidia card to a graphics card
with HEVC decoding support (e.g. GT 1030 or GTX 1050)? I mean, is the
VDPAU HEVC support stable?
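Before buying new hardware, one can check which decoder profiles the driver actually exposes; assuming the vdpauinfo tool is installed, something like:

```shell
# List the HEVC decoder profiles exposed by the VDPAU driver
# (requires the vdpauinfo utility and a running X session).
vdpauinfo | grep -i hevc
```

On a card without an H.265 decoder (like my GTX 750) this should print nothing.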

Greetings,
Björn

